SharePoint – including SharePoint Server and SharePoint Online as a part of Office 365 – is the most widely deployed content services platform. In fact, 85% of the Fortune 500 uses Office 365. That scale creates enormous opportunities to learn from a thriving community of people at commercial and community events.
When looking at a clock, it is easy to become mesmerized by the gears turning. When marveling at the precision and beauty of the meshing of gears, it can be easy to overlook the box that the gears are in. Yes, the gears drive the hands of a clock, but they can’t do it without the structure provided by their case. The learning manager provides the structure for a learning delivery team. When things are running smoothly, no one really notices the value he or she brings.
What Is a Learning Manager?
One part firefighter and one part strategist, the learning manager steps in when needed and works to see that stepping in isn’t needed. The role of learning managers is first and foremost to keep the learning creation and delivery engine running. Once the engine is running, they can focus on optimization, including ensuring that the engine will continue to run by offering development opportunities for their team and implementing technologies that will make the process easier.
The course has been created and delivered. Now is the time to assess what did and didn’t work in the process through the lens of learner outcomes, which are the ultimate measure of success or failure. How does the monitoring specialist report on what happened?
What Is a Monitoring Specialist?
We live in a world of analytics, where we try to capture data from users and draw meaningful results out of the seemingly random noise in the signals that we have. The monitoring specialist sits at the center of the data analysis tools and log files, providing meaningful information to the stakeholders, who look to see what worked – and what didn’t work – in the training.
What Is Expected of the Monitoring Specialist?
Monitoring specialists are, first and foremost, data people. They look at the reports coming out of the system and convert them into insights about instructors, courses and learners. While they may be called upon to help develop a new survey instrument or connect a survey tool to the learning management platform, their core competency is extracting meaning from the data.
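The core task described above – turning raw report rows into per-course and per-instructor insight – can be sketched in a few lines. The survey fields, names and 1-to-5 scale below are hypothetical illustrations, not the output of any specific learning management platform:

```python
from statistics import mean

# Hypothetical post-course survey export: (course, instructor, score 1-5).
# The courses, instructors and scale are made up for illustration.
responses = [
    ("Excel Basics", "Kim", 4), ("Excel Basics", "Kim", 5),
    ("Excel Basics", "Lee", 3), ("Intro to SQL", "Lee", 2),
    ("Intro to SQL", "Lee", 3), ("Intro to SQL", "Kim", 5),
]

def average_by(key_index, rows):
    """Group survey rows by one field (course or instructor) and average the scores."""
    groups = {}
    for row in rows:
        groups.setdefault(row[key_index], []).append(row[2])
    return {key: round(mean(scores), 2) for key, scores in groups.items()}

by_course = average_by(0, responses)      # insight about courses
by_instructor = average_by(1, responses)  # insight about instructors
```

The same grouping-and-summarizing pattern scales up to whatever reporting tool the organization uses; the point is that the monitoring specialist's value lies in choosing which slices of the data answer the stakeholders' questions.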
Throughout the training development process, the goal is the development and implementation of a course that facilitates learning. Since that process is built on humans, we know that it can never be perfect. It’s the role of the quality control coordinator to shepherd the course through the process and ensure the highest practical levels of quality.
What Is a Quality Control Coordinator?
During the development of the course materials, the quality control coordinator ensures that proper reviews are happening to maintain the accuracy of the materials, and they work with the authors to ensure that their message is as easy as possible for students to learn.
This role is challenging because, unlike manufacturing, where there’s a predictable failure model, authors can be strong in one area of the content and weak in another. That means quality control coordinators must be vigilant for weaknesses anywhere in the course.
The dull murmur of instructors and students casually chatting before a class begins has been replaced by the hum of server fans and air conditioning in computer rooms. The instructor standing in front of a class has been replaced by the flow of packets from faraway servers to the student’s computer. It’s the distribution specialists who keep these connections flowing and the servers humming along.
What Is a Distribution Specialist?
Distribution specialists are the professionals who keep the learning platform running so that students can access the materials. While this role is dramatically different from a frontline, in-the-trenches role, it serves the same important goal: getting content to students. Distribution specialists have a radically different skill set than instructors. Where instructors are skilled in instruction and facilitation, distribution specialists may not be comfortable when placed in front of a class.
If a tree falls in the woods and no one hears it, did it really make a sound? This question is at the heart of the need for people who help training reach students. A course has impact and value only when students make it through; there’s no good in a course that sits on the shelf, never to be used. Distribution staff, of which instructors are a part, are the bridge from completed training to impactful implementation.
The instructor is probably the most recognizable part of an instructor-led training process; it’s in the name. The instructor is the powerful person who takes the development work and helps it reach the students.
What Is an Instructor?
For instructor-led training, the instructors are the front-line workers who are in the trenches every day helping students learn. Even in computer-based training where live assistance is needed, they’re supporting actors who may not hold the lead role but are nonetheless essential to the delivery of the content.
The phrase most likely to describe the author in the training and development process is “and then the magic happens.” The author is at the core of the content development process. He or she takes the input from the SMEs and the coaching from the learning designer and makes it happen.
What Is an Author?
The author of a course creates the bulk of the content and works with SMEs and learning designers to develop the most effective ways to teach it. He or she may be adept at creating instructor-led materials, computer-based training, productivity aids or supporting materials. The author will have his or her fingers on the keys, pounding out the prose that students will absorb as learning.
What Is Expected of the Author?
Authors are expected to have a basic command of their chosen tools. Certainly, a word processor and a presentation program top the list of tools in which they’ll need to be proficient. They may also be skilled in one or more of the content authoring tools and design programs necessary for creating visuals or productivity aids to appropriately communicate the material.
When SharePoint first came out in 2001, development for the platform wasn’t easy. ASP, not ASP.NET, was the first development approach for SharePoint. In 2003, the platform was migrated to .NET, but it wasn’t until 2007 that it had a proper customization strategy in the form of features and solutions. The world has changed since then: several development models have come to SharePoint, and one has both come and gone. In this article, we’ll look at the development models available for SharePoint and Office 365 development and explain why you would choose one model over another.
Introducing the Development Options
The last four years in SharePoint have been tumultuous, to say the least. Of the five models available, three were introduced in the last four years. These are the five models:
- Server-Side Object Model (a.k.a. Server Solutions): Introduced in 2007 and available today for on-premises deployment, this model has the richest support and the greatest longevity, but puts a great deal of onus on the developer to write good code, because the code runs directly inside the SharePoint processes.
- Sandboxed Solutions (a.k.a. User Code Host, Partially Trusted Callers): Introduced in 2010, these solutions allowed end users to write code against a subset of the SharePoint API. They had severe limitations, because they were designed to protect the platform from instability caused by poorly behaved developer code. The model is no longer available on Office 365/SharePoint Online, and no further investments in it are planned.
I’ve been developing software for more than 25 years now, and I’ve learned dozens of platforms and frameworks. I keep expecting that, at some point, the process of learning a new platform will get easier. Each time, it does to some degree, but it’s never enough.
In this article, I’m going to walk you through my first successful Unity spike. The goal is threefold: to demonstrate how to do something useful in Unity; to provide a framework that you can use to learn Unity, or any other environment; and to develop a small component that you might be able to leverage in something you’re building.
The tangible deliverable for this project is a working 3D gauge. It has 18 segments and scales from 0 to 100%. I decided on this as my first project because it let me familiarize myself with the platform through a challenge that was somewhere between “Hello World” and SkyNet. Ultimately, this 3D gauge created the need to learn several fundamentals.
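Unity scripts themselves are written in C#, but the central arithmetic of the gauge – mapping a 0-to-100% value onto 18 discrete segments – can be sketched language-agnostically. The function name and the clamping behavior below are my assumptions for illustration, not code from the actual spike:

```python
SEGMENT_COUNT = 18  # the gauge described above has 18 segments

def lit_segments(percent: float) -> int:
    """Map a 0-100% value to the number of gauge segments to light.

    Out-of-range input is clamped, so the gauge never under- or overflows.
    """
    clamped = max(0.0, min(100.0, percent))
    return round(clamped / 100.0 * SEGMENT_COUNT)

# In a Unity MonoBehaviour, the C# equivalent would run as the value changes,
# enabling the renderers of the first lit_segments(value) segment meshes.
```

The interesting design question this surfaces is rounding: whether 2.7% lights the first segment or leaves the gauge dark is a presentation decision, not a math one.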
Twelve years ago, when I wrote the first articles for “Cracking the Code: Breaking Down the Software Development Roles,” I made a conscious and perhaps controversial decision not to include the database administrator or database architect as part of the roles. I made that decision because few organizations dealt with data at a scale that required a dedicated role in the software development process; the solution architect could handle designing the data structures as part of his or her overall role. However, the world of data has gotten bigger since then.
Today, we’re facing more volume, greater velocity and a more dynamic variety in the data sources we’re processing. We’re no longer talking about the typical relational databases that have been popular for decades. The expansion of data requires a set of techniques and skills unlike the historical approaches to data we have been using.
Multithreading our data processing is an improvement over the single-threaded approaches that popularized data processing in the 1980s. However, even multithreaded approaches, which rely on a single computer with multiple threads of execution, break down when the amount of processing necessary to extract meaning exceeds the capacity of a single machine.
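The limit described above can be made concrete with a toy job. This is a minimal Python sketch of my own (the article prescribes no language): work is split into chunks and handed to a thread pool, but every chunk still runs on this one computer, which is exactly the ceiling the paragraph describes. (In CPython, threads mainly help I/O-bound work; CPU-bound jobs would reach for processes, or past one machine, a cluster.)

```python
from concurrent.futures import ThreadPoolExecutor

def chunk_total(chunk):
    """Stand-in for real per-record processing; here, just a sum."""
    return sum(chunk)

def parallel_sum(data, workers=4):
    """Split data into one chunk per worker and process the chunks in a thread pool."""
    size = len(data) // workers + 1
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_total, chunks))

records = list(range(1_000_000))
total = parallel_sum(records)  # same answer as sum(records), just spread across threads
```

No matter how many workers we add, the pool shares one machine's CPU and memory; when the data outgrows that, the distributed techniques this series turns to become necessary.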