Rustici Software's
SCORM Blog

Acronym Alert! There will be many in this post; see the glossary below for easy reference.

Two years ago, we answered a BAA from ADL to conduct research for updating DoDI 1322.26, “Development, Management, and Delivery of Distributed Learning.”

Spoiler Alert: ADL accepted our proposal, and we were awarded a contract to dig further into what an updated Instruction might look like and how it might be implemented.

Double Spoiler Alert: The updated Instruction, DoDI 1322.26, “Distributed Learning (DL),” just received final approval from the Office of the Under Secretary of Defense for Personnel and Readiness.

A little more context

Given that the original Instruction was 10 years old and the world of DL had changed dramatically in that time, ADL recognized the need to both modernize the DoDI and understand the impact an updated DoDI would have across both the DoD and the eLearning industry as a whole. The objective of this project was to provide recommendations and edits to the draft, assess implications of the new Instruction and advise on implementation guidelines to help support the rollout of the new Instruction.

Rustici Software came into this from the unique perspective of understanding how our various government and military customers utilize SCORM today, and the considerations and impact that adding xAPI to the Instruction would have on these organizations going forward.

We spent a good portion of 2016 working through the updated Instruction, interviewing key stakeholders within the DADLAC, considering policy implications with PIPS and providing guidance on how to manage xAPI conformance testing. The end result was an updated draft of the new DoDI 1322.26, which allows for procurement of various tools and technologies, including xAPI, and provides additional framework around the xAPI specification to ensure conformance and consistency across implementations of the spec.

TL;DR

The old DoDI was specific in directing entities to procure SCORM-based eLearning solutions.

The updated DoDI continues to allow for SCORM and encourages “the (implementation of) the Experience Application Programming Interface (xAPI) and associated Learning Record Store capabilities, as practical, to enhance learning data security and interoperability.”

The wrap up

Many folks at Rustici across multiple teams were involved in this BAA, from our lead developers and SCORM technical experts to project managers and account managers.

So, when we heard the great news that the new Instruction had received final approval, we were thrilled. It means a lot to us that the work we did with ADL, PIPS, the DADLAC and others within the community has come to fruition. Even more exciting is that this new Instruction empowers those within the DoD to source the eLearning technologies that meet their needs.

Glossary

ADL: Advanced Distributed Learning
BAA: Broad Agency Announcement
DADLAC: Defense Advanced Distributed Learning Advisory Committee
DL: Distributed Learning
DoD: Department of Defense
DoDI: Department of Defense Instruction
PIPS: Potomac Institute for Policy Studies
SCORM: Sharable Content Object Reference Model
TL;DR: Too long; didn’t read
xAPI: Experience Application Programming Interface




We’ve launched a new services group.


Some Background

For years, we have relied on our products to be the solution to a number of complex problems facing companies that use elearning standards. If you’re building an LMS or authoring tool and you need AICC or SCORM or, more recently, xAPI, we have products that can do the heavy lifting. That’s been our bread and butter.

But we also have insights from years of thinking about experiential data and hearing how customers report on it. And we know that the problem isn’t always solved at the immediate boundary of our products.

It’s those considerations that brought our services group to life.

What We Do

We help vendors and organizations consider how to use elearning standards to accomplish their goals. These goals include delivering learning materials to their people and selling their products to discerning buyers.

We work on problems related to the elearning standards, namely, AICC, SCORM, and xAPI.

In the case of SCORM and AICC, we can help you think through how both historical and newly captured data could be expressed as xAPI; we can help rethink complex learning systems; and we can take on sophisticated custom elearning development.
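To make that concrete, here is a minimal sketch of the kind of mapping involved, assuming a SCORM 1.2-style result. The function name, email and activity IRI are our own inventions for illustration, not a Rustici product API; the verb IRIs follow ADL's public verb vocabulary.

```python
import json

# Hypothetical sketch: map a SCORM 1.2-style result to an xAPI statement.
def scorm_to_xapi(learner_email, course_iri, lesson_status, score_raw, score_max):
    verbs = {
        "passed": "http://adlnet.gov/expapi/verbs/passed",
        "failed": "http://adlnet.gov/expapi/verbs/failed",
        "completed": "http://adlnet.gov/expapi/verbs/completed",
    }
    return {
        # cmi.core.student_id / email becomes the actor
        "actor": {"objectType": "Agent", "mbox": "mailto:" + learner_email},
        # cmi.core.lesson_status becomes the verb
        "verb": {"id": verbs.get(lesson_status, verbs["completed"]),
                 "display": {"en-US": lesson_status}},
        # the course (or SCO) becomes the object
        "object": {"objectType": "Activity", "id": course_iri},
        # cmi.core.score maps onto the xAPI result
        "result": {"score": {"raw": score_raw, "max": score_max,
                             "scaled": score_raw / score_max},
                   "success": lesson_status == "passed"},
    }

stmt = scorm_to_xapi("learner@example.com",
                     "http://example.com/courses/safety-101",
                     "passed", 85, 100)
print(json.dumps(stmt, indent=2))
```

The interesting design work is rarely in code like this; it is in deciding which SCORM data is worth carrying forward and which activity IRIs will stay stable over time.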

Of course we also think hard about xAPI, the newest in the family of learning standards we closely follow.

Where You Come In

We want you to ask us a question. You can learn more about how we’re responding to the questions we’ve already heard; these are the things we anticipate. Maybe something on that list prompts a question you were getting ready to ask. So ask away; we’re listening and ready to help.

 




 

On August 13th, 2015, we launched a heavily revised version of tincanapi.com. Andrew Downes has been working away, as he does, creating new content. Rather than direct it all at the blog, though, he’s been rethinking and restructuring the core site and sharing his insights for first-timers, learning designers, learning product vendors, and organizations. There are countless other updates laid out below. Please spend some time with them.

Many readers of the site, though, will likely notice a significant change to our handling of the name… tincanapi.com. Years ago, Mike shared our perspective on the name: that we were going to call it Tin Can API. For some, this has been a contentious issue. With the new site, we’ve made the site behave the way we personally have for a long time: we call it whatever you call it.

On the site, you’ll notice a toggle in the upper left. If you prefer to call it Tin Can, do so. If you prefer xAPI, that’s great too. Whether you visit tincanapi.com or experienceapi.com, the site will present everything to you using your preferred name.

It comes down to this: arguing about an API’s name simply isn’t productive. We have far more important things to accomplish together.

So please, enjoy the new content. Go build a brilliant activity provider. Make some statements. Or ask us for help if you need it.
 


 

Here are the new sections of the site:

Understand

 
The existing Tin Can Explained page gives a really helpful introduction to Tin Can if you’ve never heard of it. We’ve brought this section a little more up to date and added some pages around the different components of the new enterprise learning ecosystem that Tin Can enables. We’ve also added pages targeted specifically at organizations, learning product vendors and vendors of products outside L&D.

Get Started

 
By now, if you haven’t heard of Tin Can and gotten a basic understanding of it, you’ve probably been living on Mars. These days, the question we get asked most isn’t “what’s Tin Can?” but “how do I get started?” If that’s your question, then good news: we’ve created a new section just for you!

The Get Started section includes pages targeted at product vendors, content authors and organizations. It includes guides to help you see Tin Can in action, get a Learning Record Store (LRS) and run a pilot project in your organization. There’s a collection of pages to help you think about moving on from SCORM, too.

Design

 
We already had a bunch of resources for developers, but not much really aimed at learning designers. We’ve added a page outlining the impact of Tin Can on learning design, including reflections on a handful of learning models and theories in the light of Tin Can. If you’re thinking more at the strategy level, we’ve got a page on incorporating Tin Can into your learning strategy, too.

At a practical level, there’s a guide on statement design, an introduction to recipes for learning designers, and an assignment for you to try out what you learn from the new pages we’ve written.

Developers

 
The developers section was already crammed full of resources. We’ve tidied these up to make them easier to find and created an interactive statement explorer page to help you understand the structure of the statement.
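If you don’t have the statement explorer in front of you, the core structure it walks through is the actor/verb/object triple. Here is a minimal sketch; every identifier below is invented for illustration, though the verb IRI follows ADL's public vocabulary.

```python
import json

# Illustrative anatomy of an xAPI statement: who did what to what.
statement = {
    "actor": {                        # who: an agent, identified here by email
        "objectType": "Agent",
        "mbox": "mailto:learner@example.com",
        "name": "Example Learner",
    },
    "verb": {                         # did what: an IRI plus a display name
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {                       # to what: most often an activity
        "objectType": "Activity",
        "id": "http://example.com/activities/intro-module",
        "definition": {"name": {"en-US": "Intro Module"}},
    },
}
print(json.dumps(statement, indent=2))
```

Real statements can also carry `result`, `context` and `timestamp` properties, but the triple above is the skeleton everything else hangs on.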

The statement generator we created a few years ago was due for an update, and ADL recently published a new, more comprehensive statement generator. We don’t believe in reinventing the wheel, so we’ve taken the ADL tool, made it orange and included it on the site.

To help you put all these resources into practice, we’ve created a series of challenges for developers to try out writing code for Tin Can.

Webinars

 
The previous webinar list contained embedded YouTube videos for all our webinars. We’ve got so many webinar recordings now that it was getting hard to find webinars on specific topics, so we’ve created a new categorized webinar list. Each of the webinars is now on its own page, making it easier to share the recording with other people.




Project Tin Can

Over a year ago, we started working with ADL to figure out where SCORM should go next. There were many roads that ADL could have gone down, and they’ve chosen ours: Project Tin Can.

We’ve been building and refining the Tin Can spec and our prototypes for a while now, and it’s time for you to see what we’ve been doing. It’s also time for you to share your thoughts on Project Tin Can with ADL and find out how you can contribute.
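At its heart, Tin Can is about sending statements to a Learning Record Store over HTTP. As a rough sketch of what that looks like, using only Python's standard library; the endpoint, credentials and IRIs below are placeholders, not ADL's or our actual prototypes:

```python
import base64
import json
from urllib import request

# Placeholder LRS endpoint and credentials, for illustration only.
LRS_ENDPOINT = "https://lrs.example.com/xapi/statements"
BASIC_AUTH = base64.b64encode(b"key:secret").decode("ascii")

statement = {
    "actor": {"mbox": "mailto:learner@example.com"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/experienced",
             "display": {"en-US": "experienced"}},
    "object": {"id": "http://example.com/activities/tin-can-intro"},
}

# Statements are POSTed as JSON with basic auth and a version header.
req = request.Request(
    LRS_ENDPOINT,
    data=json.dumps(statement).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Basic " + BASIC_AUTH,
        "X-Experience-API-Version": "1.0.3",  # required by released xAPI versions
    },
    method="POST",
)
# request.urlopen(req) would actually send it; we stop short here so the
# sketch runs without a live LRS.
print(req.get_method(), req.full_url)
```

That simplicity, plain HTTP and JSON rather than a JavaScript API inside a browser frame, is a big part of what makes the Tin Can approach different from SCORM.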





Back in 2007, I got curious about SCORM 2004 adoption and pulled some metrics about how people were using SCORM. Well, I got curious again, but this time I took it to the next level. We’ve just published a feed of SCORM Stats that will be updated nightly. For SCORM geeks like us, these stats present a useful snapshot of how the real world is using SCORM. Go ahead and bookmark it and come back every now and then to see how things evolve.

Let’s take a look at SCORM then and now.

SCORM Versions


SCORM Versions Then
SCORM Versions Now

Then: SCORM 2004 made up about 50% of the content that was being uploaded into Test Track.

Now: SCORM 2004 makes up about 30-35% of the content uploaded into SCORM Cloud.

Conclusion: SCORM 2004 remains relevant for a significant population, but its adoption and usage have not increased over the years; adoption appears to be flat. The decrease since 2007 is probably related to the more mainstream adoption of SCORM Cloud versus the early adopters using SCORM Test Track in 2007.

SCORM Versions By User


SCORM Versions By User Then
SCORM Versions By User Now

Then: About 40% of users were uploading SCORM 2004 content.

Now: About 40% of users are uploading SCORM 2004 content.

Conclusion: SCORM 2004 adoption remains flat.

Users


SCORM Test Track Users Then
SCORM Cloud Users Now

Then: About 3,000 people cared enough about SCORM to try out our little application.

Now: 21,000 people have given SCORM Cloud a whirl.

Conclusion: Our little SCORM Test Track experiment was a hit. That’s nice for us, but for the broader SCORM community it shows just how widespread SCORM’s adoption is. Twenty-one THOUSAND people are deep enough into SCORM to use an application like SCORM Cloud, with 500 more signing up every month. SCORM’s adoption is broader than I think anybody realizes. It is the industry workhorse.

Some other stats in that vein:

About 20,000 unique visitors visit scorm.com every month… that’s 20,000 more people every month who are interested enough in SCORM to go read about it.

About 12,000 courses are imported into SCORM Cloud every month. Twelve thousand courses; that is a lot of SCORM content being tested!

Realizing the -ilities (multiple SCOs)?

 

Then: Use of Multi-SCO content

 

Now: The use of multi-SCO content
Now: Number of SCOs in Courses

Then: About 35% of SCORM 2004 content took advantage of multiple-SCO functionality.

Now: The percentage of content using more than one SCO has increased dramatically with each new edition of SCORM 2004.

Conclusion: The improvements in each SCORM 2004 Edition have been useful in making sequencing easier to use and more effective. Or, conversely, the people who use sequencing most heavily tend to gravitate to the latest edition with the most robust functionality.

 

Realizing the -ilities (use of sequencing)?

Then: Use of Sequencing
Now: Use of Sequencing

Conclusion: The use of sequencing remains similar, but it increases with the later SCORM 2004 Editions, consistent with the conclusions above.



