Forum Discussion
Storyline and Workday Learning issue with reporting correct score
We have a very frustrating scenario with Storyline and Workday, so I'm hoping someone may have experienced the same issue and can help! We have created a 60-question quiz in Storyline with a pass rate of 60% and a 30-minute timer. The course is published as SCORM 2004 4th Edition.
Workday records the score correctly when users complete the quiz, but when we check the next day, the scores in Workday differ from those in the SCORM file. For instance, an employee scores 75%, which Workday picks up, but a few hours later it shows a different score, such as 68%. Workday is adamant it's the SCORM file, but when we reached out to SCORM Cloud for advice, they said that for a course to change its status in an LMS, the course would need to be actively in session. I have uploaded the courses to SCORM Cloud and so far the scores haven't changed there, which makes me think it's the LMS.
However, when we use Adobe Captivate for a similar quiz, we don't get this issue, so I can see why Workday is saying it's the SCORM file, but I don't know what we need to change to make it work. The completion settings are exactly the same as in the Storyline file; the only difference is that Storyline gives us a "pass/fail", "complete/incomplete", etc. option.
Has anyone had anything similar, and if so, what was the solution?
- SamHill (Super Hero)
SCORM Cloud's advice is 100% correct: "in order for the course to change status in an LMS, the course would need to be actively in session". It was the first thing I thought of when reading your issue. There is no way the course can affect the score if it has not been launched again, as it needs to be actively running a session in the browser.
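To make this concrete, here is a minimal sketch of how a SCORM 2004 course writes its score at runtime. The data-model element names (`cmi.score.raw`, `cmi.score.max`, `cmi.score.scaled`) and the API calls (`SetValue`, `Commit`) are from the SCORM 2004 Run-Time Environment spec; the `reportScore` helper itself is a hypothetical illustration, not Storyline's actual code. The key point: the LMS hands the course an API object only while the course is launched, so with no live session there is simply no channel through which the content can change a score.

```javascript
// Hypothetical helper showing how SCORM 2004 content reports a score.
// "api" is the runtime API object (named API_1484_11 in the spec) that
// the LMS exposes only while a session is active in the browser.
function reportScore(api, rawScore, maxScore) {
  if (!api) return false; // no live session: the course cannot write anything
  api.SetValue("cmi.score.raw", String(rawScore));
  api.SetValue("cmi.score.max", String(maxScore));
  api.SetValue("cmi.score.scaled", String(rawScore / maxScore));
  api.Commit(""); // ask the LMS to persist the values set above
  return true;
}
```

So any score change that happens hours after the learner closed the course cannot have come from these calls; something on the LMS side must be rewriting the stored value.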
I'm wondering if the LMS is running some kind of overnight job that is affecting the score.
The only difference I can think of between a course that doesn't have the problem and one that does would be a mastery_score being set on one course and not the other. I'm really reaching here, as it's almost certainly an LMS issue. The only reason I mention mastery_score is that it can affect whether an LMS takes control of completion, but it should definitely not take control of the score.
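For anyone unfamiliar with mastery_score: it lives in the package's manifest, not in the course slides. In SCORM 1.2 it is the `adlcp:masteryscore` element; in SCORM 2004 the equivalent idea is a sequencing objective with a minimum normalized measure. The fragment below is an illustrative sketch only (identifiers and titles are made up, and a Storyline-published manifest may be structured differently), showing where such a threshold would appear in a SCORM 2004 `imsmanifest.xml`:

```xml
<!-- Illustrative fragment of a SCORM 2004 imsmanifest.xml (identifiers
     are hypothetical). satisfiedByMeasure="true" tells the LMS to derive
     pass/fail from the reported score against the threshold below. -->
<organization identifier="course_org">
  <item identifier="quiz_item" identifierref="quiz_resource">
    <title>Quiz</title>
    <imsss:sequencing>
      <imsss:objectives>
        <imsss:primaryObjective objectiveID="quiz_objective"
                                satisfiedByMeasure="true">
          <!-- 0.6 corresponds to a 60% pass mark -->
          <imsss:minNormalizedMeasure>0.6</imsss:minNormalizedMeasure>
        </imsss:primaryObjective>
      </imsss:objectives>
    </imsss:sequencing>
  </item>
</organization>
```

If one package carries a threshold like this and another doesn't, an LMS that honors it may treat the two courses' pass/fail status differently, which is why it's worth comparing the Storyline and Captivate manifests.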
I think the onus should be back on the LMS provider here to track an instance of this occurring. Do they have event logs that show a score changing from X to Y? What time did it change, and what made the change?
I always recommend SCORM Cloud as the benchmark for testing content, and am always confident when using it to determine if an issue exists on the content or LMS side.
One last tip: I would always take a collaborative approach with LMS admins. You will get to a solution quicker if you work together.
- NatalieSlack (Community Member)
Hi Sam,
Thank you very much for this information and advice! We have been collaborating with the LMS admin and Workday, but nothing seems to move forward beyond "it's the SCORM file". It feels like we have tested the file in every scenario possible!
I have checked SCORM Cloud and the score hasn't changed there, so I would also say it could be the Workday LMS not communicating with the Articulate course correctly.
However, it does seem strange that when a course is published from Adobe Captivate we don't have this problem. The mastery_score is not something I'm familiar with; is it something we can change in the SCORM settings? Although, like you said, I'm not sure why the LMS would read a different score from what the users actually achieved.
I will go back to workday with this information and see what they respond with!
- SamHill (Super Hero)
Good luck, Natalie. I'll keep an eye out for an update here. I think the key thing to keep pushing with Workday is that the score is changing after the course has been closed and before it has been launched again. During that window, the course cannot affect any data on the LMS.
- ClaudeFillion-e (Community Member)
Hello,
We also encountered this problem with a Storyline course published in SCORM 2004 4th Edition.
The course results changed the next day.
For the moment, our workaround is to publish with Rise, since the problem only occurs with Storyline.
We'll be working with Workday to find the cause, but if anyone can help, we'd appreciate it.
Thank you!
- NatalieSlack (Community Member)
Hi Claude!
Here is some advice we got from SCORM Cloud:
"Your course does a "running average" during the assessment...for example, if you get the first one right, the course sets 100. If you miss the 2nd, the course sets 50. When I get the 3rd one right, I am at 2/3, the course sets 66. There is nothing wrong with what your course is doing. It is just calculating the score in a way it wants to. On the other hand, there is not technically anything wrong with what Workday is doing. They are modifying the reports in some manner post runtime with their ETC functionality, but they are just looking to make sure they get the right score which they believe is the highest score."
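The running-average behavior SCORM Cloud describes can be sketched in a few lines. This is an illustration of the scoring pattern they explain, not Storyline's actual code: the course recomputes and reports the percentage after every answered question, so an LMS that keeps, say, the highest value it has ever seen can end up storing an intermediate score instead of the final one.

```javascript
// Sketch of "running average" scoring: after each answer the course
// reports (correct so far) / (questions answered so far) as a percentage.
function runningScores(answers) {  // answers: array of booleans
  const scores = [];
  let correct = 0;
  answers.forEach((isCorrect, i) => {
    if (isCorrect) correct++;
    scores.push(Math.floor((correct / (i + 1)) * 100));
  });
  return scores;
}

// SCORM Cloud's example: right, wrong, right.
// The course reports 100, then 50, then 66 along the way,
// even though only the final 66 reflects the real result.
runningScores([true, false, true]); // [100, 50, 66]
```

If the LMS later reconciles its reports by taking the highest score it saw (100 in this example), the stored score will drift away from the learner's actual result, which matches the behavior described in this thread.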
Since the SCORM file seems to send a running-average score with each interaction submitted in Workday, we set the course to submit all of the interactions at once at the end of the quiz, and this seems to work!
When using navigation buttons instead of a Submit button on your quiz, learners may skip questions and submit the quiz without realizing they missed some, so you may want to warn them before they submit. Technically, Workday and Articulate are each working as they should, but this is the workaround to make them work with each other.
A bit frustrating, as it doesn't seem to happen in other LMSs. Hope this helps!
- VoulaBalkos (Community Member)
I am a bit frustrated myself, as I have run many tests on Workday to fix the same issue. This is now my third project where we have run into this issue, using both standalone questions and question banks. So far the only successful project is the one not using any feedback slides.
I've been following learners' scores for this reason. Some scores hold for a couple of days and then reset to 100%, while others change to 100% within a couple of hours of completion. I know the learners are not going in again to retake the lessons, as I would see a double enrollment.
I would like to try what is suggested above, submitting the interactions all at the end, but I don't understand how to: I designed my quiz using freeform questions with feedback, and I need a Submit interaction trigger on each slide in order for the Review button on the result slide to identify the correct and incorrect responses.
Any suggestions would be helpful. We also publish using SCORM 2004 4th Edition with pass/fail.