The 2017 reporting season had its share of dramas but generally went very smoothly. The main data quality issues have not changed in the last few years, so providers who may have had systematic collection problems in past years were well on top of the process this year. Continuous improvement was evident!
Many issues can be corrected at the end of the year, such as competency, qualification and location details, but missing and invalid student data are problematic to fix in retrospect. This year the AVS system tightened a lot of validation rules, which meant that what was good enough for 2016 data was not good enough for 2017 data.
We will soon be offering a Data Quality Monitoring service which will periodically check for the most common errors (and some of the less common ones) and report in summary and detail to a nominated email address.
The top 5 most common data collection problems were:
Wrong suburb / postcode
Suburbs are now checked against a recent Australia Post list. We have subscribed to this list (it’s not free any more) to keep our validation routines up to date. One quirky issue that arose this year was that many people are now using Google Maps to verify addresses. While this is theoretically more accurate than the Australia Post list because it is aware of street names and maximum street numbers, we did find a notable discrepancy: Google has “ST. IVES” but Australia Post has “ST IVES” (no period). Then there was the usual range of misspellings and simply picking the wrong postcode for the suburb – often because the student provided the wrong data. We would encourage RTOs to make their data entry operators aware of these issues and to check the data provided, as students sometimes really “don’t know their own address”.
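A simple version of this kind of cross-check can be sketched as below. The locality set, field names and normalisation rule (dropping periods so “ST. IVES” matches “ST IVES”) are illustrative assumptions, not the actual AVS validation logic:

```python
# Sketch of a suburb/postcode cross-check against an Australia Post-style
# locality list. The data and field names here are illustrative only.

def normalise_suburb(name: str) -> str:
    """Uppercase, drop periods and collapse spaces so 'St. Ives' matches 'ST IVES'."""
    return " ".join(name.upper().replace(".", "").split())

def check_address(suburb: str, postcode: str, localities: set) -> list:
    """Return a list of problems found for one student address."""
    problems = []
    key = (normalise_suburb(suburb), postcode)
    if key not in localities:
        problems.append(f"Suburb/postcode mismatch: {suburb!r} / {postcode}")
    return problems

# (suburb, postcode) pairs as they might appear in a subscribed list
localities = {("ST IVES", "2075"), ("NEWTOWN", "2042")}

print(check_address("St. Ives", "2075", localities))  # [] - normalised match
print(check_address("St Ives", "2042", localities))   # mismatch flagged
```

Normalising both sides before comparing is what resolves the “ST. IVES” versus “ST IVES” discrepancy without hand-maintained exception lists.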
Missing residential address
It is a requirement under the Data Provisioning Act that RTOs collect a residential address for all students residing onshore during training (including international students studying in Australia). Again, it is worth putting procedures in place to ensure a residential address is collected during enrolment, as we found numerous examples of “PO Box 123” entered in the building or street name field. There is a separate “PO Box” field for when a postal address is allowed; other fields should not be used for this, as it will be caught at the end of the year and is very difficult to correct at that time.
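One way to catch this during the year rather than at reporting time is a pattern check over the street-address fields. This is a minimal sketch; the field names and the set of postal-box phrases are assumptions, not any specific AVS schema:

```python
import re

# Flag building/street fields that contain postal-box text such as
# "PO Box 123". Field names here are illustrative assumptions.
PO_BOX_RE = re.compile(r"\b(?:P\.?\s?O\.?\s?BOX|GPO\s?BOX|LOCKED\s?BAG)\b",
                       re.IGNORECASE)

def flag_po_box_in_street(record: dict) -> list:
    """Return a list of fields that look like postal rather than residential data."""
    issues = []
    for field in ("building_name", "street_name"):
        value = record.get(field) or ""
        if PO_BOX_RE.search(value):
            issues.append(f"{field} looks like a postal address: {value!r}")
    return issues

print(flag_po_box_in_street({"street_name": "PO Box 123"}))  # flagged
print(flag_po_box_in_street({"street_name": "12 High St"}))  # []
```

Running a check like this at enrolment time means the student is still on hand to supply the residential address.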
Missing USI
USIs are becoming mandatory in more and more situations. The short course exemption ended on 1/1/2018, and the only remaining exemptions are for learners in security-related professions, due to other legislation which explicitly prevents them from being tracked on insecure systems. NCVER is pushing for USIs to be collected prior to starting training rather than just before parchments are issued, so that USIs are available in the initial and midway data sets used for funding claims. One could argue whether there is a legislative basis for this, but the current “path of least resistance” for RTOs is to collect USIs prior to starting, or on the first day where practical.
Missing reason for study
This is a regular part of the student data in enrolment forms, but because it sits in the enrolment details rather than the student details (a student can do several courses for different reasons), it often gets missed out of the process. RTOs should ensure that their processes systematically collect and record this value. It is sometimes possible to make reasonable assumptions at the end of the year for some courses based on the context of the course, but this is not always the case, and mass after-the-fact collection is usually impractical, so it is important that this is done continuously during the year.
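A periodic scan for blank reason-for-study values is easy to run during the year. This sketch assumes hypothetical enrolment records with `enrolment_id` and `reason_for_study` fields; real systems will differ:

```python
# Sketch: find enrolments with a blank or absent reason-for-study so they
# can be chased continuously rather than at year end. Field names are
# assumptions, not any particular student management system's schema.

def missing_reason_for_study(enrolments: list) -> list:
    """Return IDs of enrolments where reason-for-study is blank or absent."""
    return [e["enrolment_id"] for e in enrolments
            if not (e.get("reason_for_study") or "").strip()]

enrolments = [
    {"enrolment_id": "E1", "reason_for_study": "01"},  # supplied
    {"enrolment_id": "E2", "reason_for_study": ""},    # blank
    {"enrolment_id": "E3"},                            # missing entirely
]
print(missing_reason_for_study(enrolments))  # ['E2', 'E3']
```

Note the check is per enrolment, not per student, which matches the point above that the same student can have different reasons for different courses.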
No result for completed unit
This is a common issue for courses with flexible delivery or a poor participation rate (for whatever reason). Units which start on schedule may be delayed in finishing, or students stop progressing without giving notice of their intention to withdraw. Thus, many units are left supposedly finished but with no final outcome. Some of our systems automatically assume that these units will finish on the day after the end of the reporting period, but a better solution is to put processes in place to identify overdue outcomes and chase up the student or trainer, so the data reflects the real situation – either extending the scheduled finish dates or formally withdrawing the student.
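An overdue-outcome report of the kind described above can be sketched as follows. The unit records and their field names are illustrative assumptions, not a real SMS schema:

```python
from datetime import date

# Sketch of an overdue-outcome report: units whose scheduled end date has
# passed but which still have no final outcome recorded, so staff can chase
# the student or trainer. Field names are illustrative assumptions.

def overdue_units(units: list, today: date) -> list:
    """Return units past their scheduled end with no final outcome."""
    return [u for u in units
            if u.get("final_outcome") is None and u["scheduled_end"] < today]

units = [
    {"unit": "BSBWHS201", "scheduled_end": date(2017, 11, 30), "final_outcome": None},
    {"unit": "BSBCMM201", "scheduled_end": date(2018, 3, 1), "final_outcome": None},
]
report = overdue_units(units, today=date(2017, 12, 31))
print([u["unit"] for u in report])  # ['BSBWHS201']
```

Run regularly (say, monthly), a report like this surfaces stalled units while extending the finish date or formally withdrawing the student is still straightforward.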