Using Analytics To Profile Classes

A 12-minute guided tour of the new Class Profile reports in PowerBI

Over the last twelve months St Andrew’s College has invested significant resourcing into developing web-based reporting tools delivered via Microsoft’s PowerBI.com interface, used by teachers, managers and administration staff alike. It’s worth reading some earlier posts about the rationale behind this approach and its features; some of the key reasons we settled on this solution include:

  • It’s browser based – you can access it from “anywhere” and see live data. You can also bookmark certain reports in your browser for near instant access.
  • There is also an app available (iPhone/iPad/Android/Windows10) so the data is accessible anytime / anywhere
  • We can tweak reports / visuals quickly and easily, based off feedback from stakeholders
  • Being browser based, you don’t need a local file on your computer that is “out of date” once a new version with improved features is built. What you see is always the “latest version”
  • It’s part of our existing Office365 Suite, so our existing username/password logs you into the reports.
  • Security permissions are centrally managed based off AD users and role-based groups, including the use of SQL Row Level Security.
  • It connects to our on-premise MS-SQL Server, allowing for scheduled data updates (hourly / daily).
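For readers curious what row level security means in practice: conceptually, every query a report issues is filtered by the signed-in user before any rows are returned. The sketch below is a minimal illustration of the idea only; the data, roles and teacher codes are made up, and the real implementation lives in SQL Server, not Python:

```python
# Conceptual sketch of row-level security: filter result rows by the
# signed-in user's identity and role before any data reaches a report.
# All names (roles, teacher codes, fields) are illustrative only.

RESULTS = [
    {"student": "A. Smith", "teacher_code": "JBL", "gca": 3.4},
    {"student": "B. Jones", "teacher_code": "JBL", "gca": 2.8},
    {"student": "C. Lee",   "teacher_code": "MKZ", "gca": 4.1},
]

def rows_for_user(username: str, role: str):
    """Classroom teachers see only their own classes; deans see everything."""
    if role == "dean":
        return RESULTS
    return [r for r in RESULTS if r["teacher_code"] == username]

print(len(rows_for_user("JBL", "teacher")))  # 2
print(len(rows_for_user("MKZ", "dean")))     # 3
```

The same principle is what later lets a full-time classroom teacher’s view lock automatically to their own students while an Academic Dean sees a wider set.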

Throughout Term 3 the team focused on delivering a new set of reports for Mr Dean McKenzie, one of the Assistant Principals at the College, with responsibilities for Data Analysis. He had provided some concept designs for how he would like the reports to look, along with the location of the majority of the data in our Student Management System (Synergetic). Additionally, there had been changes to how the Grade Credit Average (GCA) was going to be calculated moving forward, which would see individual subjects have a GCA calculated for the first time, along with more rigid definitions of how various credits would be counted.

All of this logic had to be encoded into the ETL process that transferred the raw data from Synergetic’s MS-SQL database into our Data Warehouse, automatically calculating the results on a daily basis and making them available to staff via the web interface of PowerBI. The end result is the following pages in a single report:
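For those interested in what that calculation involves, a GCA is essentially a credit-weighted average of grade points. The real logic lives in SQL inside the ETL process; the sketch below is a simplified Python illustration, and the grade-to-point mapping shown is an assumed one rather than the College’s exact definition:

```python
# Illustrative GCA calculation: a credit-weighted average of grade points.
# The point values below are an assumed mapping, not the College's exact rules.
GRADE_POINTS = {"Not Achieved": 0, "Achieved": 2, "Merit": 3, "Excellence": 4}

def subject_gca(results):
    """results: list of (grade, credits) tuples for one subject's standards."""
    total_credits = sum(credits for _, credits in results)
    if total_credits == 0:
        return None  # no completed assessment yet
    weighted = sum(GRADE_POINTS[grade] * credits for grade, credits in results)
    return round(weighted / total_credits, 2)

# e.g. three standards completed in one subject:
print(subject_gca([("Merit", 4), ("Excellence", 3), ("Achieved", 3)]))  # 3.0
```

Running this per subject, per student, per year is what produces the subject-by-subject GCA tables and bar graphs described below.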

Subject GCAs Per Student:

 Showing the results for a student in the current year and the previous year (click to enlarge)

This report is designed to allow a teacher to quickly select a student in their class and compare their GCA subject by subject, along with seeing how they performed the previous year. If you click the left hand image above to enlarge you will see numbers which represent:

  1. A selector for the current or previous year of GCA data for a student
  2. The teacher code (for full time classroom teachers this is automatically locked to their username meaning they only see the students in their classes. For Academic Deans or managers, they can see across a wider set of students).
  3. A year level filter, allowing a teacher to quickly narrow the selection of students down by the year level e.g. their Yr12 Maths students or Yr13 History students.
  4. The list of students arranged alphabetically that are taught by the teacher in the year level they have selected. Note these are colour coded pink/blue to give a visual cue to the teacher if they are looking for a male/female student in their class.
  5. A table showing each subject taken by the selected student, and their GCA (either current year or previous year depending on selection in #1 above)
  6. A bar graph visually displaying the same data as #5 but designed to quickly identify subjects of particular strength or weakness for the selected student. Note that the subjects are listed alphabetically and not by highest GCA to lowest, allowing for a “cityscape” effect.
  7. The name of the current student that is selected and the class code of the teacher who is browsing the report (useful if a teacher happens to teach a student a number of different classes).

The aim of this report is to allow a classroom teacher to quickly scan through the students in their class and identify their relative strengths/weaknesses in different subjects. It also enables them to answer a question teachers commonly ask: “I’ve a student who I think is underperforming in my class – how are they doing in other classes?”

GCA – Then and Now:

gca-then-and-now

This report allows a teacher to quickly see the individual students in their class ranked by GCA from highest to lowest and compare the current year GCA in the teacher’s subject with the student’s overall GCA from the previous year. This allows a teacher, at a glance, to see who are their best performing students based off completed assessment but to also pick up if there is significant variance between previous and current performance.

In the above example, the top bar graph shows the 4th ranked student in the class (in pink) was actually the 6th ranked student (relative to the class) the previous year. Whilst this is a very small sample size, what this can show is a student who is possibly underperforming or showing improved performance relative to the students in their class – all helpful information for a teacher to consider.

The red numbers in the report are:

  1. Showing the classes taught by the logged in teacher. Note that this also includes co-curricular options that the teacher coaches/manages, allowing them to review academic performance for all students that they have contact time with (this was actually the #1 request we had from teachers after launching the Tutor Quadrant Dashboard earlier this year – the ability to see results for students in all areas of their involvement at school).
  2. A gender score card. This is simply showing the number of males / females in the class.
  3. Bar graph (ranked) showing students by GCA, highest to lowest for the subject taught by the teacher and in the current year.
  4. Bar graph (ranked) showing the same students but their previous year GCA across all subjects, again ranked highest to lowest.
  5. A table giving a breakdown of the students in the class and their GCA in individual subjects. This is helpful if a teacher wanted to compare how a student was doing in a similar subject e.g. an English teacher seeing if a student was performing comparably in other literacy intensive subjects such as History.

Comparative Scores:

This was perhaps one of the most complex and ambitious pages to put together, as it potentially combines academic data from Yr9 Entrance Testing, PAT results (Yr9-10), MidYis results (Yr9-11) and NCEA data by GCA. Additionally, it needed to give a breakdown of priority learners based on identified learning needs as well as ethnicity.

The real challenge was thrown down by Mr McKenzie when he said in an ideal world he would like a teacher to be able to select from any of the historical data and have it displayed on the same graph. We explored a wide range of ideas on how we could best implement this vision and in the end the following is what was achieved:

 Showing the results for a Yr13 Calculus class; on the left is the students’ Yr9 English Entrance testing and on the right their Yr13 Calculus GCA (click to enlarge)

Visually, there is a lot going on in this report and it will take the user quite some time to fully understand how best to extract what they are looking for. For this reason, all pages on these reports have user guides in text boxes and we have labelled each selection field numerically in the order that a teacher should select their data. This helps guide them through the process. In the left hand screenshot above (click to enlarge) I have added red numbers to highlight features of this report:

  1. The academic “score type” and “sub-score type” the teacher wants to see. If a teacher chose Yr9 PAT then the sub-score type would automatically display the options available (i.e. English, Maths and Vocabulary). Similarly, if a teacher chose GCA as the score type they could choose the GCA for whatever subject they wished to check. The recent addition of search boxes in PowerBI makes this process far easier to manage when there are a lot of options to choose from.
  2. Priority Learners – this is still being developed, but for now it highlights any students with data recorded in Synergetic, from diagnosis through to strategies to use in the classroom to support their learning.
  3. Ethnicity breakdown for the students in the class displayed in a pie chart and table below, along with the names of Māori and Pasifika students in the two boxes in the bottom right of the report.
  4. The bar chart that shows the students ranked by whichever score type the teacher has selected. Note that there are no axes on this graph, a necessary requirement given the academic data does not always share identical measures/scores. However, by placing the cursor over a student you can easily see their score e.g. a stanine for a PAT test, or a 2 decimal place GCA score for NCEA results. Additionally, there are visual cues on this graph that further help identify students with listed learning support needs or who identify as Māori or Pasifika.
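A note on how that ranking works without a shared axis: because stanines (1-9) and GCA scores (to 2 decimal places) sit on entirely different scales, the bar chart only ever needs the rank order within the selected score type, never a common unit. Conceptually (with made-up student data):

```python
def ranked_for_chart(students):
    """Order students by whichever score type was selected. The bar chart
    uses only this ordering, so stanines (1-9) and GCAs never need to
    share an axis; hovering reveals the underlying score."""
    return sorted(students, key=lambda s: s["score"], reverse=True)

# Illustrative Yr9 PAT stanines for three (fictional) students:
yr9_pat = [
    {"name": "C. Lee", "score": 7},
    {"name": "A. Smith", "score": 9},
    {"name": "B. Jones", "score": 4},
]
print([s["name"] for s in ranked_for_chart(yr9_pat)])
# ['A. Smith', 'C. Lee', 'B. Jones']
```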

A reminder that all of this data refreshes automatically each night so the teacher is always seeing the latest information on their students. Should a student leave/join the class the data is refreshed to reflect this.

NCEA Results Analysis By Standard:

One of the most requested features by the Senior Leadership Group and Heads of Department at St Andrew’s is an easy way to compare, standard by standard, how our students and teachers performed compared to similar schools around New Zealand (similar schools being defined as Decile 8-10). One of the challenges has been getting access to neatly formatted data that contains all NCEA standards, not just individual results which could be downloaded from the NZQA website.

After working with NZQA’s statistics team, we have been able to obtain this data and run it through our ETL process into the data warehouse, thus allowing this comparison to be easily done by classroom teachers:

ncea-analysis

Again, a classroom teacher would select a class they teach, and then narrow it down to a NCEA standard they wished to compare by following the numerical workflow selections on the left-hand side of the report. Once completed, this presents the four horizontal bar charts that show:

  • Top left = All students being compared, the top bar is comparative schools nationally (all students who sat this NCEA standard in Decile 8-10 schools). The middle bar is the performance of the St Andrew’s cohort, in this case all other Yr12 history students taught by all teachers. The bottom bar is the performance of the students in this teacher’s class.
  • Bottom left = Performance of Māori/Pasifika students (again broken down by national data, cohort and individual classroom teacher).
  • Top right = male students.
  • Bottom right = female students.
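Behind each of those bars is a simple aggregation: the percentage of each grade within a group (national, cohort or class). A rough sketch of that calculation, with illustrative data rather than real results:

```python
from collections import Counter

def grade_distribution(grades):
    """Return each grade as a percentage of the group - one horizontal bar."""
    counts = Counter(grades)
    total = len(grades)
    return {g: round(100 * n / total, 1) for g, n in counts.items()}

# Made-up grade lists for two of the three comparison groups:
national = ["Achieved"] * 50 + ["Merit"] * 30 + ["Excellence"] * 20
my_class = ["Achieved"] * 5 + ["Merit"] * 3 + ["Excellence"] * 2

print(grade_distribution(national))
# {'Achieved': 50.0, 'Merit': 30.0, 'Excellence': 20.0}
```

Computing this once for the national data, once for the St Andrew’s cohort and once for the teacher’s own class gives the three stacked bars in each chart.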

The results for these standards can be filtered to show either internal assessments only or formative assessment results for not-yet-sat external exams, providing students with a comparative score with the national data for that external standard from the previous year. This could work as a motivator for them before their external exams.

The red numbers in the screenshot are:

  1. Search box for the teacher to select the class code they want to analyse (again, searching makes this really easy). There are two pre-selected options visible which are the previous year’s national data and the StAC cohort data. A teacher could, in theory, turn these off if they simply want to display only their own class results and not compare them.
  2. Once a class is selected, this table automatically shows only standards that have a result recorded in the Synergetic database. This helps a teacher know which standard number to search for.
  3. Using the knowledge above, the teacher searches for the standard they want to analyse e.g. “HIST2” would show all Level 2 history standards allowing a teacher to quickly click through their results.
  4. The comparative graphs (as explained above). One of the neat features of this is if a teacher wanted to drill down and see which students in their class gained a certain result, they need only click the result and the list of students in the table filters immediately:
filtering-ncea

By clicking the silver “merit” grade in the bottom right graph (females) the table at the bottom filters to show the name of the student(s), allowing a teacher to quickly search through student names by result.

Detailed NCEA Results By Standard:

This final report is another one that is designed to quickly profile the range of ability of the students a teacher sees. However, it also delivers on one of the other most common requests from teachers, e.g. “I want to know how my Level 3 Geography students did in Level 2 Geography at the start of the year / or an internal assessment so I can better differentiate the teaching to meet their needs.” To date, we have struggled to graphically display a ranked past/present comparison tool for teachers, and the security relationships are actually quite complex (just because you’re teaching the student for Level 3 Geography, for instance, does not mean you were their Level 2 Geography teacher).

This has now been displayed in the following reports:

 Showing the results for a Yr13 Geography class internal assessment 3.3 (91428) on the left; on the right is the students’ performance from the previous year for the internal assessment 2.3 (91242). (click to enlarge)

These reports contain a number of visual cues. In keeping with all our NCEA reporting in PowerBI, the colour coding is consistent: Gold = Excellence; Silver = Merit; Bronze = Achievement; Red = Not Achieved. Additionally, the bars are varied in height and ranked highest to lowest allowing a teacher to very quickly pick up the grade spread of their class at a glance. The red numbers in the screenshot on the left (click to enlarge) are:

  1. The teacher selects the NCEA standard they wish to analyse
  2. They select which of their classes they wish to filter by (many of our senior teachers teach two of the same year level/subject so this is helpful). The list of classes is pre-populated automatically, based on the username the teacher signs in as, making this a very simple process.
  3. The bar chart orders the students by result, highest to lowest (as explained above).

Concluding Thoughts:

As evidenced above, a huge amount of work and effort has gone into these reports, and they certainly represent the progression of our thinking over the last few years about the key data we need to provide to classroom teachers. A key objective of this analytics project at St Andrew’s is to provide easy access to the data for teachers on an “anytime, anywhere” basis and for it to be easily comprehensible.

As more teachers start to use these reports on a regular basis I anticipate feedback will flow and new feature requests will emerge. The beauty of the current setup is that we can release this version of the reporting to teachers and then easily add new features, which become automatically available the next time they log in – there is no need to update or install new files. To further support teachers, we are now embedding a “Tour of the Dashboard” video into the landing page of each new report:

dashboard-tour

One of the great things about being browser based is the ability to embed third party content, in this case a YouTube video explaining to teachers how they can use this new report.

These embedded videos mean that should teachers forget how to use the report, or are new to the College, they can essentially “self-train” on how they can use the report with their classes.

I am genuinely excited about this level of reporting and the benefits it will have not just for our teachers, but for our students too!

 

Online Voting For Student Leaders

The aim of this blog is always to share some of the things going on with technology at St Andrew’s College and, wherever possible, provide some ideas and inspirations for other schools as well. Some of these innovations take considerable planning and resources such as our work with PowerBI for Educational Analytics, whereas others like this post about online voting are relatively simple.

Recently, the College’s new Head of Senior College, Mr John Ruge, approached me about moving Prefect voting to an online system. Immediately, there were some questions around how to do this securely and fairly. Paramount in my thinking was ensuring:

  • Results were anonymous
  • Students and staff could only vote once
  • Restrictions could be placed on the number of potential Prefects one could vote for
  • Time limits could be enforced for when voting stopped.

A number of people recommended using something like Google Forms or Office365 Forms, both of which are excellent products when used for what they were designed for. The major limitation, however, was that there was no way to ensure the voting would be both anonymous and limited to one vote per person. I decided to cast my net a little wider and utilise the excellent Techies For Schools NZ Google Group as well as the Australian MITIE Forum to see if I could crowdsource some alternatives. Some of these included:

It was the latter that caught my attention because it was suggested that using some of the more advanced features around emailing would achieve my main aims of anonymity and restrictions to one vote per person.

SurveyMonkey Setup For Prefect Voting:

We used a basic MS-Query to extract student and staff email addresses and first/last names from Synergetic, our Student Management System. We then loaded these into a CSV file with the first row indicating the header fields:

CSV
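For anyone wanting to replicate this, producing the import file is straightforward in any scripting language. A minimal Python sketch (the column headings and addresses shown are illustrative; check SurveyMonkey’s import screen for the exact fields it expects):

```python
import csv

# Rows as they might come back from the MS-Query extract against Synergetic.
# These addresses and names are illustrative, not real data.
voters = [
    ("j.smith@example.school.nz", "John", "Smith"),
    ("a.brown@example.school.nz", "Alice", "Brown"),
]

with open("voters.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Email Address", "First Name", "Last Name"])  # header row
    writer.writerows(voters)
```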

We needed to analyse votes from three different groups of people:

  • Secondary School Teaching Staff
  • Current Year 13 Prefects
  • Current Year 12 Students

Consequently, we decided to make three identical surveys, but have the different groups above loaded into separate CSV files. Upon setting these up in SurveyMonkey we needed to select “Send by Email” to ensure unique links generated for each voter, rather than a generic link that could be forwarded to people outside the intended voters, or used more than once by the same person:

Send by Email

Choosing “Send by Email” was a key part of achieving the defined aims of online voting.

When choosing “Send by Email” you are invited to submit users from a range of sources and we used the CSV file we had already generated:

Import CSV

You are then able to compose an HTML message to the voter that is sent by SurveyMonkey based off the information from the CSV:

Composition.png

Note the salutation: the use of variables [FirstName] and [LastName] will personalise each email based off the information from the CSV already loaded into SurveyMonkey

Numerous additional options can be set; those we made use of to meet our aims included:

  • Changes: Respondents can change their answers on any survey page until they complete the survey (alternatively you can allow no changes at all, right through to allowing changes after the survey has been submitted but before the cut-off date)
  • Anonymous Responses: exclude ALL respondent information (names, email addresses, IP addresses, and custom data) from your survey results (we chose this, but you can collect all of the above information if you wished)
  • Cutoff Date & Time: This was important to ensure timely voting:

Cutoff Date

The end result, when sent, provided a really smart looking HTML email that encouraged staff and students to vote for 2017 Prefect Leaders:

SME Vote Now

Note the personalised salutation, the HTML “Vote Now” button and the footer indicating the URL is unique to the recipient.

When votes are opened you can track in real time the number of votes completed, as well as email opens and partial votes, for example:

Vote Stats

One of the final tweaks I learnt through this process was how to limit or restrict the number of choices a voter could make from a multi-choice question. This was significant as voters were allowed to select up to twenty student names from the long list of candidates. There were some help instructions available, but the key areas to check were in the options of the multi-choice question:

Multi Choice Question

Note that:

  1. For this to work “Require an Answer to This Question” is ticked
  2. You choose “at most” for number of choices if you want voters to be able to select up to but not exceeding a number of candidates
  3. You can customise the error message if a voter chooses more than the allowed number of candidates when voting.
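The “at most” rule is simple enough to express in code; below is a sketch of the equivalent validation logic (the limit and messages are our own wording, and SurveyMonkey of course handles all of this for you):

```python
MAX_CHOICES = 20  # voters could select up to twenty candidates

def validate_vote(selected):
    """Return an error message for an invalid ballot, or None if valid."""
    if len(selected) == 0:
        return "Please select at least one candidate."  # answer is required
    if len(selected) > MAX_CHOICES:
        return f"You may select at most {MAX_CHOICES} candidates."
    return None

print(validate_vote(["Candidate A", "Candidate B"]))  # None (valid ballot)
```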

With voting completed, it was easy to export the graphs showing the candidates with the most votes as a PDF and allow the leadership team to analyse the data. Now that we know we can generate personalised, single-use and anonymous voting through SurveyMonkey, I anticipate we will use this in other areas as well.

PowerBI Supports StAC’s Pastoral Care Programmes

PowerBI

For the last 12 months we have been actively exploring how Microsoft’s Business Intelligence product called PowerBI could be used at St Andrew’s College. I have blogged about our initial experimentations with this here and that post would be a useful piece of pre-reading to provide context to this post.

This week has seen the culmination of a huge amount of work over the last four months, with Tutors being given access to what is being called the Tutor Quadrant Report. Below is a screencast showing some of the features of this (with identifying details blanked out):

Demonstrating The Tutor Quadrant Report

Initial feedback from Tutors has been very positive as they recognise this new report presents a significant step forward in terms of:

  • Ease of access – using their existing school username/password to access the report on any device with a browser and anywhere (they are not restricted to being on the College campus).
  • The collation of disparate data presented in an easily comprehensible, highly visual format. Previously, to obtain information on attendance, NCEA results, discipline and Fortnightly Notes would have required dozens of clicks, different windows and reports, and even using different platforms (both Synergetic and Sharepoint).
  • Speed – the reports load very quickly in the browser.

It is satisfying to hear this type of feedback given the significant level of investment and effort that has been made in developing this platform from where we were 12 months ago.

AIS.png

For some understanding of this journey from Crystal Reporting through to PowerBI, the following video is a quick version of a presentation Mr Dave Neilson and I gave at the Association of Independent Schools of New South Wales’ annual ICT Leadership Conference 2016 in Canberra in May. The theme of the conference was Supporting Digital School Improvement and you can download a copy of the slides from this presentation here.

The abbreviated presentation from AIS NSW ICT Leadership Conference 2016

Some of the key points from this presentation include:

  • There are multiple ways you can use Microsoft PowerBI; we have explored two methods of deployment:
    • Manually generating reports in the free PowerBI Desktop App and publishing and storing content in the Azure blob in the cloud (quick and easy, but limited security options)
    • Developing an on-premise data warehouse and using ETL processes to extract data from various sources before loading into a tabular data model and connecting to the cloud via SSAS Gateway Connector. This is also very secure when implemented with row level security.
  • PowerBi was preferred at St Andrew’s College for a number of reasons, including:
    • It’s scalable – educational pricing for Pro licenses is affordable (~$4/m per user) and easily managed within the Office365 Licensing Administration area
    • It is easy to access – teachers can use their existing school username/password so there are limited barriers to entry and it is accessible via a browser from any device.
  • Visualisations of data are excellent. The ability to transform what was previously stored in spreadsheets and rows and columns of data into easily comprehensible displays is critical. There is a range of default visualisations as well as third party generated ones.

Next Steps:

time to talk

Time to talk! The power of a visual to highlight a student trending in the wrong direction.

With the release of the Tutor Quadrant Report, planning is already underway for the development of further reports for both teaching staff as well as administrative staff. The migration to PowerBI of an existing Tableau report that our Director of Development used has been completed and this enables her to now access data refreshed daily and drill down using the self-service elements of PowerBI. Pleasingly, she has already identified a number of enhancements she would like to see – this is something we anticipated would occur once the end users started getting more meaningful access to the data.

Additionally, rebuilding a very detailed NCEA report similar to what we explored in the original proof of concept will be important for academic staff to monitor progress as the year progresses. Ideally, we should see some accelerated development now that the backend infrastructure is in place.

Lastly, there is rapid development happening on the PowerBI platform all the time. One of the most exciting developments is the ability to embed reports into an existing website or portal and even apps, opening up a huge range of possibilities where we could securely share reports like those above with students and parents. For now, that is in the medium to long term planning, as we focus on rapidly deploying further PowerBI reports for the College staff.

The Power Of Data

PowerBI

Increasingly, most organisations are seeking to “do more” with the data they collect and store and in this respect, St Andrew’s College is no different. For the last two years I have been looking at a number of tools that would allow us to easily collect, analyse, display and share critical information amongst key stakeholders.

As part of this investigation we have looked at tools such as Crystal Reports, Tableau and Microsoft Excel connected via MS-Query to our MS-SQL server and outputting pivot tables for analysis. Here are some examples of this:

In the end, we decided to progress with Microsoft’s PowerBI solution, which is described on their website as:

Power BI transforms your company’s data into rich visuals for you to collect and organize so you can focus on what matters to you. Stay in the know, spot trends as they happen, and push your business further.

Some of the reasons we selected this solution include:

  • It’s browser based – you can access it from “anywhere” and see live data. You can also bookmark certain reports in your browser for near instant access.
  • There is also an app available (iPhone/iPad/Android/Windows10) so the data is accessible anytime / anywhere
  • We can tweak reports / visuals quickly and easily, based off feedback from stakeholders
  • Being browser based, you don’t need a local file on your computer that is “out of date” once a new version with improved features is built. What you see is always the “latest version”
  • It’s part of our existing Office365 Suite, so our existing username/password logs you into the reports.
  • Security permissions are centrally managed based off AD users and role based groups.
  • It connects to our on-premise MS-SQL Server, allowing for scheduled data updates (hourly / daily).

To best demonstrate the power of this tool, we built a proof of concept based around analysing NCEA student achievement, in particular University Entrance requirements and course/subject endorsement. Here is a screencast walking through the tool:

Note: identifiable data such as student names / ID numbers have been blurred out in this video.

To accelerate the development of some of this reporting, we have:

  • Partnered with DataCom New Zealand and are getting expert advice from their Business Intelligence team in terms of configuring the ETL process via Microsoft SSIS, building a tabular data model and connecting to PowerBI in the cloud for presenting the data to staff.
  • Hired a new staff member to join the ICT Services team in the role of Business Intelligence Report Writer. The responsibilities for this role will be to interface with the various business units in the College (e.g. Academic Data, Enrolments, Development, Communications etc), understand their reporting requirements and then build the reports in PowerBI.

The key with any Business Intelligence project is to help inform the decision-making process and not just be contented with pretty visualisations. To that end, a robust conversation and scoping of what stakeholders need to see must be established. However, with a wide range of visualisations being added regularly to PowerBI, there are a number of ways to present data in an easily comprehensible format. One of my favourites is a 3D, interactive globe that significantly improves on the PowerMap in Excel (see above):

This visualisation could be very useful in mapping where our current students or Old Collegians live, or identifying where donations come from globally, mapped by either volume or value.

We are in the very early stages of this project, yet the potential is very obvious to the leadership teams at St Andrew’s. The focus over the next few weeks will be configuring the backend infrastructure: the ETL processes (Extraction, Transformation, Loading), the Data Warehouse and the connectivity into PowerBI. Subsequently, the rapid development of reporting dashboards will proceed.

If this interests you, please do check back regularly on the blog for updates or drop a comment below to discuss further.

Integrating Student Data Into Moodle

moodle

Moodle is the Learning Management System (LMS) used by St Andrew’s College at all year levels and is renowned for being very customisable due to its Open Source code base. Recently, we partnered with Catalyst NZ, experts in Open Source Software (OSS), on an integration project to extract data from the College’s MS-SQL based Student Management System (SMS) called Synergetic and display this to students directly in Moodle.

The project started with a request from the Deputy Rector, Mr Roland Burrows, for students to be able to see their Fortnightly Notes scores directly. This form of reporting is new in 2015 and the rationale behind these Fortnightly Notes is explained by Mr Burrows:

They provide the opportunity for teachers to regularly report to parents on the contribution that their son/daughter is making to his/her own learning through their attitude and effort

Parents can log into the Parent Portal to see a PDF summary of these scores that looks something like this:

A redacted example of a student’s Fortnightly Notes score

As the name implies, the Parent Portal is not available to students directly, so Moodle was an obvious choice for displaying this data to students. Whilst considering the possible layout options for this data, we decided against replicating the table view that parents see above and instead opted to present a line graph that would visually reflect each student’s attitude and effort in every subject across the entire year. Additionally, displaying Student Attendance information on a per-class basis, along with NCEA results-to-date, would add real value for students at St Andrew’s, so these features were added to the scope of the project.

CONFIGURATION:

To achieve these outcomes, we needed to partner with Catalyst for the Moodle configuration, and they proceeded to write a custom plugin that would extract data from the MS-SQL database powering Synergetic, import it into Moodle’s MySQL database and then present it to the students. To achieve this, three custom SQL views were created that collated only the information to be displayed in Moodle: NCEA summaries, Attendance percentages and Fortnightly Notes scores. A sample of this data can be seen in this gallery:

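To give a sense of what one of those views collates, the attendance figure is simply periods attended over total periods, per student per class. A hedged Python equivalent of that summary (the field names and data are illustrative, not the actual Synergetic schema):

```python
# Sketch of the attendance summary one of the SQL views produces for each
# (student, class) pair: periods present as a percentage of total periods.
# Field names and values are illustrative only.

attendance_rows = [
    {"student_id": 101, "class": "11ENG", "present": True},
    {"student_id": 101, "class": "11ENG", "present": True},
    {"student_id": 101, "class": "11ENG", "present": False},
    {"student_id": 101, "class": "11MAT", "present": True},
]

def attendance_percentages(rows):
    totals, present = {}, {}
    for r in rows:
        key = (r["student_id"], r["class"])
        totals[key] = totals.get(key, 0) + 1
        if r["present"]:
            present[key] = present.get(key, 0) + 1
    return {k: round(100 * present.get(k, 0) / t, 1) for k, t in totals.items()}

print(attendance_percentages(attendance_rows))
# {(101, '11ENG'): 66.7, (101, '11MAT'): 100.0}
```

The view exposes exactly this shape of summary to the Moodle plugin, which then renders it per-class on each course page.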

New navigation options under Site Pages

Lastly, the Moodle theme used on the St Andrew’s College Moodle site was modified to include new navigation for students to access this information. Extensive testing was then completed on our development Moodle site to iron out bugs and ensure that the information was being displayed correctly.

DISPLAYING THE DATA:

It was decided to graph a student’s Fortnightly Notes score and Attendance information at the top of each course page in Moodle, so they could immediately see their effort and attendance on a per-class basis. This can be seen in the following animated GIF image as the graphs are generated in real time:

Gif1

An animated image showing the Fortnightly Notes line graph and Attendance pie chart being generated on a student’s course page

However, to also replicate the Parent Portal summary view, a student could see the scores for all classes on a new page added under the “Site Pages” navigation menu:

Summary

Summary view of Fortnightly Notes scores and Class Attendance

For students working towards NCEA they can now see their recorded grades directly in their Moodle account:

A Yr11 Student’s NCEA results

FINAL THOUGHTS:

Working on this project was exciting for a number of reasons. Firstly, it was our initial attempt to extract meaningful student data from our SMS and display it elsewhere. Secondly, we believe that providing this information to students visually, and displaying it to them every single time they log into their Moodle course pages, will encourage them with their effort and attitude in class. Lastly, it represents the tip of the iceberg in terms of what is possible to present to students in Moodle.

In many ways, this was a proof of concept: could we extract data and present it meaningfully to students in one of our primary learning platforms? Thanks to the customisation options within Moodle, we were able to achieve this and with other recent testing of Microsoft’s new Business Intelligence platform called PowerBI, we anticipate being able to add even more visual information to students in this way (for those interested, the animated graphs are powered by the open source rGraph tool).