A 12-minute guided tour of the new Class Profile reports in PowerBI
Over the last twelve months St Andrew’s College has invested significant resourcing into developing web-based reporting tools delivered via Microsoft’s PowerBI.com interface, to be used by teachers, managers and administration staff alike. It’s worth reading some earlier posts about the rationale and features behind why we have gone down this path; however, some of the key reasons we settled on this solution include:
- It’s browser-based – you can access it from “anywhere” and see live data. You can also bookmark certain reports in your browser for near-instant access.
- There is also an app available (iPhone/iPad/Android/Windows10) so the data is accessible anytime / anywhere
- We can tweak reports / visuals quickly and easily, based off feedback from stakeholders
- Being browser-based, you don’t need a local file on your computer that is “out of date” once a new version with improved features is built. What you see is always the “latest version”.
- It’s part of our existing Office365 Suite, so our existing username/password logs you into the reports.
- Security permissions are centrally managed based off AD users and role-based groups, including the use of SQL Row Level Security.
- It connects to our on-premise MS-SQL Server, allowing for scheduled data updates (hourly / daily).
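For the technically curious, that last point is the backbone of everything that follows: a scheduled job pulls the raw data from the on-premise MS-SQL Server into the Data Warehouse. The sketch below is a minimal illustration only – the server, table and column names are hypothetical placeholders, not our actual schema.

```python
# Minimal sketch of a scheduled extract from an on-premise MS-SQL Server.
# Server, table and column names are hypothetical placeholders.
import pyodbc
import pandas as pd

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql.school.internal;"   # hypothetical server
    "DATABASE=Synergetic;"
    "Trusted_Connection=yes;"       # authenticate as the AD service account
)

def extract_results() -> pd.DataFrame:
    """Pull the raw assessment results that feed the warehouse load."""
    query = """
        SELECT StudentID, SubjectCode, StandardCode, Credits, Grade, ResultYear
        FROM dbo.StudentResults            -- hypothetical source table
        WHERE ResultYear >= YEAR(GETDATE()) - 1
    """
    conn = pyodbc.connect(CONN_STR)
    try:
        return pd.read_sql(query, conn)
    finally:
        conn.close()

if __name__ == "__main__":
    # A scheduler (e.g. SQL Server Agent) runs this hourly or nightly as the
    # first step of the ETL process that keeps the PowerBI reports current.
    print(f"Extracted {len(extract_results())} result rows")
```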
Throughout Term 3 the team have been focused on delivering a new set of reports for Mr Dean McKenzie, one of the Assistant Principals at the College, who has responsibility for Data Analysis. He had provided some concept designs for how he would like the reports to look, along with the location of the majority of the data in our Student Management System (Synergetic). Additionally, there had been changes to how the Grade Credit Average (GCA) was going to be calculated moving forward, which would see individual subjects have a GCA calculated for the first time, along with more rigid definitions of how various credits would be counted.
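To give a flavour of the kind of calculation involved, the sketch below computes a per-subject GCA from raw credit results. The grade-point weighting and column names are assumptions for illustration only – they are not the College’s actual GCA rules, which have their own definitions of which credits count.

```python
# Illustrative only: a per-subject Grade Credit Average (GCA) for each student.
# The grade-point mapping and column names are assumptions, not the College's
# actual GCA definition.
import pandas as pd

GRADE_POINTS = {"E": 4, "M": 3, "A": 2, "N": 0}  # hypothetical weighting

def subject_gca(results: pd.DataFrame) -> pd.DataFrame:
    """results columns (assumed): StudentID, SubjectCode, ResultYear, Credits, Grade."""
    df = results.copy()
    df["Points"] = df["Grade"].map(GRADE_POINTS) * df["Credits"]
    grouped = df.groupby(["StudentID", "SubjectCode", "ResultYear"])
    gca = grouped["Points"].sum() / grouped["Credits"].sum()
    return gca.rename("GCA").round(2).reset_index()

# A couple of dummy rows to show the shape of the output:
sample = pd.DataFrame({
    "StudentID":   [101, 101, 101],
    "SubjectCode": ["HIS", "HIS", "MAT"],
    "ResultYear":  [2016, 2016, 2016],
    "Credits":     [4, 6, 5],
    "Grade":       ["M", "E", "A"],
})
print(subject_gca(sample))
```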
All of this GCA and credit-counting logic had to be encoded into the ETL process that transfers the raw data from Synergetic’s MS-SQL database into our Data Warehouse, automatically calculating the results on a daily basis and making them available to staff via the web interface of PowerBI. The end result is the following pages in a single report:
Subject GCAs Per Student:
Showing the results for a student in the current year and the previous year (click to enlarge)
This report is designed to allow a teacher to quickly select a student in their class and compare their GCA subject by subject, along with seeing how they performed the previous year. If you click the left-hand image above to enlarge it, you will see numbers which represent:
- A selector for the current or previous year of GCA data for a student
- The teacher code (for full-time classroom teachers this is automatically locked to their username, meaning they only see the students in their classes; Academic Deans and managers can see across a wider set of students).
- A year level filter, allowing a teacher to quickly narrow the selection of students down by year level, e.g. their Yr12 Maths students or Yr13 History students.
- The list of students taught by the teacher in the selected year level, arranged alphabetically. Note these are colour coded pink/blue to give a visual cue to the teacher if they are looking for a male/female student in their class.
- A table showing each subject taken by the selected student, and their GCA (either current year or previous year depending on selection in #1 above)
- A bar graph visually displaying the same data as #5 but designed to quickly identify subjects of particular strength or weakness for the selected student. Note that the subjects are listed alphabetically and not by highest GCA to lowest, allowing for a “cityscape” effect.
- The name of the currently selected student and the class code of the teacher who is browsing the report (useful if a teacher happens to teach a student in a number of different classes).
The aim of this report is to allow a classroom teacher to quickly scan through the students in their class and identify their relative strengths/weaknesses in different subjects. It also enables them to answer a common question from teachers: “I’ve got a student who I think is underperforming in my class – how are they doing in other classes?”
GCA – Then and Now:
This report allows a teacher to quickly see the individual students in their class ranked by GCA from highest to lowest, and compare the current year GCA in the teacher’s subject with the student’s overall GCA from the previous year. This allows a teacher, at a glance, to see who their best-performing students are based off completed assessment, but also to pick up if there is significant variance between previous and current performance.
In the above example, the top bar graph shows the 4th ranked student in the class (in pink) was actually the 6th ranked student (relative to the class) the previous year. Whilst this is a very small sample size, what this can show is a student who is possibly underperforming or showing improved performance relative to the students in their class – all helpful information for a teacher to consider.
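At the data level, this comparison is simply the same set of students ranked on two different measures. A minimal sketch of how the warehouse could produce both rankings for a class (column names are hypothetical):

```python
# Sketch: rank the students in one class by current-year subject GCA and by
# previous-year overall GCA so the two bar charts can be compared side by side.
# Column names are hypothetical.
import pandas as pd

def class_rankings(class_gca: pd.DataFrame, prior_gca: pd.DataFrame) -> pd.DataFrame:
    """
    class_gca: StudentID, CurrentSubjectGCA  (this teacher's subject, this year)
    prior_gca: StudentID, PreviousOverallGCA (all subjects, previous year)
    """
    merged = class_gca.merge(prior_gca, on="StudentID", how="left")
    merged["CurrentRank"] = merged["CurrentSubjectGCA"].rank(ascending=False, method="min")
    merged["PreviousRank"] = merged["PreviousOverallGCA"].rank(ascending=False, method="min")
    # A large gap between the two ranks is the visual cue described above.
    merged["RankShift"] = merged["PreviousRank"] - merged["CurrentRank"]
    return merged.sort_values("CurrentRank")
```

A positive RankShift corresponds to the example above – a student sitting higher in the class this year than they did last year.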
The red numbers in the report are:
- Showing the classes taught by the logged-in teacher. Note that this also includes co-curricular options that the teacher coaches/manages, allowing them to review academic performance for all students that they have contact time with (this was actually the #1 request we had from teachers after launching the Tutor Quadrant Dashboard earlier this year – the ability to see results for students in all areas of their involvement at school).
- A gender scorecard, simply showing the number of males/females in the class.
- Bar graph (ranked) showing students by GCA, highest to lowest, for the subject taught by the teacher in the current year.
- Bar graph (ranked) showing the same students but their previous year GCA across all subjects, again ranked highest to lowest.
- A table giving a breakdown of the students in the class and their GCA in individual subjects. This is helpful if a teacher wanted to compare how a student was doing in a similar subject, e.g. an English teacher seeing if a student was performing comparably in other literacy-intensive subjects such as History.
Comparative Scores:
This was perhaps one of the most complex and ambitious pages to put together, as it potentially combines academic data from Yr9 Entrance Testing, PAT results (Yr9-10), MidYis results (Yr9-11) and NCEA data by GCA. Additionally, it needed to give a breakdown of priority learners based on identified learning needs as well as ethnicity.
The real challenge was thrown down by Mr McKenzie when he said in an ideal world he would like a teacher to be able to select from any of the historical data and have it displayed on the same graph. We explored a wide range of ideas on how we could best implement this vision and in the end the following is what was achieved:
Showing the results for a Yr13 Calculus class; on the left is the students’ Yr9 English Entrance testing and on the right their Yr13 Calculus GCA (click to enlarge)
Visually, there is a lot going on in this report and it will take the user quite some time to fully understand how best to extract what they are looking for. For this reason, all pages on these reports have user guides in text boxes and we have labelled each selection field numerically in the order that a teacher should select their data. This helps guide them through the process. In the left hand screenshot above (click to enlarge) I have added red numbers to highlight features of this report:
- The academic “score type” and “sub-score type” the teacher wants to see. If a teacher chose Yr9 PAT then the sub-score type would automatically display what options were available (i.e. English, Maths and Vocabulary). Similarly, if a teacher chose GCA as the score type they could choose the GCA for whatever subject they wished to check. The recent addition of search boxes in PowerBI makes this process far easier to manage when there are a lot of options to choose from.
- Priority Learners – this is still being developed, but for now it highlights any students with data recorded in Synergetic, from diagnosis through to strategies to use in the classroom to support their learning.
- Ethnicity breakdown for the students in the class displayed in a pie chart and table below, along with the names of Māori and Pasifika students in the two boxes in the bottom right of the report.
- The bar chart that shows the students ranked by whichever score type the teacher has selected. Note that there are no axes on this graph – a necessary omission, given the academic data does not always share identical measures/scores. However, by placing the cursor over a student you can easily see their score, e.g. a stanine for a PAT test, or a two decimal place GCA score for NCEA results. Additionally, there are visual cues on this graph that further help identify students with listed learning support needs or who identify as Māori or Pasifika.
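Under the hood, the simplest way to support “chart any score type against any other” is to fold every measure into one long score table, keyed by score type and sub-score type. A rough sketch of the idea (the source data frames and column names are assumptions, not our actual warehouse schema):

```python
# Sketch: fold different assessment sources into one long "score" table so any
# score type / sub-score type can be selected and charted in the same visual.
# Input frames and column names are assumptions for the example.
import pandas as pd

def build_score_table(entrance: pd.DataFrame, pat: pd.DataFrame, gca: pd.DataFrame) -> pd.DataFrame:
    """entrance/pat: StudentID plus one column per sub-score; gca: StudentID, SubjectCode, GCA."""
    frames = [
        entrance.melt(id_vars="StudentID", var_name="SubScoreType", value_name="Score")
                .assign(ScoreType="Yr9 Entrance"),
        pat.melt(id_vars="StudentID", var_name="SubScoreType", value_name="Score")
           .assign(ScoreType="PAT"),
        gca.rename(columns={"SubjectCode": "SubScoreType", "GCA": "Score"})
           .assign(ScoreType="GCA")[["StudentID", "SubScoreType", "Score", "ScoreType"]],
    ]
    return pd.concat(frames, ignore_index=True)
```

The report’s two selectors then simply filter this table on ScoreType and SubScoreType, which is also why measures with different scales (stanines, GCAs and so on) can share one ranked bar chart without a common axis.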
A reminder that all of this data refreshes automatically each night so the teacher is always seeing the latest information on their students. Should a student leave/join the class the data is refreshed to reflect this.
NCEA Results Analysis By Standard:
One of the most requested features by the Senior Leadership Group and Heads of Department at St Andrew’s is an easy way to compare, standard by standard, how our students and teachers performed compared to similar schools around New Zealand (“similar schools” has been defined as Decile 8-10). One of the challenges has been getting access to neatly formatted data that contains all NCEA standards, not just the individual results that can be downloaded from the NZQA website.
After working with NZQA’s statistics team, we have been able to obtain this data and run it through our ETL process into the data warehouse, thus allowing this comparison to be easily done by classroom teachers.
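Once the NZQA data is in the warehouse, the comparison itself reduces to a percentage-by-grade calculation at three levels: national (Decile 8-10), school cohort and class. A minimal sketch, with hypothetical column names:

```python
# Sketch: percentage of results at each NCEA grade for one standard, at
# national (Decile 8-10), cohort and class level. Column names are hypothetical.
import pandas as pd

GRADES = ["Not Achieved", "Achieved", "Merit", "Excellence"]

def grade_distribution(results: pd.DataFrame, level: str) -> pd.Series:
    """results columns (assumed): StandardCode, Grade, Level ('National'/'Cohort'/'Class')."""
    subset = results.loc[results["Level"] == level, "Grade"]
    counts = subset.value_counts().reindex(GRADES, fill_value=0)
    return (counts / counts.sum() * 100).round(1).rename(f"{level} %")

def compare_standard(results: pd.DataFrame, standard: str) -> pd.DataFrame:
    one_standard = results[results["StandardCode"] == standard]
    return pd.concat(
        [grade_distribution(one_standard, lvl) for lvl in ("National", "Cohort", "Class")],
        axis=1,
    )
```

Each row of the resulting table corresponds to one of the grade segments in the horizontal bar charts described below.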
Again, a classroom teacher would select a class they teach, and then narrow it down to an NCEA standard they wished to compare by following the numerical workflow selections on the left-hand side of the report. Once completed, this presents the four horizontal bar charts that show:
- Top left = all students being compared: the top bar is comparative schools nationally (all students who sat this NCEA standard in Decile 8-10 schools), the middle bar is the performance of the St Andrew’s cohort (in this case all other Yr12 History students taught by all teachers), and the bottom bar is the performance of the students in this teacher’s class.
- Bottom left = Performance of Māori/Pasifika students (again broken down by national data, cohort and individual classroom teacher).
- Top right = male students.
- Bottom right = female students.
The results for these standards can be filtered to show either internal assessments only, or formative assessment results for external exams not yet sat, giving students a comparison against the national data for that external standard from the previous year. This could work as a motivator for them before their external exams.
The red numbers in the screenshot are:
- Search box for the teacher to select the class code they want to analyse (again, searching makes this really easy). There are two pre-selected options visible, which are the previous year’s national data and the StAC cohort data. A teacher could, in theory, turn these off if they simply want to display only their own class results and not compare them.
- Once a class is selected, this table automatically shows only standards that have a result recorded in the Synergetic database. This helps a teacher know which standard number to search for.
- Using the knowledge above, the teacher searches for the standard they want to analyse e.g. “HIST2” would show all Level 2 history standards allowing a teacher to quickly click through their results.
- The comparative graphs (as explained above). One of the neat features of this is if a teacher wanted to drill down and see which students in their class gained a certain result, they need only click the result and the list of students in the table filters immediately:

By clicking the silver “Merit” grade in the bottom-right graph (females), the table at the bottom filters to show the name of the student(s), allowing a teacher to quickly search through student names by result.
Detailed NCEA Results By Standard:
This final report is another one designed to quickly profile the range of ability of the students a teacher sees. However, it also delivers on one of the other most common requests from teachers, e.g. “I want to know how my Level 3 Geography students did in Level 2 Geography at the start of the year, or in an internal assessment, so I can better differentiate the teaching to meet their needs.” To date, we have struggled to graphically display a ranked past/present comparison tool for teachers, and the security relationships are actually quite complex (just because you’re teaching a student for Level 3 Geography, for instance, does not mean you were their Level 2 Geography teacher).
This has now been displayed in the following reports:
Showing the results for a Yr13 Geography class internal assessment 3.3 (91428) on the left; on the right is the students’ performance from the previous year for the internal assessment 2.3 (91242). (click to enlarge)
These reports contain a number of visual cues. In keeping with all our NCEA reporting in PowerBI, the colour coding is consistent: Gold = Excellence; Silver = Merit; Bronze = Achievement; Red = Not Achieved. Additionally, the bars are varied in height and ranked highest to lowest, allowing a teacher to very quickly pick up the grade spread of their class at a glance. The red numbers in the screenshot on the left (click to enlarge) are:
- The teacher selects the NCEA standard they wish to analyse
- They select which of their classes they wish to filter by (many of our senior teachers teach two classes of the same year level/subject, so this is helpful). The list of classes is pre-populated automatically based on the username the teacher signs in with, making this a very simple process.
- The bar chart orders the students by result, highest to lowest (as explained above).
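At the data level, the past/present pairing described above is just a join between this year’s class roll and last year’s results for the related standard, regardless of who taught the student then. A minimal sketch, with hypothetical table and column names:

```python
# Sketch: attach last year's result in a related standard to each student in
# the teacher's current class, regardless of who taught them previously.
# Column names are hypothetical.
import pandas as pd

def past_present(current_roll: pd.DataFrame,
                 prior_results: pd.DataFrame,
                 prior_standard: str) -> pd.DataFrame:
    """
    current_roll:  StudentID, StudentName, ClassCode  (this year's class)
    prior_results: StudentID, StandardCode, Grade     (previous year, any teacher)
    """
    prior = prior_results.loc[prior_results["StandardCode"] == prior_standard,
                              ["StudentID", "Grade"]]
    return (current_roll.merge(prior, on="StudentID", how="left")
                        .rename(columns={"Grade": "PriorGrade"}))
```

One way to keep the security model manageable is to key the row-level filter on current class membership rather than on who taught the standard last year – which is the complexity referred to earlier.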
Concluding Thoughts:
As evidenced above, a huge amount of work and effort has gone into these reports, and they certainly represent the progression of thought over the last few years in terms of the key data we need to be able to provide to classroom teachers. A key objective of this analytics project at St Andrew’s is to provide easy access to the data for teachers on an “anytime, anywhere” basis and for it to be easily comprehensible.
As more teachers start to use these reports on a regular basis I anticipate feedback will flow and new feature requests will emerge. The beauty of the current setup is that we can release this version of the reporting to teachers and then easily add new features, which become automatically available the next time they log in – there is no need for the teacher to update or install new files. To further support teachers, we are now embedding a “Tour of the Dashboard” video into the landing page of each new report:

One of the great things about being browser-based is the ability to embed third-party content, in this case a YouTube video explaining to teachers how they can use this new report.
These embedded videos mean that should teachers forget how to use the report, or are new to the College, they can essentially “self-train” on how they can use the report with their classes.
I am genuinely excited about this level of reporting and the benefits it will have not just for our teachers, but for our students too!