Using Analytics To Profile Classes

A 12-minute guided tour of the new Class Profile reports in PowerBI

Over the last twelve months St Andrew’s College has invested significant resourcing into developing web-based reporting tools delivered via Microsoft’s PowerBI.com interface, used by teachers, managers and administration staff alike. It’s worth reading some earlier posts about the rationale behind this path and the features it has delivered; however, some of the key reasons we settled on this solution include:

  • It’s browser based – you can access it from “anywhere” and see live data. You can also bookmark certain reports in your browser for near instant access.
  • There is also an app available (iPhone/iPad/Android/Windows 10), so the data is accessible anytime, anywhere.
  • We can tweak reports and visuals quickly and easily, based on feedback from stakeholders.
  • Being browser based, you don’t need a local file on your computer that is “out of date” once a new version with improved features is built. What you see is always the “latest version”
  • It’s part of our existing Office365 Suite, so our existing username/password logs you into the reports.
  • Security permissions are centrally managed based on AD users and role-based groups, including the use of SQL Row-Level Security (a minimal sketch of this follows the list below).
  • It connects to our on-premises MS-SQL Server, allowing for scheduled data updates (hourly/daily).
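
For anyone curious how that row-level security can be wired up on the SQL side, here is a minimal T-SQL sketch. The table, column and role names (dbo.ClassResults, TeacherCode, AcademicDeans) are hypothetical placeholders rather than our actual schema; the pattern of an inline predicate function plus a security policy is SQL Server’s Row-Level Security feature.

```sql
-- Minimal Row-Level Security sketch (table/column/role names are hypothetical).
-- Each row in the fact table carries the code of the teacher who owns that class.

-- Inline predicate: a row is visible only when its teacher code matches the
-- signed-in database user, or when the user belongs to a wider "deans" role.
CREATE FUNCTION dbo.fn_TeacherFilter (@TeacherCode AS nvarchar(20))
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS AllowRead
    WHERE @TeacherCode = USER_NAME()        -- full-time classroom teachers
       OR IS_MEMBER('AcademicDeans') = 1;   -- deans/managers see a wider set

-- Attach the predicate to the fact table so every query is filtered automatically.
CREATE SECURITY POLICY dbo.TeacherFilterPolicy
    ADD FILTER PREDICATE dbo.fn_TeacherFilter(TeacherCode) ON dbo.ClassResults
    WITH (STATE = ON);
```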

Throughout Term 3 the team have been focused on delivering a new set of reports for Mr Dean McKenzie, one of the Assistant Principals at the College, with responsibilities for Data Analysis. He had provided some concept designs for how he would like the reports to look, along with the location of the majority of the data in our Student Management System (Synergetic). Additionally, there had been changes to how the Grade Credit Average (GCA) was going to be calculated moving forward, which would see individual subjects have a GCA calculated for the first time, along with more rigid definitions of how various credits would be counted.

All of this logic had to be encoded into the ETL process that transferred the raw data from Synergetic’s MS-SQL database and into our Data Warehouse, automatically calculating the results on a daily basis and making them available to staff via the web interface of PowerBI. The end result is the following pages in a single report:
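
As a rough illustration of the kind of logic the ETL applies, here is a hedged T-SQL sketch of a per-subject GCA calculation. The table and column names, and the grade weightings shown (Excellence = 4, Merit = 3, Achieved = 2, Not Achieved = 0), are illustrative assumptions only; the actual credit-counting rules are those defined by Mr McKenzie and encoded in the warehouse.

```sql
-- Illustrative nightly calculation of a per-subject Grade Credit Average (GCA).
-- Table names, columns and grade weightings are assumptions for this sketch only.
WITH WeightedResults AS (
    SELECT
        r.StudentID,
        r.SubjectCode,
        r.AcademicYear,
        r.Credits,
        CASE r.Grade
            WHEN 'E' THEN 4   -- Excellence
            WHEN 'M' THEN 3   -- Merit
            WHEN 'A' THEN 2   -- Achieved
            ELSE 0            -- Not Achieved
        END AS GradePoints
    FROM dbo.StandardResults AS r
    WHERE r.ResultStatus = 'Final'   -- only credits that are allowed to count
)
SELECT
    StudentID,
    SubjectCode,
    AcademicYear,
    CAST(SUM(Credits * GradePoints) AS decimal(9, 2))
        / NULLIF(SUM(Credits), 0) AS SubjectGCA
FROM WeightedResults
GROUP BY StudentID, SubjectCode, AcademicYear;
```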

Subject GCAs Per Student:

 Showing the results for a student in the current year and the previous year (click to enlarge)

This report is designed to allow a teacher to quickly select a student in their class and compare their GCA subject by subject, along with seeing how they performed the previous year. If you click the left-hand image above to enlarge it, you will see numbers which represent:

  1. A selector for the current or previous year of GCA data for a student
  2. The teacher code (for full-time classroom teachers this is automatically locked to their username, meaning they only see the students in their classes; Academic Deans and managers can see across a wider set of students).
  3. A year level filter, allowing a teacher to quickly narrow the selection of students down by year level, e.g. their Yr12 Maths students or Yr13 History students.
  4. The list of students taught by the teacher in the selected year level, arranged alphabetically. Note these are colour coded pink/blue to give a visual cue to the teacher if they are looking for a male/female student in their class.
  5. A table showing each subject taken by the selected student, and their GCA (either current year or previous year depending on selection in #1 above)
  6. A bar graph visually displaying the same data as #5 but designed to quickly identify subjects of particular strength or weakness for the selected student. Note that the subjects are listed alphabetically and not by highest GCA to lowest, allowing for a “cityscape” effect.
  7. The name of the current student that is selected and the class code of the teacher who is browsing the report (useful if a teacher happens to teach a student a number of different classes).

The aim of this report is to allow a classroom teacher to quickly scan through the students in their class and identify their relative strengths/weaknesses in different subjects. It also enables them to answer a common question from teachers: “I’ve a student who I think is underperforming in my class – how are they doing in other classes?”

GCA – Then and Now:

gca-then-and-now

This report allows a teacher to quickly see the individual students in their class ranked by GCA from highest to lowest, and compare the current year GCA in the teacher’s subject with the student’s overall GCA from the previous year. This allows a teacher, at a glance, to see who their best-performing students are based on completed assessment, but also to pick up if there is significant variance between previous and current performance.

In the above example, the top bar graph shows the 4th ranked student in the class (in pink) was actually the 6th ranked student (relative to the class) the previous year. Whilst this is a very small sample size, what this can show is a student who is possibly underperforming or showing improved performance relative to the students in their class – all helpful information for a teacher to consider.

The red numbers in the report are:

  1. Showing the classes taught by the logged in teacher. Note that this also includes co-curricular options that the teacher coaches/manages, allowing them to review academic performance for all students that they have contact time with (this was actually the #1 request we had from teachers after launching the Tutor Quadrant Dashboard earlier this year – the ability to see results for students in all areas of their involvement at school).
  2. A gender scorecard, simply showing the number of males and females in the class.
  3. Bar graph showing students ranked by GCA, highest to lowest, for the subject taught by the teacher in the current year.
  4. Bar graph showing the same students ranked by their previous year GCA across all subjects, again highest to lowest.
  5. A table giving a breakdown of the students in the class and their GCA in individual subjects. This is helpful if a teacher wanted to compare how a student was doing in a similar subject, e.g. an English teacher seeing if a student was performing comparably in other literacy-intensive subjects such as History.

Comparative Scores:

This was perhaps one of the most complex and ambitious pages to put together, as it potentially combines academic data from Yr9 Entrance Testing, PAT results (Yr9-10), MidYIS results (Yr9-11) and NCEA data by GCA. Additionally, it needed to give a breakdown of priority learners based on identified learning needs, as well as ethnicity.

The real challenge was thrown down by Mr McKenzie when he said in an ideal world he would like a teacher to be able to select from any of the historical data and have it displayed on the same graph. We explored a wide range of ideas on how we could best implement this vision and in the end the following is what was achieved:

 Showing the results for a Yr13 Calculus class; on the left is the students’ Yr9 English Entrance testing and on the right their Yr13 Calculus GCA (click to enlarge)

Visually, there is a lot going on in this report and it will take the user some time to fully understand how best to extract what they are looking for. For this reason, all pages in these reports have user guides in text boxes, and we have labelled each selection field numerically in the order that a teacher should select their data. This helps guide them through the process. In the left-hand screenshot above (click to enlarge) I have added red numbers to highlight features of this report:

  1. The academic “score type” and “sub-score type” the teacher is wanting to see. If a teacher chose Yr9 PAT then the sub-score type would automatically display what options were available (i.e. English, Maths and Vocabulary). Similarly, if a teacher chose GCA as the score type they could choose the GCA for whatever subject they wished to check. The recent addition of search boxes in PowerBI makes this process far easier to manage when there are a lot of options to choose from.
  2. Priority Learners – this is still being developed, but for now it highlights any students with data recorded in Synergetic, from diagnosis through to strategies to use in the classroom to support their learning.
  3. Ethnicity breakdown for the students in the class displayed in a pie chart and table below, along with the names of Māori and Pasifika students in the two boxes in the bottom right of the report.
  4. The bar chart that shows the students ranked by whichever score type the teacher has selected. Note that there are no axes on this graph, a necessary compromise given the academic data does not always share identical measures/scores. However, by placing the cursor over a student you can easily see their score, e.g. a stanine for a PAT test, or a two-decimal-place GCA score for NCEA results. Additionally, there are visual cues on this graph that further help identify students with listed learning support needs or who identify as Māori or Pasifika. (A rough sketch of how such mixed scores can be stored in one table follows below.)
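
One way to make such dissimilar measures (stanines, standardised test scores, two-decimal GCAs) selectable on a single ranked chart is to store them in one “long” score table keyed by score type and sub-score type. The sketch below is a hypothetical illustration of that shape, not the actual warehouse design:

```sql
-- Hypothetical "long" score table: every historical measure for a student sits in
-- one structure, keyed by score type and sub-score type. Names are placeholders.
CREATE TABLE dw.AcademicScores (
    StudentID    int           NOT NULL,
    AcademicYear smallint      NOT NULL,
    ScoreType    nvarchar(40)  NOT NULL,  -- e.g. 'Yr9 PAT', 'MidYIS', 'GCA'
    SubScoreType nvarchar(60)  NOT NULL,  -- e.g. 'Maths', 'Vocabulary', a subject code
    ScoreValue   decimal(9, 2) NOT NULL,  -- stanine, standardised score, or GCA
    ScoreDisplay nvarchar(20)  NULL       -- formatted value shown on hover
);

-- The report filters to one ScoreType/SubScoreType pair and ranks the class,
-- which is why the bar chart can do without a shared numeric axis.
SELECT StudentID, ScoreValue, ScoreDisplay
FROM dw.AcademicScores
WHERE ScoreType = N'Yr9 PAT'
  AND SubScoreType = N'Vocabulary'
ORDER BY ScoreValue DESC;
```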

A reminder that all of this data refreshes automatically each night so the teacher is always seeing the latest information on their students. Should a student leave/join the class the data is refreshed to reflect this.

NCEA Results Analysis By Standard:

One of the most requested features by the Senior Leadership Group and Heads of Department at St Andrew’s is an easy way to compare, standard by standard, how our students and teachers performed compared to similar schools around New Zealand (“similar schools” has been defined as Decile 8-10). One of the challenges has been getting access to neatly formatted data that contains all NCEA standards, not just the individual results that could be downloaded from the NZQA website.

After working with NZQA’s statistics team, we have been able to obtain this data and run it through our ETL process into the data warehouse, thus allowing this comparison to be easily done by classroom teachers:

ncea-analysis

Again, a classroom teacher would select a class they teach, and then narrow it down to an NCEA standard they wished to compare, by following the numerical workflow selections on the left-hand side of the report. Once completed, this presents the four horizontal bar charts that show (a rough sketch of the underlying aggregation follows the list):

  • Top left = all students being compared: the top bar is comparative schools nationally (all students who sat this NCEA standard in Decile 8-10 schools); the middle bar is the performance of the St Andrew’s cohort, in this case all other Yr12 History students taught by all teachers; the bottom bar is the performance of the students in this teacher’s class.
  • Bottom left = Performance of Māori/Pasifika students (again broken down by national data, cohort and individual classroom teacher).
  • Top right = male students.
  • Bottom right = female students.
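
To give a sense of the aggregation behind those three bars, here is a hedged T-SQL sketch that builds a grade distribution for one standard at the national (Decile 8-10), College cohort, and single-class levels. All table and column names, and the sample values, are placeholders; the real warehouse schema built from the NZQA extract will differ.

```sql
-- Hypothetical grade distribution for one NCEA standard at three comparison levels.
-- Table/column names and the sample values are placeholders for this sketch only.
DECLARE @StandardNumber int = 90000,          -- placeholder standard number
        @ClassCode nvarchar(20) = N'12HIS-A'; -- placeholder class code

SELECT N'National (Decile 8-10)' AS CohortLevel, Grade, SUM(ResultCount) AS Results
FROM dw.NzqaNationalResults
WHERE StandardNumber = @StandardNumber
GROUP BY Grade

UNION ALL

SELECT N'St Andrew''s cohort', Grade, COUNT(*)
FROM dw.StudentStandardResults
WHERE StandardNumber = @StandardNumber
GROUP BY Grade

UNION ALL

SELECT N'Teacher''s class', Grade, COUNT(*)
FROM dw.StudentStandardResults
WHERE StandardNumber = @StandardNumber
  AND ClassCode = @ClassCode
GROUP BY Grade;
```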

The results for these standards can be filtered to show either internal assessments only, or formative assessment results for external exams that have not yet been sat, giving students a comparison against the national data for that external standard from the previous year. This could work as a motivator for them before their external exams.

The red numbers in the screenshot are:

  1. Search box for the teacher to select the class code they want to analyse (again, searching makes this really easy). There are two pre-selected options visible, which are the previous year’s national data and the StAC cohort data. A teacher could, in theory, turn these off if they simply want to display only their own class results and not compare them.
  2. Once a class is selected, this table automatically shows only standards that have a result recorded in the Synergetic database. This helps a teacher know which standard number to search for.
  3. Using the knowledge above, the teacher searches for the standard they want to analyse e.g. “HIST2” would show all Level 2 history standards allowing a teacher to quickly click through their results.
  4. The comparative graphs (as explained above). One of the neat features of this is if a teacher wanted to drill down and see which students in their class gained a certain result, they need only click the result and the list of students in the table filters immediately:
filtering-ncea

By clicking the silver “Merit” grade in the bottom right graph (females), the table at the bottom filters to show the name of the student(s), allowing a teacher to quickly search through student names by result.

Detailed NCEA Results By Standard:

This final report is another one designed to quickly profile the range of ability of the students a teacher sees. However, it also delivers on one of the other most common requests from teachers, e.g. “I want to know how my Level 3 Geography students did in Level 2 Geography at the start of the year, or in an internal assessment, so I can better differentiate the teaching to meet their needs.” To date, we have struggled to graphically display a ranked past/present comparison tool for teachers, and the security relationships are actually quite complex (just because you’re teaching a student for Level 3 Geography, for instance, does not mean you were their Level 2 Geography teacher).

This has now been displayed in the following reports:

 Showing the results for a Yr13 Geography class internal assessment 3.3 (91428) on the left; on the right is the students’ performance from the previous year for the internal assessment 2.3 (91242). (click to enlarge)

These reports contain a number of visual cues. In keeping with all our NCEA reporting in PowerBI, the colour coding is consistent: Gold = Excellence; Silver = Merit; Bronze = Achievement; Red = Not Achieved. Additionally, the bars vary in height and are ranked highest to lowest, allowing a teacher to pick up the grade spread of their class at a glance. The red numbers in the screenshot on the left (click to enlarge) are:

  1. The teacher selects the NCEA standard they wish to analyse
  2. They select which of their classes they wish to filter by (many of our senior teachers teach two classes of the same year level/subject, so this is helpful). The list of classes is pre-populated automatically, based on the username the teacher signs in with, making this a very simple process.
  3. The bar chart orders the students by result, highest to lowest (as explained above).

Concluding Thoughts:

As evidenced above, a huge amount of work and effort has gone into these reports, and they certainly represent the progression of thought over the last few years about what key data we need to provide to classroom teachers. A key objective of this analytics project at St Andrew’s is to provide teachers with easy access to the data on an “anytime, anywhere” basis, and for it to be easily comprehensible.

As more teachers start to use these reports on a regular basis I anticipate feedback will flow and new feature requests will emerge. The beauty of the current setup is that we can release this version of the reporting to teachers and then easily add new features, which become available automatically the next time they log in – there is no need to update or install new files. To further support teachers, we are now embedding a “Tour of the Dashboard” video into the landing page of each new report:

dashboard-tour

One of the great things about being browser-based is the ability to embed third-party content, in this case a YouTube video explaining to teachers how they can use this new report.

These embedded videos mean that should teachers forget how to use the report, or are new to the College, they can essentially “self-train” on how they can use the report with their classes.

I am genuinely excited about this level of reporting and the benefits it will have not just for our teachers, but for our students too!

 

Guest Post: The Ideal Setup For A School Recording Studio


This is a Guest Post from our Head of the Music Department, Mr Duncan Ferguson, who is also an Apple Distinguished Educator for 2015. He has posted here before about the integration of technology and music as well as project based learning approaches in Music. You can read the original post here.

Last year I was lucky enough to be granted the Head of Independent Schools Scholarship Trust award. This enabled me to travel to San Francisco and NYC to study how Music Technology is successfully being incorporated into high schools.

As a result of this study I have produced a document called The Music Educators Technology Survival Guide. This is a free download that takes you through the recommended equipment required to set up a music technology programme in your high school. It also provides an overview of the requirements for the NZQA Unit Standards, which you may use to assess your students’ music technology skills.

However, it’s one thing to have all the gear for teaching music technology; I’ve found the physical makeup of your studio/recording/mixing spaces is critical to student success.

Of course, the quality of the acoustics in your recording space(s) is one of the most important factors, but unless you’re involved in a new build of your department there may not be a huge amount you can do (whatever you do, don’t put egg cartons on your walls; they will only make things worse!).

But if you are lucky enough to be planning a new setup, this is what I recommend you aim for when you’re trying to record a rock band.

Recording Room Setup

Band recording in one room

Some important things to note:

  • All the musicians (apart from the singer) are recording in the same room at the same time but the only instrument that is actually mic’ed up in the recording room is the drum kit.
  • The guitar signal is recorded via a DI box and then output to an amplifier in a separate ‘amp’ room (using a specialised reamp device). The guitar amp is mic’ed up with one or two mics and those signals are then returned to the recording system. The guitar amp signal is then fed back to the musicians via headphones.
    Guitar Signal Flow
  • The bass player is recorded via a DI box with the signal returned to the musicians’ headphones. The bass track usually sounds great if you have a good quality DI (like a Radial JDI), but if you need to reamp it later and/or overdub, this is also an option.
  • The singer is recorded in the mixing (or other) room with their signal coming back to the musicians’ headphones. If the quality of the singer’s track is not good enough they can be overdubbed later.

Why does this setup work so well?

Generally, high school students are not going to be good enough to record to a click track and retain a good feel, and they’re also not great at overdubbing instruments one by one. So this setup allows them to play all together as they would in a normal rehearsal room, hopefully creating a great groove.

But with our multi-room setup (i.e. having an amp room) we are able to record each instrument onto isolated tracks in our DAW, so if one musician makes a minor mistake you don’t have to stop the take, as you would if you had the amps in the same room as the drum microphones. Any minor mistakes can be cut out and re-recorded (or inserted from another take) just by the musician that made them, without forcing the whole band to do another take.

Having all instruments on isolated tracks (without any ‘bleed’ from the other instruments in their tracks) allows us to fix timing and pitch issues with software like Celemony Melodyne.

On a recent session the bass player had huge trouble locking in with the drums. If the band had recorded to a click track it would be easy to ‘quantize’ the bass audio to the grid, but as I said before, most high school bands aren’t good enough to record to a click well.

But using the new version of Melodyne 4 you can easily generate a ‘tempo map’ of the performance (most likely using the drum kit as your timing reference), which you can then quantize the bass to, bringing the two musicians perfectly in time with each other (even though they didn’t record to a click). I’ll do a full review of this software and walk through this process in a future post.

If you want hands-on, practical help with understanding how to create a recording setup like this, I’m running workshops for teachers – Learning Ideas Teacher Training.

Technology Enables Efficiency in English Marking

 

For me it’s like the one stop shop … go to OneNote, open up their page, have they done it? Yes? No? Give them feedback. Sync it. Sorted!

Ms Coote describes her new workflow for marking student work

I sat down today and chatted with our English Head of Department, Ms Helaina Coote, about how using a Surface Pro 3 and Microsoft OneNote was impacting on her teaching and assessment practices for A.S. 91106: Form developed personal responses to independently read texts, supported by evidence.

The entire 15-minute conversation is in the video above (recorded using Office Mix on her Surface Pro 3), but you can skip to a few relevant sections by using the hyperlinks below:

IMPACT ON WORKFLOWS:

This year, for the first time, Ms Coote is using Microsoft OneNote with all her English classes and this has been made easier with the introduction of the OneNote Class NoteBook, where each student has their own tab (section group in OneNote). The ability to easily receive, mark, and return feedback to students has been massive:

Whilst this has not changed the way I teach AS91106, it has completely transformed how I manage the assessment practices, allowing me to streamline the feedback I am giving to students.

With students in her senior English classes required to read, listen to, watch and respond to up to six different texts across three different terms at school, this historically created a lot of paperwork to manage. In this sense the technology has impacted “massively” on the speed of getting work marked and back to students.

Handwritten feedback for students in OneNote via a Surface Pro 3

Previously, work was typically received via email; Ms Coote would use Microsoft Word to insert comments or track changes, save a copy locally, print a copy for NZQA records, and then email the revised copy back to the student with feedback.

“There was like triple handling”

Now, students must submit their drafts via OneNote and, after receiving their feedback from Ms Coote, have two days to develop a resubmission. These changes must be colour coded so she can easily see the differences. I asked her if using a digital pen was in some ways a return to traditional ways of marking, and she commented:

The Surface Pro 3 and the digital pen allows you to blend the “old school” with the “new.” I am still a teacher marking student submissions, but now I am using a digital pen and writing on an electronic submission. Furthermore, the feedback is literally real time – I do not even need to email it back to them.

STUDENT FEEDBACK:

As more and more electronic mediums are introduced into teaching, some senior students have pushed back on the increased visibility (and thus accountability) their teachers now have of their work. I specifically asked Ms Coote how her senior students were finding this method of submission and marking:

Student feedback has been overwhelmingly positive in terms of them receiving “written” feedback [via OneNote] … I’m able to do it much faster as well, so the pieces of paper don’t lie around on my desk for ages … it’s pretty immediate, as it’s a much more streamlined process.

Importantly, however:

Ultimately, the onus is on the student to make sure their work is in the OneNote NoteBook

Ms Coote asked a number of students for some feedback on how they are finding OneNote for their classes, and one student, Angus, noted:

[Since the introduction of Class NoteBooks in OneNote] I have found it incredibly useful … I no longer have to lug around books or hand outs as it is all available on OneNote and all stored in one handy place. All my work and handouts are readily available whether I’m at home or at school its all there and backed up for when i need it. My teacher can now give me feedback on my work on OneNote using her Surface Pro 3 and she can even hand write on it … I can see it instantly and then make new adaptions to my work hassle free as the interface on OneNote is so easy to use.

SUMMARY:

It’s often tempting to focus on the way technology is impacting on the lives of our students and therefore I find it refreshing to hear teachers enthusiastically talking about how some of the routine aspects of teaching, such as marking, are being made easier through technology.

It is intriguing that in this example it really was the naturalness of “writing” the feedback (albeit digitally on a tablet) that appealed to both the teacher and students. I read an ICT report recently that suggested that by 2018, 50% of portable “laptops” sold will be hybrids that can be touched and written on, like the Surface Pro 3 that Ms Coote is using.

It is a timely reminder that many of the established practices of teaching often need only minor tweaks to achieve optimum efficiency, rather than massive overhauls through major technological change.

Teaching The Teachers: St Andrew’s College Staff Delivering Professional Development

The end of the academic year is a busy time in most schools, but also a time when many teachers engage in professional development. It is no different at St Andrew’s College, where three-hour, small-group sessions were organised for all secondary teaching staff as a refresher on eLearning. This professional development was run by Arnika Brown, an eLearning Integrator from Cyclone Computers who has previously worked with teachers in our Preparatory School.

This year has also seen increasing requests from other schools and organisations for our teachers to deliver professional development in the area of eLearning and technology use in the classroom. An example of this came earlier this week, when our Assistant Head of English, Ms Tam Yuill Proctor, was invited to deliver a keynote at the Dunedin English Big Day Out conference. The title of her keynote was “Putting the ‘E’ of E-Learning into Teaching and Learning” and, as a summary reflection of the presentation, she recorded a terrific Office Mix overview (click the slide below to view):

Tam Office Mix

The topics covered in the keynote included:

  • Teaching and Learning: knowledge building, learning communities, practice
  • Office365: Office Mix, OneNote and OneDrive
  • Inquiry Learning: putting it into action with Year 10
  • Blogs and Twitter: effective use of these in professional development.

Other examples of our staff delivering or facilitating professional development in eLearning recently include:

It is excellent to see teachers from St Andrew’s College being invited to share their expertise and experience with the wider teaching community, as it highlights the value our own students are receiving in their tuition. As well as requests to speak at events, the College has hosted numerous staff from other schools on visits to see eLearning in action in our classrooms.

To build on this momentum, St Andrew’s has created a new position, starting in 2015, called eLearning Integrator. This role will focus on supporting innovation and best practice in eLearning amongst our teachers, and I am confident this will lead to even more teachers being asked to speak at future events.