Reflections from (near) the top of the SAMR Ladder

Earlier this year I blogged about my attempts to integrate a SAMR ladder effectively into a unit of work for Year 13 Geography. A couple of weeks ago my students sat their school examinations – the results of which were the first indication of the success, or otherwise, of this approach. Below are my final thoughts:

Having completed the teaching, and assessed student performance in this standard, it is an opportune time to reflect on whether the teaching changes implemented in this inquiry have been effective. This evidence will take two forms: first, my own reflection; and secondly, student feedback in the form of a short survey.

The original aim of this inquiry was to implement a clear SAMR ladder approach to the teaching of this unit. This happened to a certain extent, though unfortunately (as I think was perhaps to be expected) when the pressure of examinations arrived, it was the redefinition task that was neglected.

Positives: The Spatial Variation component of this unit was particularly successful. Because students spent much less (almost zero) class time copying down notes, I found that I could dedicate a week of class time to the activity. It was a great way to allow students to discover the content for themselves. I blogged about the success of this teaching here. In the student survey the class was asked “During the teaching of 3.2 you used Google Earth to investigate the Spatial Variation of TD in Queenstown. How effective was that activity?” Student responses are below:

  • very
  • Because we could keep this for our final exams and keep referring back to it
  • Yes it was very helpful for providing a physical representation
  • Was very effective. Helped to remember where everything was located
  • very effective
  • Really effective!!!!
  • very helpful visualises a better picture of qtown
  • it allowed us to clearly see how attractions and features were dispersed through the Queenstown region
  • yes as it is going to help me with my external study
  • Somewhat helpful, was good seeing the content at the time making it although have not found it incredibly useful in revision.
  • Very effective, clearly showed us the spatial variations
  • Effective, but I felt like if I got behind or missed one listen I would struggle to catch up.
  • Very effective

I think that this is very good feedback and shows the effectiveness of this activity. I was also encouraged that the other two Year 13 Geography teachers invited me into their classes to run the same activity, and they too gave positive feedback about its effectiveness.

Impact on assessment results:

The original aim of this inquiry was twofold. On the one hand I wanted to more deliberately implement the SAMR model as a ladder during a whole topic. As discussed above I believe that I partially fulfilled this aim.

Secondly, this standard has traditionally been one with high numbers of SNA (Standard Not Attempted) and N grades in both school and NCEA examinations. This was something I wanted to address. By incorporating the SAMR ladder more deliberately, I was hoping to engage students in the content, and have more time to actively prepare them to achieve to their potential.

Obviously, it is always difficult, and potentially misleading, to compare academic results across years: the dynamics, prior experience, and academic ability of each cohort are different, and in most cases it is hard to establish a clear control group. Despite this, I feel that my students have performed well so far, even when compared to the PEP (Profile of Expected Performance) for the standard.

[Image: school examination results for this standard]

Although the Merit and Excellence grade numbers are lower than I would like, I feel very positive that there were no students who either did not attempt, or did not Achieve, the paper. It is these students who most often choose to SNA the paper – so having them taste some success in the school examination should hopefully translate into a zero SNA rate in the NCEA examination later this month. As shown by the PEP below, this standard has relatively low rates of M and E anyway. (A reminder that SNA grades are not included here, so a PEP is always biased towards the performance of more able students.)

[Image: PEP for this standard]

Future Steps:

In 2017 I think that this approach is worth continuing with. Students in my classes are beginning to respond to meaningful use of technology to support their learning. Next year I will focus more on implementing the higher-order activities, and I will attempt to incorporate such tasks into the unit itself, rather than leaving them as a summary task. Being a summary task this year meant that it was realistically unlikely to be completed under the time pressures of the end of the academic year – it was simply easier to return to a more tried-and-true revision programme.

Technology Supporting Gifted Students

This post was written by Ms Ellen Hampson and Mrs Kelly McBride and was originally posted in the GiftEDnewz e-newsletter from the Professional Association for Gifted Education

Secondary School – overview of technology and current activity

Two concepts come to mind when contemplating technology and its significance within education. Firstly, once hailed as the Holy Grail of innovation within the communications field, technology increasingly holds the auspicious role of ‘global saviour’ when engineered by socially conscious citizens. Experts argue that we are now living in the age of the Anthropocene – the proposed epoch in which humanity has irrevocably altered the planet’s geology and ecosystems. Can the youth of today, who will live in a world where the ‘internet of things’, ‘bio-wearables’ and ‘blockchain’ technology are the norm, turn our influence around and steer our global impact in a new direction towards a more sustainable future aided by innovative technology?


Marie Pellin, 2014

Secondly, technology (characterised by exponential growth) surely needs to be influenced by socially conscious citizens, as alluded to above. For example, the exponential growth of internet technology may be tempered by our socially aware youth favouring net neutrality. Equally, it appears that technology is forcing companies to be better global citizens. “In the age of internet transparency, it seems corporates no longer have anywhere to hide – a spot of corporate social responsibility (CSR) whitewashing is not going to cut it anymore” (Lawson, 2016).

At St Andrew’s College we are aiming for continuous improvement as far as opportunities for ‘technology enablement’ and development of ‘computational thinking’ are concerned. Well supported by our Technology Department’s academic expertise and our ICT Division (headed by Director, Sam McNeill, and e-Learning Integrator, Tom Adams), the additional support we offer GATE students includes:

  • Coding Club
  • Neuroscience Learning Module with participation in the Australasian Brain Bee Competition for Year 11 students
  • Forensic Science and Astrophysics Learning Modules
  • Passion Projects where students have the opportunity to complete coding-based projects
  • online participation in the New Zealand Diplomacy Competition
  • attendance at University of Canterbury public lectures, such as the recent Black Hole lecture
  • meetings with University of Canterbury lecturers
  • access to technology-based opportunities and events such as the recent Singularity University workshop


Students taking part in the Neuroscience module

Future strategies for 2017 include:

  • offering Geographic Information Systems modules as part of the Year 9 and 10 Academic Extension and Enrichment (ACEE) Programmes
  • development of the Coding Club, supported by Tech tutors drawn from industry
  • introduction of a comprehensive robotics programme to bridge our Preparatory School’s excellent programme
  • facilitation and guidance for students wishing to apply for the NASA Space School
  • potential visits to Auckland’s Stardome Observatory and/or the Mt John Observatory
  • online tech learning opportunities such as edX and Coursera [the top specialisations in Coursera are all technology-based]
  • facilitation of Orion’s Evolocity Competition
  • establishment of further connections with Christchurch’s Innovation Precinct as part of the Christchurch Tech Sector Strategy [2015-2025]

In addition we will continue to punctuate our GATE calendar with further ‘SMAC’ opportunities for intellectual growth and sharing of minds, such as expanding the classroom via e-meetings.

The St Andrew’s College Secondary School GATE programme has integrated Technology as a learning area with Philosophy, Sustainability and, increasingly, Global Citizenship.

Digital Image Manipulation in English

Earlier this year I was approached by Ms Tam Yuill Proctor, the Head of Department for English. She was interested in the potential for students to use digital image manipulation during their study of static images. I thought that this was an exciting project to assist with, but immediately recognised that it is an area in which I had very little experience! What was particularly exciting was the potential to expose Year 9 students to the concept and then progressively up-skill them through to Year 13, where the requirements are obviously a lot more challenging.


Challenges in Digital Manipulation

My limited previous experience with students in this area has taught me that students primarily fall into two categories. In any class there will be a small number of students, typically 2-5, who have extensive experience in, and interest in, digital manipulation of images. These students have typically used Photoshop, and are relatively advanced in their capabilities. The second, much larger, group of students have virtually no experience in this field – and they can often be intimidated at the prospect.

Finding a tool

Here at St Andrew’s College we have a range of devices in each classroom as part of our 1:1 program. As an IT team we felt that there were three main criteria that any product we were going to recommend must meet:

  • Able to be used on Mac and Windows laptops
  • Be free to download and use
  • Be complex enough for Year 13 English students

Based on these criteria we decided to investigate the potential of GIMP as a platform for these tasks. Earlier in this post I mentioned the two categories that students fit into. The same is true of staff. I fell, very clearly, into the second category – totally inexperienced. It was great that here was a situation forcing me to upskill in an area, ready to help students investigate and apply the potential gains to be had using such technology to display their understanding of curriculum content. I found GIMP to be intuitive and relatively easy to use, and it was pretty easy to apply its basic manipulation tools.

“It was great that all students were using the same platform and that they had access to technical support.” – Mrs Helaina Coote – English Teacher

Year 13 Task

The focus of the Year 13 unit of work was for students to create an 8-10 minute presentation or visual essay that explores a theme from their film study; in this case Tsotsi. Students were being assessed against Achievement Standard 91477, ‘Create a fluent and coherent visual text which develops, sustains, and structures ideas using verbal and visual language.’

“This standard forces students to develop grit, resilience and perseverance. Progress does not always come easily or immediately.” Mrs Helaina Coote – English Teacher

In previous years many students were attempting to use Photoshop to complete this task, but were becoming bogged down in the detail of the product, with staff frustrated that they did not necessarily have the skills to assist. This year, the decision was made to directly teach students how to use the tool, and support them during class time to use it effectively.

Prior to beginning the task students were introduced to GIMP and instructed in its basic functionality. An important part of this was giving students time to experiment with some of the more fundamental features of the product, such as overlaying images, changing block colours and cropping images.

Having had an introduction, students were then in a position to begin work on their production. What was particularly important here was that students who may have had no experience in digital manipulation felt supported. I predominantly spent time in two classes, taught by Ms Helaina Coote and Ms Phoebe Wright.

Once the students had created a number of different images most of them chose to import them into PowerPoint so that they could add music and animations to ensure that they met the requirements of the assessment task.


For me personally what was particularly interesting was seeing the skill progression and increases in confidence that all students showed. It was also great to see the upskilling of staff as they learnt next to their students. This was echoed by both teachers involved:

“Teacher shows students willingness to learn. It is good for students to see that help is accepted. Students are supported to learn the tool.”

 

Future Challenges

This is a challenging assessment task. On reflection, there were some students who became a little engrossed in the details of each image, particularly as they were learning the tool. These students found it difficult to work fast enough to create the required number of images. Hopefully, the fact that a number of classes ranging from Y9-Y12 were also introduced to GIMP this year will enable those students to approach this task with more fluency as they progress through their English education.

This task is a perfect example of how eLearning is integrated into classrooms here at St Andrew’s College. I believe that as students add to their skills year-on-year we will see further improvement in the complexity and quality of the digital images they are able to create. It is also a great way to support students, and staff, in learning a new tool.

St Andrew’s College Dedication of the Centennial Chapel 25th October 2016

Welcome to the livestream of the St Andrew’s College dedication of the Centennial Chapel – it starts at 10:45am on Tuesday 25th October.

This is not a traditional blog post, however there has been a huge amount of technology and student input into making this livestream happen so we trust that you enjoy it.

#CEM16 Guest Post – Connecting Educators Through TeachMeets

This post was written for the Christchurch Connected Educators blog as part of Connected Educators Month of October 2016. You can read the original post here. A similar post was written for #CEM15 about Mystery Skype which you can read here.

Earlier this year I had the privilege of attending my first ever TeachMeet and it just so happened to be in Melbourne at Ivanhoe Grammar School.  If you’re unsure of what a TeachMeet actually is, you can find more at the website http://www.teachmeet.co.nz but in short:

A TeachMeet is an organised but informal meeting (in the style of an unconference) for teachers to share good practice, practical innovations and personal insights in teaching with technology.

Participants volunteer (via the TeachMeet website) to demonstrate good practice they’ve delivered over the past year, or discuss a product that enhances classroom practice.

Source: Wikipedia


Encouraging connecting at TeachMeet

With the themes of this year’s Connected Educators Month being “connect, innovate and collaborate”, I felt that a summary blog on how TeachMeet Christchurch has gone would be appropriate.

I recognised that teachers are very busy people and wanted to keep the commitment levels for TeachMeet pretty low – a once-per-term meeting that ran for no more than 90 minutes and, in true keeping with the spirit of TeachMeets, each presentation could be no longer than 7 minutes. To facilitate the launch I arranged to host the first two sessions at St Andrew’s College, where I was confident I could drum up some speakers and also a crowd of listeners, and then used an open Google Doc for people to register. You can see the topics and attendees for TeachMeet 0.1 and TeachMeet 0.2.

I was delighted with the turnout for these events and the quality of the presentations from the speakers. Many shared something from a technology / eLearning perspective however the format allows for any educational topic to be shared. Importantly, and in keeping with the theme of connecting, the events were split in half to allow a time for networking with other teachers over a coffee.


Mr Wilj Dekkers from St Andrew’s College presenting at TeachMeet 0.1

As always at events like this, there was good sharing on Twitter of what was being presented via the hashtag #TMChch, and you can see a Twitter recap for TeachMeet 0.1 and TeachMeet 0.2.


A montage of photos from an earlier TeachMeet in 2016

I am pleased that Jeremy Cumming (former teacher at Catholic Cathedral College and now working for the Catholic Education Office) asked to pick up the organisation and hosting of TeachMeet 0.3 that will run on 17th November and be hosted at Villa Maria College. This represents a natural progression and maturing of TeachMeet by sharing the hosting and co-ordinating responsibilities amongst teachers and schools which will naturally shape the themes and focus of each session. Ultimately, this is key for the ongoing success of TeachMeet – to be sustainable there needs to be collective responsibilities and a desire amongst teachers to want to connect with each other and share best practice from their classroom, things they are experimenting with, or research they are undertaking in post-graduate studies.

When teachers maintain a mind-set of being lifelong learners then I believe a natural outworking of this is wanting to connect at various sessions like TeachMeet and others that are routinely organised by the teaching community in wider Canterbury.

If you have never been to a TeachMeet before, can I encourage you to consider signing up at www.teachmeet.co.nz for TeachMeet 0.3, which will be the last for 2016, but hopefully just one in a long line of many more where teachers can remain connected.

Using Analytics To Profile Classes

A 12-minute guided tour of the new Class Profile reports in PowerBI.

Over the last twelve months St Andrew’s College has invested significant resourcing into developing web-based reporting tools, delivered via Microsoft’s PowerBI.com interface, to be used by teachers, managers and administration staff alike. It’s worth reading some earlier posts about the rationale for going down this path; some of the key reasons we settled on this solution include:

  • It’s browser based – you can access it from “anywhere” and see live data. You can also bookmark certain reports in your browser for near instant access.
  • There is also an app available (iPhone/iPad/Android/Windows10) so the data is accessible anytime / anywhere
  • We can tweak reports / visuals quickly and easily, based off feedback from stakeholders
  • Being browser based, you don’t need a local file on your computer that is “out of date” once a new version with improved features is built. What you see is always the “latest version”
  • It’s part of our existing Office365 Suite, so our existing username/password logs you into the reports.
  • Security permissions are centrally managed based off AD users and role-based groups, including use of SQL Row Level Security.
  • It connects to our on-premise MS-SQL Server, allowing for scheduled data updates (hourly / daily).

Throughout Term 3 the team focused on delivering a new set of reports for Mr Dean McKenzie, one of the Assistant Principals at the College, with responsibilities for Data Analysis. He had provided some concept designs for how he would like the reports to look, along with the location of the majority of the data in our Student Management System (Synergetic). Additionally, there had been changes to how the Grade Credit Average (GCA) was going to be calculated moving forward, which would see individual subjects have a GCA calculated for the first time, along with more rigid definitions of how various credits would be counted.

All of this logic had to be encoded into the ETL process that transferred the raw data from Synergetic’s MS-SQL database and into our Data Warehouse, automatically calculating the results on a daily basis and making them available to staff via the web interface of PowerBI. The end result is the following pages in a single report:
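The post does not spell out the GCA formula itself, so as an illustration only, a grade credit average of this kind is typically a credit-weighted mean of NCEA grades. The sketch below assumes a grade-point mapping (E=4, M=3, A=2, N=0) and function names that are my own, not the College's actual ETL logic:

```python
# Illustrative sketch only: a credit-weighted Grade Credit Average (GCA).
# The grade-point mapping (E=4, M=3, A=2, N=0) and names here are
# assumptions for illustration, not St Andrew's actual definitions.

GRADE_POINTS = {"E": 4, "M": 3, "A": 2, "N": 0}

def grade_credit_average(results):
    """results: list of (grade, credits) tuples for one student/subject."""
    total_credits = sum(credits for _, credits in results)
    if total_credits == 0:
        return None  # no completed assessment yet
    weighted = sum(GRADE_POINTS[grade] * credits for grade, credits in results)
    return round(weighted / total_credits, 2)

# Example: three standards worth 4, 3 and 6 credits
print(grade_credit_average([("E", 4), ("M", 3), ("A", 6)]))  # 2.85
```

In an ETL pipeline this computation would run per student per subject on each scheduled refresh, with the resulting scores written to the warehouse table that PowerBI reads.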

Subject GCAs Per Student:

 Showing the results for a student in the current year and the previous year (click to enlarge)

This report is designed to allow a teacher to quickly select a student in their class and compare their GCA subject by subject, along with seeing how they performed the previous year. If you click the left hand image above to enlarge you will see numbers which represent:

  1. A selector for the current or previous year of GCA data for a student
  2. The teacher code (for full time classroom teachers this is automatically locked to their username meaning they only see the students in their classes. For Academic Deans or managers, they can see across a wider set of students).
  3. A year level filter, allowing a teacher to quickly narrow the selection of students down by the year level e.g. their Yr12 Maths students or Yr13 History students.
  4. The list of students arranged alphabetically that are taught by the teacher in the year level they have selected. Note these are colour coded pink/blue to give a visual cue to the teacher if they are looking for a male/female student in their class.
  5. A table showing each subject taken by the selected student, and their GCA (either current year or previous year depending on selection in #1 above)
  6. A bar graph visually displaying the same data as #5 but designed to quickly identify subjects of particular strength or weakness for the selected student. Note that the subjects are listed alphabetically and not by highest GCA to lowest, allowing for a “cityscape” effect.
  7. The name of the current student that is selected and the class code of the teacher who is browsing the report (useful if a teacher happens to teach a student a number of different classes).

The aim of this report is to allow a classroom teacher to quickly scan through the students in their class and identify their relative strengths/weaknesses in different subjects. It also enables them to answer a common question of teachers “I’ve a student who I think is underperforming in my class – how are they doing in other classes?”

GCA – Then and Now:


This report allows a teacher to quickly see the individual students in their class ranked by GCA from highest to lowest and compare the current year GCA in the teacher’s subject with the student’s overall GCA from the previous year. This allows a teacher, at a glance, to see who are their best performing students based off completed assessment but to also pick up if there is significant variance between previous and current performance.

In the above example, the top bar graph shows the 4th ranked student in the class (in pink) was actually the 6th ranked student (relative to the class) the previous year. Whilst this is a very small sample size, what this can show is a student who is possibly underperforming or showing improved performance relative to the students in their class – all helpful information for a teacher to consider.

The red numbers in the report are:

  1. Showing the classes taught by the logged in teacher. Note that this also includes co-curricular options that the teacher coaches/manages, allowing them to review academic performance for all students that they have contact time with (this was actually the #1 request we had from teachers after launching the Tutor Quadrant Dashboard earlier this year – the ability to see results for students in all areas of their involvement at school).
  2. A gender score card. This is simply showing the number of males / females in the class.
  3. Bar graph (ranked) showing students by GCA, highest to lowest for the subject taught by the teacher and in the current year.
  4. Bar graph (ranked) showing the same students but their previous year GCA across all subjects, again ranked highest to lowest.
  5. A table giving a breakdown of the students in the class and their GCA in individual subjects. This is helpful if a teacher wants to compare how a student is doing in a similar subject, e.g. an English teacher seeing if a student is performing comparably in other literacy-intensive subjects such as History.
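The rank comparison this report surfaces visually amounts to a simple computation; a minimal sketch (with hypothetical data and function names, not the actual PowerBI implementation) of flagging students whose class rank has shifted between years:

```python
# Illustrative sketch: compare each student's GCA rank within a class
# this year against last year. Data and names are hypothetical; the
# actual report presents this visually via ranked bar graphs in PowerBI.

def rank_by_gca(gcas):
    """gcas: dict of student -> GCA. Returns student -> rank (1 = highest)."""
    ordered = sorted(gcas, key=gcas.get, reverse=True)
    return {student: i + 1 for i, student in enumerate(ordered)}

def rank_shifts(current, previous):
    """Return student -> (previous rank - current rank); positive = improved."""
    cur, prev = rank_by_gca(current), rank_by_gca(previous)
    return {s: prev[s] - cur[s] for s in cur if s in prev}

current = {"Ana": 3.1, "Ben": 2.7, "Cara": 3.5}
previous = {"Ana": 2.4, "Ben": 3.0, "Cara": 3.2}
print(rank_shifts(current, previous))  # {'Cara': 0, 'Ana': 1, 'Ben': -1}
```

A large positive or negative shift is exactly the "significant variance between previous and current performance" the report is designed to let a teacher spot at a glance.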

Comparative Scores:

This was perhaps one of the most complex and ambitious pages to put together, as it potentially combines academic data from Yr9 Entrance Testing, PAT results (Yr9-10), MidYis results (Yr9-11) and NCEA data by GCA. Additionally, it needed to give a breakdown of priority learners based on identified learning needs, as well as ethnicity.

The real challenge was thrown down by Mr McKenzie when he said in an ideal world he would like a teacher to be able to select from any of the historical data and have it displayed on the same graph. We explored a wide range of ideas on how we could best implement this vision and in the end the following is what was achieved:

 Showing the results for a Yr13 Calculus class; on the left is the students’ Yr9 English Entrance testing and on the right their Yr13 Calculus GCA (click to enlarge)

Visually, there is a lot going on in this report and it will take the user quite some time to fully understand how best to extract what they are looking for. For this reason, all pages on these reports have user guides in text boxes and we have labelled each selection field numerically in the order that a teacher should select their data. This helps guide them through the process. In the left hand screenshot above (click to enlarge) I have added red numbers to highlight features of this report:

  1. The academic “score type” and “sub-score type” the teacher wants to see. If a teacher chose Yr9 PAT then the sub-score type would automatically display what options were available (i.e. English, Maths and Vocabulary). Similarly, if a teacher chose GCA as the score type they could choose the GCA for whatever subject they wished to check. The recent addition of search boxes to PowerBI makes this process far easier to manage when there are a lot of options to choose from.
  2. Priority Learners – this is still being developed, but for now it highlights any students with data recorded in Synergetic, from diagnosis through to strategies to use in the classroom to support their learning.
  3. Ethnicity breakdown for the students in the class displayed in a pie chart and table below, along with the names of Māori and Pasifika students in the two boxes in the bottom right of the report.
  4. The bar chart that shows the students ranked by whichever score type the teacher has selected. Note that there are no axes on this graph, a necessary requirement given the academic data does not always share identical measures/scores. However, by placing the cursor over a student you can easily see their score e.g. a stanine for a PAT test, or a 2 decimal place GCA score for NCEA results. Additionally, there are visual cues on this graph that further help identify students with listed learning support needs or who identify as Māori or Pasifika.

A reminder that all of this data refreshes automatically each night so the teacher is always seeing the latest information on their students. Should a student leave/join the class the data is refreshed to reflect this.

NCEA Results Analysis By Standard:

One of the most requested features from the Senior Leadership Group and Heads of Department at St Andrew’s is an easy way to compare, standard by standard, how our students and teachers performed compared to similar schools around New Zealand (similar schools have been defined as Decile 8-10). One of the challenges has been getting access to neatly formatted data that contains all NCEA standards, not just the individual results which could be downloaded from the NZQA website.

After working with NZQA’s statistics team, we have been able to obtain this data and run it through our ETL process into the data warehouse, thus allowing this comparison to be easily done by classroom teachers:


Again, a classroom teacher would select a class they teach, and then narrow it down to the NCEA standard they wish to compare, by following the numerical workflow selections on the left-hand side of the report. Once completed, this presents four horizontal bar charts that show:

  • Top left = All students being compared, the top bar is comparative schools nationally (all students who sat this NCEA standard in Decile 8-10 schools). The middle bar is the performance of the St Andrew’s cohort, in this case all other Yr12 history students taught by all teachers. The bottom bar is the performance of the students in this teacher’s class.
  • Bottom left = Performance of Māori/Pasifika students (again broken down by national data, cohort and individual classroom teacher).
  • Top right = male students.
  • Bottom right = female students.
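Each bar in these charts is, in essence, a percentage distribution of grades across one group (national, cohort, or class); a rough sketch with made-up data:

```python
# Illustrative sketch: the percentage grade distribution behind one
# comparison bar (national / cohort / class). Grades and data are made up;
# the real report is built from NZQA data in the warehouse.

from collections import Counter

GRADE_ORDER = ["N", "A", "M", "E"]  # Not Achieved, Achieved, Merit, Excellence

def grade_distribution(grades):
    """Return {grade: percentage of group} for one bar in the chart."""
    counts = Counter(grades)
    total = len(grades)
    return {g: round(100 * counts[g] / total, 1) for g in GRADE_ORDER}

class_grades = ["E", "M", "M", "A", "A", "A", "N", "M"]
print(grade_distribution(class_grades))
# {'N': 12.5, 'A': 37.5, 'M': 37.5, 'E': 12.5}
```

Computing the same distribution for the national Decile 8-10 group, the school cohort, and the individual class yields the three stacked bars in each panel.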

The results for these standards can be filtered to show either internal assessments only, or formative assessment results for not-yet-sat external exams, providing students with a comparative score against the national data for that external standard from the previous year. This could work as a motivator for them before their external exams.

The red numbers in the screenshot are:

  1. A search box for the teacher to select the class code they want to analyse (again, searching makes this really easy). There are two pre-selected options visible, which are the previous year’s national data and the StAC cohort data. A teacher could, in theory, turn these off if they simply want to display their own class results and not compare them.
  2. Once a class is selected, this table automatically shows only standards that have a result recorded in the Synergetic database. This helps a teacher know which standard number to search for.
  3. Using the knowledge above, the teacher searches for the standard they want to analyse e.g. “HIST2” would show all Level 2 history standards allowing a teacher to quickly click through their results.
  4. The comparative graphs (as explained above). One of the neat features of this is if a teacher wanted to drill down and see which students in their class gained a certain result, they need only click the result and the list of students in the table filters immediately:

By clicking the silver “merit” grade in the bottom right graph (females) the table down the bottom filters to show the name of the student(s), allowing a teacher to quickly search through student names by result.

Detailed NCEA Results By Standard:

This final report is another one designed to quickly profile the range of ability of the students a teacher sees. However, it also delivers on one of the other most common requests from teachers, e.g. “I want to know how my Level 3 Geography students did in Level 2 Geography at the start of the year, or in an internal assessment, so I can better differentiate the teaching to meet their needs.” To date, we have struggled to graphically display a ranked past/present comparison tool for teachers, and the security relationships are actually quite complex (just because you’re teaching a student for Level 3 Geography, for instance, does not mean you were their Level 2 Geography teacher).

This has now been displayed in the following reports:

 Showing the results for a Yr13 Geography class internal assessment 3.3 (91428) on the left; on the right is the students’ performance from the previous year for the internal assessment 2.3 (91242). (click to enlarge)

These reports contain a number of visual cues. In keeping with all our NCEA reporting in PowerBI, the colour coding is consistent: Gold = Excellence; Silver = Merit; Bronze = Achievement; Red = Not Achieved. Additionally, the bars are varied in height and ranked highest to lowest allowing a teacher to very quickly pick up the grade spread of their class at a glance. The red numbers in the screenshot on the left (click to enlarge) are:

  1. The teacher selects the NCEA standard they wish to analyse
  2. They select which of their classes they wish to filter by (many of our senior teachers teach two classes of the same year level/subject, so this is helpful). The list of classes is pre-populated automatically based on the username the teacher signs in with, making this a very simple process.
  3. The bar chart orders the students by result, highest to lowest (as explained above).

Concluding Thoughts:

As evidenced above, a huge amount of work and effort has gone into these reports and they certainly represent the progression of thought over the last few years in terms of what is the key data we need to be able to provide to classroom teachers. A key objective of this analytics project at St Andrew’s is to provide easy access to the data for teachers on an “anytime, anywhere” basis and for it to be easily comprehensible.

As more teachers start to use these reports on a regular basis I anticipate feedback will flow and new feature requests will emerge. The beauty of the setup currently is we can release this version of reporting to teachers and then easily add new features which will become automatically available to teachers next time they log in – there is no need to update or install new files for the teacher. To further support teachers, we are now embedding a “Tour of the Dashboard” video into the landing page of each new report:


One of the great things about being browser based is the ability to embed third party content, in this case a YouTube video explaining to teachers how they can use this new report.

These embedded videos mean that should teachers forget how to use the report, or are new to the College, they can essentially “self-train” on how they can use the report with their classes.

I am genuinely excited about this level of reporting and the benefits it will have not just for our teachers, but for our students too!