Our approach to dashboards

Published on 31 Dec 2023 by Emma Eynon

What is it like to work with us on SkySpark dashboards?

We have been lucky enough to work with many clients over the past 4 years on SkySpark development projects. Most of these have involved visualisations and dashboards, which has given our team a wealth of experience on the subject.

For those of you who have not yet worked with us, or perhaps with anyone else on SkySpark development, we'd like to share our approach and best practices for planning and creating new visualisations.

Phase 1. Requirements gathering

Sounds pretty simple when you start out. Many people believe a dashboard should be a "one-stop shop" where you can show energy analytics next to your fault detection and monitoring sections.

... and of course you can, but ...

For a high quality dashboard, there are several areas to really consider in depth:

a.) Data use - what will this data be used for?

We find the best dashboards are clear in purpose: have an analytical view for your executives, and keep a separate monitoring view for your engineering team. Not only does this streamline the data to show (with the associated rules and Axon functions), but it also helps to focus the user group.

Gotcha - some previous clients have over-engineered their dashboards with mixed purposes and too much information, and as a result found their users don't use them!

b.) Audience - who will benefit from the displayed data practically?

Tailoring designs for an actual user group makes it more likely they will adopt your dashboard. Consider your user journeys and their SkySpark experience.

Do the executives in a corporate suite really need to see current faults and alarms? We find they generally just want the key statistics - saving them time on investigation and clicking around. On-site engineers responding to a fault may just want large, simple alarms on a mobile device, from which they can easily navigate to other screens, such as Recommended Actions.

Gotcha - mixing the user journeys often leads to a poor experience for all parties. Previous clients have mixed the data use, visuals, and web page design for different audiences. Not only did they receive negative feedback from users, but they then found it hard to accommodate feature changes and scalability.

c.) Visuals - how much does SkySpark need to change visually?

In our experience, this tends to be the number one reason for clients to request custom dashboards and visualisations in SkySpark. But how much do you really need to change?

Working with digital media can be a minefield. As an example, displaying a simple photo is not so simple when you need to display the visuals both on a mobile screen and on a boardroom conference screen. Investing in a custom SkySpark view becomes less effective if your visuals are poor quality or become unusable on different device types.

The goal is to have high quality visuals, no matter how your user is accessing these displays. To do this, you need to create appropriate digital media files to the highest quality and standards from the outset.

This may be "bread and butter" for a computer services company like ours, but we find that many Buildings and Controls companies lack the internal competencies to handle this in-house.

Gotcha - using poor quality or inappropriate digital image files can reduce the impact of your shiny new dashboard. In our earlier days, clients would provide us with a suite of ready-made image files, only for us to find they were of too poor a quality to use. Creating replacement image files can impact project delivery time, so we now always check ahead!

d.) Data feeds - can you supply the data you want to show?

Dashboards rarely show pure direct data feeds from points, like a single sensor temperature. They tend to involve rolled-up analytical data or filtered conditional data, such as how much a building's electricity has cost over a specified time period.

This kind of data requires programmatic rules in SkySpark to calculate these figures. In fact, it's the whole basis of SkySpark - giving us powerful and intelligent data for analytics and monitoring.
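
The rollup behind a figure like "electricity cost for a period" can be sketched language-neutrally. Below is a minimal Python illustration - not Axon, and the interval readings and flat tariff are entirely hypothetical - of turning raw interval meter data into the single cost figure a dashboard would display:

```python
from datetime import datetime, timedelta

# Hypothetical interval meter readings: (timestamp, kWh used in that interval).
# In SkySpark this would come from a point's history; it is hard-coded here
# purely to illustrate the rollup arithmetic.
readings = [
    (datetime(2023, 12, 1, 0, 0) + timedelta(hours=i), 1.5)
    for i in range(24)
]

TARIFF_PER_KWH = 0.30  # assumed flat tariff, for illustration only

def electricity_cost(readings, start, end, tariff):
    """Sum the kWh readings falling within [start, end) and price them."""
    kwh = sum(v for ts, v in readings if start <= ts < end)
    return kwh * tariff

cost = electricity_cost(
    readings,
    start=datetime(2023, 12, 1, 0, 0),
    end=datetime(2023, 12, 2, 0, 0),
    tariff=TARIFF_PER_KWH,
)
print(round(cost, 2))  # 24 intervals x 1.5 kWh x 0.30 = 10.8
```

In a real deployment the equivalent logic lives in your Axon functions, which is exactly why their quality and consistency matter so much to the dashboard built on top of them.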

Companies often make the mistake of designing fantastical new dashboards without really considering whether they can reliably and consistently obtain the data to show. Often, the varying code contributions from different programmers over time can overcomplicate and clutter your SkySpark Axon functions. Not all of these data issues are obvious unless you check your linting view, and they can cause problems later down the line.

It is ALWAYS worth investing in a data review and "checkup" to groom and improve your SkySpark Axon code. There is no better time to do this than before a new dashboard launch!

Gotcha - unoptimised Axon code in SkySpark can cause unreliable data, slow performance, and error messages on your dashboards.

The good news is that we guide all of our clients through these considerations in the Gathering Requirements phase. We can do the heavy lifting for any of these preparatory tasks and work with your team to share best practices for the future.

Phase 2. Mockup designs

Now that we know what you need, it's our turn to impress and excite you about your options.

At this stage, we work out the most effective technical architecture using as much native functionality as possible.

We then wireframe the proposed screens and graphics for building to give you appropriate time to review, discuss further with us, and amend if needed. Our designs will also offer our own recommended User Interface (UI) and User Experience (UX) options.

Alternatively at this stage, we may instead review and adapt designs provided by clients in fine detail, offering our own revisions until we hit the "sweet spot".

Phase 3. Software build and collaboration

We never build and test against a live SkySpark environment - it's bad practice for many reasons. Instead, we will ask you for a sample of your SkySpark data which should include all of the typical data examples your dashboard will need to handle.

We work "offline" to build each widget; first for functionality, and then for form. This means we make sure the widget will work with all variances of sampled data before we refine the graphics and iconography.
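
The "functionality first" step above can be shown with a toy example. Suppose a widget renders a temperature reading; before refining any graphics, we would exercise its logic against the variances a sampled dataset typically contains. This is a hypothetical Python sketch - the function, thresholds, and sample values are ours for illustration, not SkySpark code:

```python
def format_temp(value, lo=-40.0, hi=60.0):
    """Render a sensor reading for a widget, handling the messy cases a
    real data sample contains: missing points and out-of-range spikes."""
    if value is None:
        return "--"       # missing sample: show a placeholder, not an error
    if not (lo <= value <= hi):
        return "fault"    # sensor spike: flag it rather than plot it
    return f"{value:.1f} degC"

# Exercise the widget logic against typical variances in the sampled data
samples = [21.37, None, 999.0, -5.0]
print([format_temp(v) for v in samples])
```

Only once every variant in the sample data renders sensibly do we move on to the widget's form: graphics, layout, and iconography.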

When we are happy with each developed widget (each one effectively an Agile development sprint), we invite you to review and test it in your own development environment while we concurrently start on the next one.

This process ensures we capture all relevant feedback at an early stage and keep development flowing at a steady rate. We find it often prompts further thought and new feature requests, which can also be accommodated much more easily at this point!

Phase 4. Delivery and support

Our support for your business doesn't end after we send over the final software revision. The software is now yours to keep, and we're always on hand to help with more SkySpark assistance.

Work with us over the longer term and reap the benefits of more efficient SkySpark data, optimised performance handling, and future-proofing.
