
Ask HN: Lessons learned from implementing user-facing analytics / dashboards?

2023-09-28 05:53:01

I’ve worked on many analytics projects across a number of companies as a consultant. I’m a big believer in “decision support systems”. Find out what decisions your customers need to make, repeatedly, to do their job. Quantify the heuristics and visualize that information (and that information only) in an easy-to-consume manner. More often than not that’s an email or PDF. Another advantage is that by supporting the business users, they feel less threatened by the changes or technology.

I think “self-serve” analytics is silly, the idea that you put all of the data in front of people and they’ll derive “insights”. That’s not how normal people or data work. We just had a discussion on HN the other day about Facebook’s Prophet, and its pitfalls. Meanwhile we expect Joe in sales to be able to identify useful trends on a chart he made. Every company needs to forecast, regardless of their sophistication. That stuff needs to be defined by the right people and given to the users.

I agree that the terms “self-service analytics” (especially the ‘analytics’ part) and “insights” convey the wrong image of the real needs of business users out there. It mixes ‘strategic insights’ with ‘operational needs’. And I think self-service needs to be about operationalizing data. Sales managers are not necessarily looking to ‘analyze’ data or ‘get an insight’. They need answers from data to manage their team. They need to track well-defined KPIs. See how their salespeople are doing and be able to have a productive meeting to tell them what they are neglecting. Customer success people need to “pull some data real quick” on the usage of the product by a certain client before a meeting.

These things happen all the time. And yet most companies out there think that the solution is to just build a bunch of dashboards, foreseeing what everyone will ask in the future. And then nobody checks the dashboards. Or finds the right one. And then they have a team of SQL translators pulling data for ad-hoc questions. That’s silly IMO.

I’m obviously biased as a founder of a self-service analytics company based on AI (https://www.veezoo.com). However that is simply my 2 cents on a subject I actually care about.

Agree with both of you, and would add that knowing who is using the system, and what they need to get out of it is really the key to making them shine.

Too many systems have too much data for too many customer categories and end up being useless to everybody.

This is it, really. I remember back during a previous section of my career when I was running BI for a manufacturing company. We were asked to web-ify some legacy reports that either ran on desktops using Access & Excel or were on older BI products (Cognos). It was shocking — at the time (I was naive) — how many business requirements were essentially “replicate Excel in a browser”, and completely divorced from the actual business processes and decisions that needed to be made.

Also, it might surprise a lot of less experienced developers just how many reporting tools are actually pieces of a workflow, not just reports. If you sniff this out during the requirements phase, do your best to convert these reports into features of an actual workflow app/system rather than allow them to persist as standalone reports.

>either ran on desktops using Access & Excel or were on older BI products (Cognos). It was shocking — at the time (I was naive)

I think some people have a skewed view if they do most of their work with VC funded/SV companies. The average person at these companies is way more data savvy than average.

But there are so many companies out there that make a ton of money and have data-unsophisticated-but-domain-wise users, and old systems. Low hanging fruit.

Have you ever had a decision maker who struggles to articulate what business decisions they want to improve? How do you handle that?

I’ve heard pretty high-level managers respond to that question with things like “we were hoping your data would tell us”, and I’m not sure what to make of it.

>Have you ever had a decision maker who struggles to articulate what business decisions they want to improve? How do you handle that?

Hah, 90% of the time. I think a big part of being good at this job is being able to coax that information out of people.

You need a process of drilling down, kind of like the 5 Whys[0]. You want to make more profits, right? That means we need to either increase revenues or decrease costs. Are we measuring all these things (you’d be surprised at the number of seemingly successful companies who can’t)? Okay, how do we affect revenue? By increasing the number of users or increasing the revenue per user. Are we measuring those things? And on and on. It’s a perfect way to iterate, and as the company matures it can be infinitely more and more sophisticated. For lower level people, sometimes it means sitting there and watching them do their job.

[0]https://en.wikipedia.org/wiki/Five_whys

This is the right mindset for sure. Most of the time the initial question is very loosely defined, but actually having these conversations with the people who “want data”, and helping them structure their thinking is also a hugely rewarding part of working in data and analytics, and will help you advance in your career.

It can be easy to have a cynical view of what people are asking for, but in my experience there is often real value you can uncover.

One thing which helped me a lot is having a decent understanding of accounting and finance. A fun, and fairly quick, way to develop that is by taking a course on financial modelling (in excel). Modelling a business in a spreadsheet is a lot of fun, and it helps you build good intuition on the underlying “physics” of how a business makes money.

> More often than not that’s an email or PDF

> I think “self-serve” analytics is silly, the idea that you put all of the data in front of people and they’ll derive “insights”. That’s not how normal people or data work

So well said. It doesn’t shock me anymore when someone asks for a succinct summary or a PDF version rather than digging through dashboards on their own. In my company, we have a user-facing analytics product, and we added the option to take a PDF snapshot on a recurring basis and send it via email!

I’ll enjoy watching this thread evolve. Some thoughts from my experience:

– Everyone asks to translate simple spreadsheets and Excel charts/graphs into dashboards in your BI tool of choice. As soon as it’s there, they’ll ask you why they can’t export the data to manage it themselves. This vicious cycle can sometimes be stopped but is a slow-motion drag on productivity in lots of orgs.

– Build in validations, and/or work on ways to check the dashboard (a minimal sketch follows this list). Dashboards sometimes put their builders and consumers on auto-pilot. The dashboard “must be right” but could easily have a bug or inaccuracy for weeks/months/etc. that isn’t obvious without some external validation.

– The dashboard never has the “right” metrics – users will continue asking for changes. Be your own best advocate; saying no is a way of gauging the importance of the ask.

– Related: always ask why about everything you’re building into or modifying in dashboards. Business users often ask for things without an ounce of rationale.

– Related: taking away is harder than not doing at all!
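On the validations bullet above: a minimal sketch of an external cross-check, assuming a hypothetical DB connection and table names (orders as the raw source of truth, mart_daily_revenue as the table feeding the dashboard).

    import pandas as pd

    # Hedged sketch: cross-check a dashboard aggregate against the raw source.
    # "conn", "orders" and "mart_daily_revenue" are hypothetical names.
    def check_revenue_reconciles(conn, day: str, tolerance: float = 0.01) -> None:
        source = pd.read_sql(
            "SELECT SUM(amount) AS revenue FROM orders WHERE order_date = %(day)s",
            conn, params={"day": day})["revenue"].iloc[0]
        mart = pd.read_sql(
            "SELECT revenue FROM mart_daily_revenue WHERE day = %(day)s",
            conn, params={"day": day})["revenue"].iloc[0]
        # Alert (here: raise) when the dashboard drifts from the source of truth.
        if abs(source - mart) > tolerance * max(abs(source), 1.0):
            raise ValueError(f"{day}: dashboard says {mart}, source says {source}")

Run on a schedule, a check like this catches the silent weeks-long inaccuracies before a consumer does.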

Finally, I think most dashboards miss one fundamental point. Imagine you’re the CEO/COO and you’ve got this beautiful 3 or 4-chart dashboard in front of you. What should you know about what you’re seeing? What’s the succinct summary?

I like building in spots to write 2-3 sentence executive summaries.

I work closely on BI projects but from a finance perspective. The concept I like to explain to the BI teams is that the dashboard is always just a snapshot of “what” is happening. But the underlying base level data is always needed to understand “why” it’s happening. And without the why, there’s no actual intelligence gain.

Take a metric like Average Order Value (AOV). It may be defined as total sales / order quantity. But as that metric is used, it’s often being compared to something like last year, last month, or a plan, and anyone interested in that number is really interested in understanding why it has changed from some other point in time/scenario.

For that, you actually need to bring in line item details behind orders as each order has multiple products/skus and they likely sold at different prices from a year ago or what was expected in a plan. An analysis of this has a name, price-volume-mix analysis or PVM.
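For illustration, a minimal pandas sketch of the per-SKU decomposition; the column names are assumptions. The two effects sum exactly to the revenue change, and a mix effect falls out when you contrast this SKU-level view with the aggregate.

    import pandas as pd

    # Hedged sketch of a price-volume decomposition from line-item data.
    # Assumes one frame per period with columns: sku, price, qty.
    def pvm(prev: pd.DataFrame, curr: pd.DataFrame) -> pd.DataFrame:
        m = prev.set_index("sku").join(
            curr.set_index("sku"), lsuffix="_0", rsuffix="_1", how="outer").fillna(0)
        # Exact identity: p1*q1 - p0*q0 == (p1 - p0)*q1 + (q1 - q0)*p0
        m["price_effect"] = (m["price_1"] - m["price_0"]) * m["qty_1"]
        m["volume_effect"] = (m["qty_1"] - m["qty_0"]) * m["price_0"]
        m["revenue_change"] = m["price_1"] * m["qty_1"] - m["price_0"] * m["qty_0"]
        return m[["price_effect", "volume_effect", "revenue_change"]]

None of this is possible if the BI layer only keeps the aggregates, which is exactly the point below.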

I always seem to have to explain this to BI teams when I join a new company and am seeking data. I’m currently going through it with a BI team; apparently the BI tool doesn’t store this information. It only stores aggregate values, so it’s not even possible to get base-level data for analysis (without major architectural changes). I don’t know if that’s normal in BI or was an implementation decision at some point, but I’ve come across this same thing at a handful of companies, and as I said, I really have to drive this concept home for those teams. When I ask for it, I’m usually met with a “why would you need that info / give us a use case”. Which means they don’t even understand how un-intelligent their BI tool is, or why the execs likely aren’t feeling like investing in BI has been worthwhile (e.g. ever build a dashboard that then goes unused? It probably wasn’t perceived as useful for some reason like this).

This could be more concisely put as: understand your end users’ needs. What people ask for is often different from what they need. If they ask for AOV metrics, they’re really saying “I need to understand AOV”, and that’s done via PVM analysis.

Similarly tilted towards finance. I specialize in what I’ll call decision analytics for insurance underwriters.

> Which means they don’t even understand how un-intelligent their BI tool is, or why the execs likely aren’t feeling like investing in BI has been worthwhile

And this relates to what I was thinking about in my first comment. I was once conversing with the COO of my company (at my last job, a 1000+ person company) and asked him if he thought more concise requests for things would drive productivity. He, point blank, said: “sometimes I don’t even know what I’m asking for”

I’ve remembered that moment for years. In so many situations, the actual BI/dashboard is the least important part of the puzzle. Instead it’s all of the conversation and discovery to understand the real need(s)

> He, point blank, said: “sometimes I don’t even know what I’m asking for”

Totally relate to that AND I’m often on the receiving end of those questions in a live setting (e.g. board/exec meetings). Funny to stumble on this, because just last week I told someone on our BI team that there is no single “use case” I can lay out. The use case is this: assume I need to answer any random question that comes up. I need analytical enablement, not a fancy dashboard, in most instances. It’s not to say dashboards don’t have their place, but they’re just the easily digestible summary of underlying data that’s meant to highlight areas and raise those questions about “why…”

Brilliant summary – mirrors my thoughts and experience quite closely.

Validation/testing has always been a challenge, especially given that dashboards are by definition quite “full stack” implementations where testing just the front end or back end is not sufficient and testing both in isolation can also often be challenging due to the huge possible variations in input data.

Mocking data is also hard because dashboards may also lean a lot on database-side calculations/filtering.

All of this has led me to take quite a full-fat approach to testing dashboards: using a real DB populated with test data and testing the complete application stack (driven by something like Playwright or Cypress), as well as more granular unit tests where a mocked data layer may be used.
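To illustrate, a hedged Playwright (Python) sketch of such a test; the URL, the data-testid and the seeded total are assumptions, and the test database is presumed to have been populated by a fixture beforehand.

    from playwright.sync_api import sync_playwright

    # Hedged sketch: full-stack dashboard test against a real test DB.
    # Assumes a fixture has already seeded orders totalling $350.
    def test_revenue_kpi_end_to_end() -> None:
        with sync_playwright() as p:
            browser = p.chromium.launch()
            page = browser.new_page()
            page.goto("http://localhost:3000/dashboard")  # hypothetical app URL
            # The KPI card must reflect the seeded data exactly.
            kpi = page.locator("[data-testid=total-revenue]")
            assert kpi.inner_text().strip() == "$350"
            browser.close()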

I’m also looking at introducing visual regression tests next time I work on this kind of thing. The visual aspects of dashboards can easily drift over time even if the data is correct. You’re often theming charting libraries, for example, and the theme’s consistency can drift slightly if you update the library without really checking every detail of the visual appearance/layout each time. Or you may not even notice the “visual drift”…

“Business users often ask for things without an ounce of rationale.”

Deserves extra upvotes just for this statement.

This has always been painful to me working in the data analysis and reporting space. When I get many requests for dashboards or reports that lack an answer to the question of “how will this be used?”, I seem to find the requesting groups are cost-centers in the larger organization and are somewhat obsessed with processes and procedures.

This is rarely a good group to build a career with . . .

> Finally, I think most dashboards miss one fundamental point. Imagine you’re the CEO/COO and you’ve got this beautiful 3 or 4-chart dashboard in front of you. What should you know about what you’re seeing? What’s the succinct summary?

Having been on both sides of this, I think the challenge is that the CEO/COO’s job is to figure out “what should we do about this?”, which is the right approach to coming up with that summary (it’s not just “here’s a text version of the chart”). And the corollary challenge is that, in most cases, non-technical people with domain knowledge are the ones who need to produce the analysis: so any feature-incomplete dashboard is going to stymie them, and any general framework that requires a technical person to step in for code or configuration is going to slow the process to a crawl.

It’s the rule (not the exception) that (especially if things are going poorly) the next step is asking more questions, which involves investigating something else in more detail. A dashboard, however pretty, is as useless as a doorknob if it doesn’t have the needed information.

I have found that dashboards per se are always great as the high-level KPI trackers, like the things you would consider hanging on a wall in an office (e.g. “revenue growth this month” or “new customers acquired”). You’ll always want to know that information, and many people in unrelated departments need to have that information shared with them.

The other helpful area is deep-dive, domain-specific analytics programs, like for example Google Analytics, where there is a very full featureset for non-technical marketing people to go in and drill down to answer questions. The UI/UX designers of that product have spent years honing and A/B testing which types of graphs to show where, and mapping out how to have people click around to find what they’re looking for, to the point it is pretty easy for non-technical people. They even have courses and certifications on how to use the system.

Organizations that try to internally build a feature-complete system like google analytics for a specific domain need to consider it like building an entire software product (even if there’s a general low/no-code BI SaaS to assist) because you’ll need collaboration between general technical experts and non-technical stakeholders with changing and vague requirements. It can be done, but likely only with years of investment and UI/UX research, just like any other software product that solves a domain problem well. In practice: millions of dollars.

Technologists often forget that Excel *is* a Turing-complete programming language (and it’s a functional programming paradigm too!). If an org is not committed to spending years and millions of dollars on deep-dive analytics for a specific domain, the right choice is almost always using a commercial analytics system for that domain that costs less than the internal build, or embracing the trusty spreadsheet.

>I think the challenge is that the CEO/COO’s job is to figure out “what should we do about this?

Totally agree. I’d even go a little further and say the business is in trouble if the CEO doesn’t know “what we should do about this”. It’s the CEO’s job to know those things, and it’s the data team’s job to provide the tools to make those decisions easier, faster and better.

> Totally agree. I’d even go a little further and say the business is in trouble if the CEO doesn’t know “what we should do about this”. It’s the CEO’s job to know those things, and it’s the data team’s job to provide the tools to make those decisions easier, faster and better.

I agree, with one modification: it’s also the CEO’s job to empower folks to pitch what they think the CEO should know. I’ve worked in plenty of successful shops where the CEO’s answer to “what should we do” is “I’m not sure yet – what do you think?” – and that is a golden opportunity to show your chops if you’ve got the chance.

* Design matters a lot – if it looks bad, people won’t look at it.

* Layout for dashboards is almost completely formulaic. A panel for selected high-level stats (user growth % increase from last year, user % increase from last month, # new users added), a panel for breakdowns (user growth by marketing channel, user growth by registration cohort), a panel at the top for filters (“let’s filter the entire dashboard by just this marketing channel, or just this registration cohort”) identical to all breakdowns provided, and finally a row-level drill-down (“show me the users in this particular aggregation”). It took me a very long time to learn that this design is entirely cookie-cutter for good reason. Users always want the same things: stats, breakdowns, filters and drill-downs.

* Padding matters, fonts matter, color palette matters, the absence of typos matters, visual hierarchy matters (i.e. big bold numbers versus smaller grey numbers).

* Always define the key metrics first (based on fact tables). All dimensions and drill-downs in the dashboard will derive from these front-and-center stats.

* Reconcile to existing metrics before broadcasting widely – almost always, people have the same stats in extant technologies (e.g. Excel, Mixpanel, Salesforce) and will instantly find inconsistencies between your figures and the extant ones.

* The vast majority of users will be passive viewers. Very few users will be “power” EDA (exploratory data analysis) users. EDA views will look different from the view that passive viewers want – keep them separate

* Obviously, the more things done in code, which promotes modularity and composability, the fewer data integrity issues you will have

> * Layout for dashboards is almost completely formulaic. A panel for selected high-level stats (user growth % increase from last year, user % increase from last month, # new users added), a panel for breakdowns (user growth by marketing channel, user growth by registration cohort), a panel at the top for filters (“let’s filter the entire dashboard by just this marketing channel, or just this registration cohort”) identical to all breakdowns provided, and finally a row-level drill-down (“show me the users in this particular aggregation”). It took me a very long time to learn that this design is entirely cookie-cutter for good reason. Users always want the same things: stats, breakdowns, filters and drill-downs.

Is there any chance you could link an image of what a good version of this looks like?

They always seem easy at first. They’re never easy. Anyone can toss up a visualization, in fact you don’t even need to know how to code, just load up a CSV in Google Sheets and drag it into Google Data Studio.

The hard part is knowing what information to surface, and how to drive the user towards those insights in an intuitive way. You need a strong team that intersects product, data science and UX. Engineering is the least important aspect of it.

Biggest lesson: all metrics _must_ be defined in code, not manager-speak.

For instance, if a marketing head wants to plot CAC (cost of acquiring customers) over time, saying CAC is marketing spend divided by number of customers is manager-speak. Spends are budgeted higher early in the month and adjusted with actuals. Customers ask for refunds and cancel accounts. Some campaigns have volume incentives which are known later… and so on. The solution is to write well-commented SQL which laymen can audit and improve.
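The comment above prescribes well-commented SQL; purely to illustrate the same point in runnable form, here is a hedged pandas sketch in which each business wrinkle from the paragraph becomes an explicit, auditable line (all column names are assumptions).

    import pandas as pd

    # Hedged sketch: CAC defined in code instead of manager-speak.
    def cac(spend: pd.DataFrame, customers: pd.DataFrame, month: str) -> float:
        # Spends get restated: prefer actuals, fall back to budget.
        s = spend[spend["month"] == month]
        total_spend = s["actual"].fillna(s["budget"]).sum()
        # Refunds and cancellations should not count as "acquired".
        acquired = customers[(customers["acquired_month"] == month)
                             & (customers["status"] == "active")]
        return total_spend / len(acquired) if len(acquired) else float("nan")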

And make sure your project plan includes milestones for explicitly aligning all stakeholders on definitions otherwise you’ll have a big hot potato game the first time the outputs show something unfavorable.

I’ve seen so many of these projects over the years, and they are almost always used for success theater, promotions or just plain ego.

– What do you hope to learn from this tool?

– Is there a less expensive way to get this information?

– The data will move in one of three directions: up, down, or stay the same. Ahead of time, what will you do in each case? Asking me to change the direction of the line is not an acceptable answer. Do we still need to make the chart? Or were all three answers the same?

– This is not a one-and-done project. The moment some visibility emerges in the fog, you will be desperate for more answers. We must set up a process for the never ending litany of questions that will emerge from this work.

– Smaller is better, incremental, fast iteration and ability to change are all far more important in dashboard work than stable, long term, deeply reliable.

– This is the conversation I have even with myself as I work on data for my own company.

> The data will move in one of three directions: up, down, or stay the same. Ahead of time, what will you do in each case? Asking me to change the direction of the line is not an acceptable answer. Do we still need to make the chart? Or were all three answers the same?

It’s a feedback system. Feedback is only useful if it can trigger behavior change. How can this measurement change the company’s behavior?

Anything else is a vanity metric.

In my experience, if your plan is to make a “dashboard”, you’re already on the wrong path. It’s too generic and says nothing about what problems you are there to solve. Think about it yourself: in how many of the products that are important in your life is there any meaningful value produced by a dashboard?

Dashboards seem alluring because we imagine that users will sit there and somehow have insights delivered to them automatically. It’s often less clear what those insights will be or what is needed to produce them, we somehow hope they will materialize by just displaying some data. Often the focus is on making pretty-looking charts (which only ever look good when you demo with picturesque fake data), because you want the product to feel colorful, welcoming and visual.

A better approach is to either make a focused tool for solving a specific problem you know users have – you won’t think of what you end up with as a “dashboard” but it might occasionally end up looking a little like one – or to make general tools that allow users to dig through data interactively to find the things they care about.

The biggest help we got was meeting directly with our customers and asking them “what would it take for you to log in every day to view this dashboard?” They clearly provided metrics and trends they care about but have a hard time getting the data for. Also, don’t get fancy with the visuals. Lots of big metric KPI visuals, tabular visuals, line charts & bar charts. Users should be able to glance at the visuals and immediately know what’s going on and get a sense of what each visual is conveying.

Another thing customers love is the dynamic ability we give them to switch how certain visuals are grouped or what value is being displayed. We can’t foresee all the different ways users will want to slice and dice the data, so giving them that ability was huge.

I’m a software developer.

There is a chicken and the egg problem when it comes to designing these things.

I can ask “What do you want the dashboard to look like” and they’ll answer “I don’t know before I see the data”.

Then I’ll ask what data they want to see, and they’ll respond “What will it look like?”, or we’ll spend significant time on data collection only to find they never actually want it in a dashboard after all.

Far and away the most time-consuming aspect of this entire domain is finding out what users actually want to see, as they almost never have something specific enough when they approach me.

Biggest finding for us has been that no matter how many charts / filters / options / etc. we give to our users, they always want something more.

Answers don’t just lead to eureka moments; they lead to follow-up questions and more follow-up questions.

Not a complaint – it’s actually great. Just an observation (and a challenge)

Hosting low-latency realtime dashboards for everyone comes at a phenomenal cost. Tons of memory is required if you want them to open quickly for everyone. I wish they could be served more dynamically: if you saw a user logging in, you could populate the query before they got to the page, or something. As it was, it seemed like we had to serve a zillion dashboards no one was actively reading.
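That wished-for behaviour is essentially cache pre-warming keyed on the login event. A toy sketch, where run_dashboard_query is a hypothetical stand-in for the real, slow query:

    import threading

    _cache: dict[str, object] = {}

    def run_dashboard_query(user_id: str) -> object:
        ...  # hypothetical stand-in for the real, slow query

    def _warm(user_id: str) -> None:
        _cache[user_id] = run_dashboard_query(user_id)

    def on_user_login(user_id: str) -> None:
        # Kick off the expensive query in the background at login; by the time
        # the user reaches the dashboard page it is (hopefully) already cached.
        threading.Thread(target=_warm, args=(user_id,), daemon=True).start()

    def get_dashboard(user_id: str) -> object:
        # Cache hit means an instant load; a miss falls back to querying inline.
        return _cache.get(user_id) or run_dashboard_query(user_id)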

Worked at a place providing financial research data and models to investors. We spent a lot of effort creating infinitely flexible and customizable reporting and dashboards. Turns out no one used that. Everyone just wanted a general high level report emailed to them.

80% of the time people should display a table, 15% a time-series or line chart. The other 5% is probably wrong. Anyone that asks for pie charts, 3d charts,… isn’t a real data user 😉

I once added a speedometer for production rate compared to avg over previous X weeks, as a widget demo. It ended up on every exec’s dashboard and on a big screen.

I’m not sure what they were trying to manage, but it was purdy and looked dashboardy.

For external dashboards, not internal ones:

– You can output the most elegant metrics, but you will never know if they were the right ones until you talk to actual customers. Most of the time, they don’t even understand what is presented.

– Use libraries and UI kits made for this; it will save a huge amount of time.

– Whatever you do, it will never be enough, will be wrongly interpreted, and will be used in the wrong context.

– Try to tie graphs and metrics to use cases or questions. E.g. titling a chart “Active users” vs “How many users were active* in the last 30 days?” (*define active in the description) can make a huge difference in comprehension.

Users often catch what they see as conflicts, and you need to answer for this.

Often it’s something like a different interpretation of data in multiple places (revenue in one place, profit in another) or differing date logic (one query includes a date in the range, others go “up to” that date, etc.). Caching is another issue, especially if you selectively cache only slow queries.

To minimize this, always have an explanation on the chart/card (even if it’s hidden but clickable to show)

As someone who has made a ton of grafana dashboards over the years, be prepared for users to hold it wrong. Data visualisations should fail/degrade in clear and expected ways. Users are often surprised when dashboards/charts hit some limit (e.g. they write a non-performant query). The BigQuery-style design (async-first, fair queueing) is best if you’re letting users write their own queries on their own datasets.
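A toy asyncio sketch of that design (nothing here is any real system’s API): queries are queued per user and a scheduler round-robins across users, so one heavy user can’t starve everyone else.

    import asyncio
    from collections import defaultdict, deque

    # Hedged sketch: async-first query handling with per-user fair queueing.
    queues: defaultdict[str, deque] = defaultdict(deque)

    def submit(user: str, query_coro) -> None:
        queues[user].append(query_coro)  # accept immediately, run later

    async def scheduler() -> None:
        while True:
            for user in list(queues):             # round-robin across users...
                if queues[user]:
                    await queues[user].popleft()  # ...one query per user per turn
            await asyncio.sleep(0.05)             # yield; real systems use worker pools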

When I used to work with D3 I found object constancy to be quite an important principle. Transitions between state are often neglected (a full state refresh is easier).

Use colours and graphical elements (generated graphs), but:

Obey rules of spacing more carefully than other rules, to avoid overwhelming the user.

Do not use colours unless signalling information, so users can be alert and relaxed when needed.

As soon as you have more than 2 types of information, have expanding panels, which remember whether the user expanded/collapsed them.

Lastly, remember that speed of loading data matters much more for dashboards than for a random page. Cache data, or initially load only summary data, or only load the latest day by default and then fetch the week’s data. Remember clients may make purchasing decisions based on how fast the stats page of your SaaS loads when they are showcasing it to their C-suite, and a 15-second wait can cost you your enterprise sale.
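A toy sketch of that loading strategy, with the query helper as a hypothetical stand-in: serve a cached latest-day summary for the first paint and only fetch the week’s detail on demand.

    from functools import lru_cache

    def run_summary_query(day: str) -> dict:
        return {"day": day}  # hypothetical stand-in for a cheap pre-aggregate

    @lru_cache(maxsize=1024)
    def summary(day: str) -> dict:
        return run_summary_query(day)  # cached: repeat loads are instant

    def dashboard(days: list[str], with_history: bool = False) -> dict:
        data = {"latest": summary(days[-1])}   # fast first paint: latest day only
        if with_history:                       # week's detail fetched on demand
            data["history"] = [summary(d) for d in days]
        return data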


imo there are three core pillars you have to get right here:

1. Relevant: Don’t just build a dashboard for the sake of building a dashboard. First, understand what the goal of the user is, and what metrics they’ll want to look at to understand their progress towards that goal

2. Reliable: You only have one shot to get this one right. As soon as you present incorrect data to your users, you’ve lost their trust forever, so make sure you have solid tooling in place across your data stack that ensures data quality, from collection, through transformations to query time

3. Accessible: The data the user will be looking at needs to be either self explanatory, or the user has to have access to documentation that describes the data they’re looking at in detail.

For point 1, here’s a framework to help you identify which metrics to focus on: https://www.avo.app/blog/tracking-the-right-product-metrics

Some thoughts:

– a clean data pipeline is critical. Is your data pipeline manageable? Is it observable? Is it monitorable? Can you make changes quickly at different stages? How do overrides work? Does your data pipeline have entitlements? (Can private data points be provisioned to specific users?)

– Should you implement your own dashboard? Or are you reinventing the wheel? Can you reuse/recycle existing BI tools? What are the licenses involved? Power BI is proprietary to Microsoft and will have per-user economics. Grafana is AGPL; be very careful with anything AGPL in your tech stack because it may force you to open source your code. Apache Superset is pretty cool. I’ve seen big startup valuations with off-the-shelf BI tools. If it’s an MVP, definitely consider using one of these as opposed to rolling your own.

– Making assumptions for your users is bad because users will always ask for more. So building a flexible framework where users can add/remove visuals and build their own analytics may be necessary. The flipside is this adds complexity and can confuse the user. It’s a delicate balance to cater to all types of users: the basic user vs the power user.

– How do users send you feedback? Bad data point? How do you find out? Can the user address it themselves?

About a year ago my (new-ish founder) boss came to me and asked me to build him a custom dashboard. “I have all the data in a spreadsheet but I want it in a dashboard,” he said. I was a specialized systems dev, only occasionally doing a bit of webdev if necessary, and really didn’t have time for that kind of errand.

I showed him this tutorial I had recently seen – just a few minutes long – about how to build a “dashboard” in excel:
https://youtu.be/z26zbiGJnd4?si=HWn8qTbozD8vmXiF

“Oh wow, I did not know excel could look so beautiful!”
He asked for the link, never did anything with it of course, but was perfectly happy. I’m pretty sure he just wanted a shiny toy and also felt inadequate about “just using excel” to do his important founder work. Showing him that excel can look beautiful and is a powerful tool was enough. No more feeling inadequate, no need for an actual (or even excel) dashboard.

I spent 5 years leading a data team which produced reports for hundreds of users.

In our team’s experience, the most important factor in getting engagement from users is including the right context directly within the report – definitions, caveats, annotations, narrative. This pre-empts a lot of questions about the report, but more importantly builds trust in what the data is showing (vs having a user self-serve, nervous that they’re making a decision with bad data – ultimately they’ll reach out to an analyst to get them to do the analysis for them).

The second most important factor was loading speed – we noticed that after around 8 seconds of waiting, business users would disengage with a report, or lose trust in the system presenting the information (“I think it’s broken”). Most often this resulted in people not logging in to look at reports – they were busy with tons of other things, so once they expected reports to take a while to load, they stopped coming back.

The third big finding was giving people data where they already are, in a format they understand. A complicated filter interface would drive our users nuts and turned into many hours of training and technical support. For this reason, we always wanted a simple UI with great mobile support for reports – our users were on the go and could already do most other things on their phones.

We couldn’t achieve these things in BI tools, so for important decisions, we had to move the work to tools that could offer text support, instant report loading, and a familiar and accessible format: PowerPoint, PDF, and email. Of course this is a difficult workflow to automate and maintain, but for us it was crucial to get engagement on the work we were producing, and it worked.

This experience inspired my colleague and me to start an open source BI tool which could achieve these things with a more maintainable, version-controlled workflow. The tool is called Evidence (https://evidence.dev) if anybody is interested.

Lesson learned: start with fewer metrics and observe how they are used and interpreted. It is much easier to expand correctly from there. Collecting requirements in a single pass and building a monolith is rarely as productive as it seems – because the barrier to adding things and shifting responsibility to the dashboard is so low in the beginning, that it can easily become a dumping ground.

No matter what you do, someone will use your dashboard to post-hoc justify a pre-made decision. When it all goes wrong you’ll be blamed for making a bad dashboard.

As a developer who works on database management system monitoring tools, user-facing monitoring dashboards have been my bane for a while. I don’t know much about the situation in other companies and products, but here are the main pain points I’ve encountered:

1. Nobody knows what to monitor exactly, every new dashboard is based on a guess.

2. Not much user feedback to base decisions on if you don’t have many users to begin with.

3. Often, the metrics exposed by the app under monitoring prove grossly inadequate, or suitable metrics do not exist.

4. You can’t just add new metrics. Users have to update the whole distributed app for the new metric to become available. This has to be accounted for at the UI design stage.

5. Somebody has to spend a significant amount of time gathering all the information from random people in the company, because see 1.

The #1 mistake people make when first dabbling with dashboards is to show absolutely everything.

Don’t do that. Show only the things users need to act on what’s on the screen. Minimize the information, make it “glanceable”.

If you have a troubleshooting dashboard, and you’re showing 999 items with nothing going wrong, that one item that’s actually wrong is not going to pop.

One of the hardest challenges is ensuring alignment with the end user from ideation to delivery. It can be tough to figure out what the end user needs in the first place, let alone the details of how to define individual metrics or slice the data. This is a huge pain point for both externally and internally facing deliverables, but it’s especially tough for external clients because you’re likely a lot more limited in your ability to communicate ad-hoc to clarify things down the line. And once you’ve delivered something that’s either irrelevant or inaccurate, then it can end up being game over for the engagement (if you’re working externally) or your counterpart’s trust in your output (if you’re working internally).

So it’s super important to get on the same page RE: goals and expectations and keep that alignment going to the end – so that there aren’t any unpleasant surprises at the delivery stage. Some more on who to get involved and how here: https://www.avo.app/blog/who-should-be-involved-in-tracking-…

Similar to supervised and unsupervised learning, one can see dual paths on this journey. One path answers the questions that have been on users’ minds. The other explores unasked ones to find new insights.

No matter what the client says, ensure your prototypes load fast. I had a project turn sour because the C level test end users couldn’t be bothered to wait 20 seconds, despite us telling them it was normal.

This is completely an aside, but whenever I see “dashboard” I think of those colorful plastic toy dashboards that are given to children sitting in the back seat of the car, so they can pretend that they’re actually driving.

A common saying in statistical consulting is that the entire job is just asking “what question are you trying to answer?” over and over again.

Building dashboards that will actually be useful requires the same approach.

Balm for my heart.

I’m looking after a decision support system at the moment and am encountering all the challenges raised here. Glad to see my experience is not unique.

You will often have to polish the users’ half-baked metrics. Even large orgs with teams of business analysts will leave gaps not uncovered until partway through the build.

james-revisionai captured most of the main ideas.

A few things not emphasized well:

1. Make it accessible. At some point, virtually all of us will have some form of accessibility issues. 508 compliance is a solid standard for this, though can be a pain to manage without starting with it from the get-go.

2. Make it tabbable (similar to accessible).

3. For the development side, make it able to render client-side OR server-side – not every dashboard will have or need a rendering server. In Python, Altair is the only client-side rendering library that is also interactive that I’m aware of (see the sketch after this list). It’s important for payload considerations.

4. Related to 3 – consider payload size. Make it transparent, either in debug logs or similar, how large the elements passing across the wire are.
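On point 3, a minimal Altair example: the chart compiles to a Vega-Lite spec and renders client-side, so the saved HTML file is the whole payload (the data here is made up).

    import altair as alt
    import pandas as pd

    df = pd.DataFrame({"day": pd.date_range("2023-09-01", periods=14),
                       "signups": range(14)})  # made-up demo data
    chart = (alt.Chart(df)
             .mark_line(point=True)
             .encode(x="day:T", y="signups:Q", tooltip=["day:T", "signups:Q"])
             .interactive())        # pan/zoom with no rendering server
    chart.save("signups.html")      # inspect the file size for payload checks (point 4)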

Make as many metrics as you can configurable.
What I mean is that a chart’s data source should be configurable, in its form and in its colors. Also allow users to filter the data incoming to the charts; users love messing with the data before exporting it to their pointless and boring PowerPoint presentations.

Ah, I saw a great tweet that captured a lot of my feelings about this the other day: https://twitter.com/InsightsMachine/status/17018601232984842…

>“Data is the new oil.” Clive Humby, 2006

>“Most of my career to date seems to involve redesigning legacy experiences to make it easier for existing users (if any) to see that they contain absolutely no actionable insight with a lot less effort.” Jeff Weir, 2023

From my perspective:

Generally, I find most users can’t actually say whether they need any given number/visual on an ongoing basis. So huge amounts of work go into building dashboards which are used for a very short period of time and then discarded. We should probably do a better job on one-off analyses and only dashboard after the fact.

Many users don’t actually want a dashboard; what they really want is a live data dump into excel where they can pivot table it. Maybe, maybe a bar or line chart.

Generally, I find people always ask for more filters, more slicers, just endless options to reconfigure the data as they please. But they quickly become trapped in a swamp of their own making: now nobody knows how this should be sorted or sliced, does it even make sense to do it this way? People think what they want is a ‘data democracy’ with lots of dashboards with lots of options for lots of users, and so they ask for, and usually receive, it. But they typically just end up coming back to the data team and asking – ‘so what’s the answer?’ What many orgs need is actually a data dictator.

However, dashboards do let you establish really good feedback loops within the business, so if you can identify an ongoing constraint, work out how to track it, and then get people to receive it on a regular cadence and be accountable to it, you can make a lot of headway. But that’s a more niche use-case than how they’re frequently used, and the skills involved are different – less visualization skill, more business analysis – and you need to be positioned to make sure somebody is held accountable.

Tons of great advice in the comments. At risk of repeating others, here’s what I’ve learned working on business intelligence tools for an engineering group:

– What users ask for and what users really want are often extremely different.

– Engineering executives like to place their “thumbprint” on every business analytics dashboard. They want evidence that the “intelligence” being reported has been customized by them. It’s their way of imparting branding on the organization.

– UI/UX is far more important to users than how you handle the technical details. When discussing implementation with them, start with the UI so that they have a mental model to build from.

– Leave space to create cool things that you/your team want to make. The developers of BI dashboards often have excellent ideas for visualizing data that an end-user would not immediately think of. Leave room to “delight”.

– Never assume the data is clean or accurate (even when there are regulatory reasons for it to be either of those things)

– Not everyone’s opinion is equally valuable.

– Beware of corporate politics. I once had an analytics project completely shut down because it would expose certain weaknesses in the business that were not acceptable to discuss publicly.

Bonus: Read “Envisioning Information” by Edward Tufte.


