Spring’s Got Speed

I must admit that when the subject for this Iron Viz qualifier was announced I had mixed feelings. On the positive side, I know nature and animals: as a photographer and bird watcher I've spent a long time gathering data on them. I even run a website dedicated to sightings in my local area (sadly it's been neglected for years, but it still gets used occasionally). On the negative side, though, I knew I would be facing competition from a host of amazing-looking visualisations using striking imagery and engaging stories.

Did I want to compete against that? With precious little time for vizzing lately (especially at end of quarter – you can tell the Tableau Public crowd don't work in sales teams!) I only wanted to participate in Iron Viz if I could be competitive, and for those who don't know me, I like to be competitive…

So, as you perhaps guessed, my competitive edge won and I pulled some late hours and risked the wrath of my family to get something that did the competition justice.

A Note on Visualisations and Creating a Brand

I’ve noted above I expected a lot of pictures and text from people in this qualifier, after all Jonni Walker has created his own brand around animal visualisations, stock photography and black backgrounds. However I have my style of visualisation,  I’m not Jonni Walker, what he does is amazing but there’s only place for so many “Jonni Walker” vizzes. I couldn’t replicate what he does if I tried.

In the past I've deliberately combined analytics and design, treading the fine line between best practice and metaphor, staying away from photographs and external embellishments and preferring my visualisations to speak for themselves through their colours and data. The subject this time was tricky though…was it possible to produce an animal visualisation without pictures?

Finding a Subject

I could turn this section into a blog post of its own! I trawled the internet for data and subjects for days and days. Some of the potential subjects:

  • Homing Pigeons (did you know their sense of smell affects their navigation?)
  • Poo (size of animal to size of poo) – this was my boys’ favourite
  • Eggs (had I found this data I’d have been gazumped: http://vis.sciencemag.org/eggs/)
  • Zebra Migration
  • Sightings Data of Butterflies

I literally couldn't find enough data to do any of these justice, and at points I was verging on writing scientific papers. I was running out of ideas when I found this website: http://www.naturescalendar.org.uk/ – and I dropped them an email to see if I could use their data. Phenology (the study of nature's cycles) has always interested me, and getting my hands on the data would be fantastic. There was even a tantalising mention of "measuring the speed of spring" on their news site, with some numbers attached but no mention of the methodology…

Now, I’m impatient and so….using a few “dark art” techniques I didn’t wait for a reply and scraped a lot of the data out of their flash map using a combination of tools including Alteryx.

Thankfully a few days later they came back and (after a short discussion) said I'd be able to use the data, so all ended well.

Measuring the Speed of Spring

Now I had the data, working out how to measure my own "speed of spring" was difficult. Several options presented themselves, but all had their drawbacks. The data is crowd-sourced from the public – mainly people who can be trusted, but amateur outliers could still affect the result (do you really want to say spring has hit Scotland based on one sighting?). The sheer number of recorders in, say, the South East could also affect any analysis, as could the lack of them in Scotland. Given we expect to see spring move from south to north, that could seriously sway the results.

In the end I played between two methods:

  1. A moving average of the centroid of the sightings – tracking its rate of movement
  2. A more complex method involving drawing rings round each sighting and then tracking the overall spread of clusters of sightings across the UK.

In the end I opted for the latter method, as the former was too heavily weighted by the number of sightings in the south.
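For the curious, the first method can be sketched in a few lines of JavaScript. This is illustrative only – the real work was done in Alteryx, and the representation of sightings as {x, y} points in miles is my assumption here:

```javascript
// Illustrative sketch of method 1 (not the actual Alteryx workflow):
// track the centroid of all sightings so far and measure how far it
// moves between weeks. Sightings are assumed to be {x, y} points in miles.
function centroid(sightings) {
  const sum = sightings.reduce(
    (acc, s) => [acc[0] + s.x, acc[1] + s.y], [0, 0]);
  return [sum[0] / sightings.length, sum[1] / sightings.length];
}

// The "speed" is then the distance the centroid moves week to week.
function movement(prevCentroid, currCentroid) {
  return Math.hypot(
    currCentroid[0] - prevCentroid[0],
    currCentroid[1] - prevCentroid[1]);
}
```

The drawback falls straight out of the maths: a mass of southern sightings drags the centroid south regardless of what is happening further north.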

Very briefly I’ll outline my methodology built in Alteryx.

  1. Split the country into 10-mile grids and assign sightings to these based on location
  2. Taking each grid position, calculate the contribution to the surrounding grids within 50 miles using the formula 1 - (Distance/50), where Distance is the distance of the grid from the source grid
  3. Calculate the overall "heat" (i.e. local and surrounding "adjusted" sightings) in each grid cell
  4. Group cells together by tiling them into bands based on "heat"
  5. Draw polygons around each set of tiles
  6. Keep the polygon closest to the "average" grouping, i.e. ignoring outliers beyond the standard deviation

I then ran the above algorithm for each week (assigning all sightings so far that year to the week), and for each event and species.
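Steps 1–3 of the methodology can be sketched in JavaScript. This is a minimal sketch, not the actual Alteryx macro; the grid keys and the {x, y}-in-miles representation are illustrative assumptions:

```javascript
// Sketch of steps 1-3: grid the sightings, then spread each cell's count
// to neighbouring cells within 50 miles, weighted by 1 - (distance / 50).
// Not the actual Alteryx workflow; representation is illustrative.
const CELL = 10;   // 10-mile grid
const RADIUS = 50; // influence radius in miles

// Step 1: assign each sighting to a 10-mile grid cell.
function gridCounts(sightings) {
  const counts = new Map();
  for (const s of sightings) {
    const key = `${Math.floor(s.x / CELL)},${Math.floor(s.y / CELL)}`;
    counts.set(key, (counts.get(key) || 0) + 1);
  }
  return counts;
}

// Steps 2-3: each cell contributes count * (1 - distance/50) to every
// cell within 50 miles; a cell's "heat" is the sum of contributions.
function heat(counts) {
  const result = new Map();
  const reach = Math.floor(RADIUS / CELL);
  for (const [key, count] of counts) {
    const [gx, gy] = key.split(',').map(Number);
    for (let dx = -reach; dx <= reach; dx++) {
      for (let dy = -reach; dy <= reach; dy++) {
        const dist = Math.hypot(dx, dy) * CELL; // centre-to-centre miles
        if (dist >= RADIUS) continue;
        const k = `${gx + dx},${gy + dy}`;
        result.set(k, (result.get(k) || 0) + count * (1 - dist / RADIUS));
      }
    }
  }
  return result;
}
```

Steps 4–6 (banding the heat into tiles and drawing polygons) then work off the resulting heat map.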

These polygons are what you see on the first screen in the visualisation and show the spread of the sightings. I picked out the more interesting spreads for the visualisation from the many species and events in the data.

[Image: small multiple maps of the weekly polygons]

The above process was all coded in Alteryx.

[Image: the Alteryx workflow]

If you look closely there’s a blue dot which calls this batch process:

[Image: the Alteryx batch macro]

which in turn calls the HeatMap macro. Phew, thank god for Alteryx!

Now to calculate the speed – well, the rate of change of area, if you want to be pedantic! Simple Tableau lookups helped me here: I could export the area from Alteryx and then compare each week's area to the last's. The "overall speed" was then an average across all the weeks (taking artistic licence here, but given the overall likely accuracy of the result this approximation was okay in my book).
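The equivalent of those Tableau lookups, sketched in JavaScript (the area numbers are illustrative):

```javascript
// Sketch of the "speed" calculation: compare each week's polygon area
// with the previous week's, then average the weekly changes.
// Mirrors the Tableau lookup approach; numbers are illustrative.
function weeklyGrowth(areas) {
  // areas: array of polygon areas (e.g. square miles), one per week
  const growth = [];
  for (let week = 1; week < areas.length; week++) {
    growth.push(areas[week] - areas[week - 1]);
  }
  return growth;
}

function averageSpeed(areas) {
  const growth = weeklyGrowth(areas);
  return growth.reduce((sum, g) => sum + g, 0) / growth.length;
}
```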

Iterate, Feedback, Repeat

I won’t go into detail on all the ideas I had with this data for visualisation but the screenshots will show some of what I produced through countless evenings and nights.

 

“Good vizzes don’t happen by themselves they’re crowdsourced”

I couldn’t have produced this visualisation without the help of many. Special mentions go to:

Rob Radburn for endless Direct Messages, a train journey and lots of ideas and feedback.

Dave Kirk for his feedback in person with Elena Hristzova at the Alteryx event midweek and also for the endless DMs.

Lorna Eden put me on the right path when I was feeling lost on Friday night with a great idea about navigating from the front to the back page (I was going to layer each in sections).

Also everyone else mentioned in the credits in the viz for their messages on Twitter DM and via our internal company chat (I’m so lucky to have a great team of Tableau experts to call on as needed).

Difficulties

Getting things to disappear is a nightmare! Any Actions and Containers need to be in astrological alignment….

Concentrating on one story is hard – it took supreme will and effort to focus on just one aspect of this data.

Size – I needed so much space to tell the story; this viz kept expanding to fit its different elements. I hope the size fits most screens.

The Result

Click below to see the result:

[Image: the final visualisation – click to view]

 


On Conference Etiquette and Poor Talks

We’re starting Conference season in my small corner of the data world, with Tableau and Alteryx conference happening simultaneously in London and Vegas respectively. Sadly I’m missing out on my first Alteryx Inspire in a number of years – I hope my friends in Vegas have an amazing time.

As these conferences draw near we're always treated to an array of advice from seasoned attendees on how to get the most out of the experience, so I wanted to add my opinion to the growing pile of tips and tricks. In doing so I want to challenge what seems to be accepted wisdom among the many bloggers and tweeters I follow. The advice goes something like this:

“If you’re not enjoying a talk then walk out and find something else – your time at conference is valuable”

Personally I think this is the worst advice you could be given. Not only is it rude, it also makes a bad situation worse. So let me show you how to rescue those poor talks and turn them into a positive experience.

1. Choose your talks wisely

Use the conference apps and schedules well in advance, and take the time to research the speakers and topics. If you wish to learn something in particular, or already have some knowledge of the subject, then seek out opinions from peers – or the speakers themselves, if you can reach them – on whether your attendance is worthwhile.

Sometimes it’s worth attending a talk not for the content itself but in order to connect with the person afterwards, particularly if they share a common interest or specialism or the same industry.

Whatever the reasons for attending the talk make sure you are clear on them before you walk through the door to attend. Ask yourself (if there are multiple sessions you wish to see) if there are ways to get the same outcome without attending, e.g. arranging to meet the speaker for a 1-to-1 (most speakers are only too flattered to be asked for a coffee to chat through their subject in detail) or watching again online. Try to choose the talk you’d like to ask questions in if the sessions are recorded.

In summary, I'd perhaps go as far as to say there's no such thing as a bad session, only a poorly chosen one. You owe it to yourself, and to the speaker, to choose your sessions carefully.

2. Walking out won’t help

So, if you followed the above advice, you chose, quite deliberately, to come along to this talk. You know why you came and you know what you want to get out of it. You consider the speaker to have something interesting to say, otherwise you wouldn’t be here.

But now the talk isn’t going well. Perhaps the speaker has a voice that belongs on the shipping forecast more than a conference, or perhaps they’re having all sorts of technical problems – those perfect dashboards just won’t render on the conference screens – or maybe they’re nervous and can’t get their words out quite as they intended. Maybe they just didn’t have time to prepare. Maybe they’re reading out their slides to the audience! Whatever the reason walking out is likely to only make a bad situation worse.

Why? Firstly, you now have to run across a large conference venue and, if you're lucky, join your well-researched second-choice talk rather late. More probably you didn't have a second choice, so you run into the nearest room – or your second choice is full and you can't get in. You might even be forced to just grab a coffee and play pinball. Whatever happens, you won't have the clear outcomes you wanted from your primary choice, and so you're likely to find it less valuable (not least because you missed some of it).

More importantly though what happened to all those reasons for attending the first talk? Did they go away? Of course not, so you’re giving up on a massive opportunity to rescue your original mission.


3. Just be Polite

As a speaker I have to say there's nothing more off-putting than seeing people leave. At a conference, in a large venue, it is to be expected, but many speakers at our data conferences aren't professional speakers and they're presenting in relatively small rooms. They've given up their time to prepare a talk (which takes a lot of effort – more than 99% of attendees have ever put in). The least you can do, having decided to attend, is commit all 40–50 minutes of your time.

So make sure you've been to the bathroom, listen and engage with the speaker, avoid WhatsApp conversations moaning about the speaker to your friends in other sessions, and stay off Facebook for an hour – because you can use the opportunity to turn what could be a wasted 40–50 minutes back into a great learning opportunity.

If you do think you're prone to voting with your feet then please sit by the door and make minimal fuss as you leave. Also remember that doors can slam in conference halls – so close them gently behind you.

4. Rescuing the Situation

Yes, poor talks happen, as we’ve said, for a variety of reasons, but assuming you’ve decided to stick around then you can rescue the situation and still achieve your original objective for attending the talk.

How do you rescue the situation?

  • Be patient – speakers, particularly customer speakers, are often nervous and so they’ll take a while to loosen up.
  • Think of questions – focus on what the speaker isn't saying; that's often the more interesting stuff. How does it tie in with what you wanted to get out of the session? Write down a set of questions as the speaker goes through.
  • Ask questions at the end – new speakers will more often than not under-run, leaving plenty of time for questions. This is your chance to really get what you need to know. Tie questions back to what the speaker was saying to show you were listening and ask them to expand on areas of interest to you. Often getting a speaker ad-libbing about something they feel passionate about is where you’ll really start to learn something.
  • Approach the speaker at the end of the talk – as the room empties make sure to say Hi. You could even offer to buy them a coffee if you still haven’t got what you wanted from the talk. Remember you chose this person as an expert in a field you were interested in, one bad presentation doesn’t mean they don’t have something interesting to say.

Prepare well and remember your objectives

In conclusion, you owe yourself the duty of preparing well for the talks you want to attend; that preparation will help you focus on what you want to achieve and help you through any sessions that don't live up to your expectations.

Walking out and leaving poor survey feedback isn’t your only choice, in fact it is likely to be the worst choice you can make. Make the most of the experts the conferences lay on for you and enjoy yourself.

 

 

Data Visualisation: Lonely Hearts Club

My data visualisation life outside work is missing something. I’m lonely. The hours I spend hunched over the PC visualising data remain unfulfilling. When I’m not “vizzing” the rest of my time is spent on social media networks with other single vizzers. We all pretend we’re happy being single, but deep down I know many of us aren’t. I think it’s important to talk about the loneliness.

You see I’ve spent years now without an audience. At first it was fun, I had the freedom to do what I wanted when I wanted; I didn’t have to worry about pleasing the other half. I spent so many weekends on the equivalent of a boys night out, visualising random datasets, where I splurged out having fun and not really caring about the consequences. Usually I was in the company of lots of other singles and we had a blast. I even had a few meaningless relationships out of those nights, I hope they prepared me for what it’s like to be in a real relationship but I worry they taught me bad habits. After all those nights were all about impressing my mates, not my prospective partner, and so while the results were impressive I’m not sure either of us got any long term value out of the fling.


Having an audience, so we’re told, is the norm. Articles everywhere tell us how to keep our audience when we’ve found her, but there’s never any clue in them about how to find one in the first place. “Know your audience” everyone says, and every time I hear that a little piece of me dies because I know so many people who don’t have one.

A life in Data Visualisation without an audience is hard. I try my best, but I end up vomiting data points and facts onto a page in an attempt to make something meaningful. I make them engaging, I add pictures and I try to piece a story together, but if I'm honest it's nothing more than a bit of data porn. Something I know my fellow singles will find entertaining, briefly, but that will be quickly binned as they click on looking for something a little bit more hardcore.

Recently I’ve been attending a few singles nights with the aim of finding a long term partner / audience. Last weekend I was at #OpenDataCamp where I made an appeal for an audience, a user, someone, anyone who I could work with to help solve real issues with visualisation. Yes I know they’d give me problems and challenges but I want to do something meaningful; I think I’m ready for some commitment. Maybe I came across desperate because no one was interested. It was fun, I met plenty of people looking for the same thing as me from a slightly different angle, they had the data but also no audience…some even suggested if I found someone then they could join us in a threesome. I liked the sound of that but perhaps having three in the relationship will only complicate things more….


Ultimately I guess everyone wants to settle down like me, but many of my older friends have settled into the single life as permanent bachelors. Some of them I never hear from; it's really sad to see people disappear because they couldn't find an audience – I wonder where they go? Maybe they found one and never told me…. Others are happy telling people how to have productive relationships without having one themselves. Still others have taken themselves off the market, thrown themselves into work where they can have real relationships; again, we don't see them much anymore. Yes, some of the old timers still join us on boys' nights out, but if I'm honest it's a bit sad seeing them out with the young crowd. I don't want to be one of them, I want to have a meaningful relationship with someone I can commit to. Even if it's just short term I want it to be meaningful. I hope there's still time; I think I have a lot to offer if I meet the right partner.

If you know anyone who can be my audience let me know, I’d love to meet one and try and work together to create something special.

 

Why we’re going to #opendatacamp

On Saturday and Sunday fellow Tableau Zen Master Rob Radburn and I will be attending Open Data Camp in Cardiff.

So why are we spending a Saturday and Sunday in Cardiff away from our families and spending a small fortune on hotels?

 
Well, sometimes data visualisation can be frustrating. We're both prominent members of the Tableau community and we've spent countless hours producing visualisations for our own projects as well as community initiatives such as Makeover Monday and Iron Viz. There's lots of fun and reward in this work, both personally and professionally – so why is it frustrating? Well, shouldn't there be more to data visualisation than producing visualisations for consumption on Twitter? How do we produce something meaningful and useful (long term) through data and visualisations?

Open Data seems a suitable answer; however, with so many datasets, potential questions and applications it's hard to know where to start. The open data community has done a great job of securing access to many important datasets, but I've seen little in the way of useful visualisation or application of open data in the UK beyond a few key datasets. How do we do more?

Tableau Public, on the other hand, has done a fantastic job of ensuring free access to data visualisation for all, but few in the community have worked with the open data community to enable the delivery of open data through the platform.

Rob and I are hoping that our pitch at Open Data Camp will facilitate a discussion around bridging the gap between the Tableau community and the open data community. On one side we have a heap of engaged and talented data viz practitioners on Tableau Public looking for problems; on the other, a ton of data and people screaming for help understanding it…on the face of it there seem to be some exciting possibilities, we just need to pick through them.

Oh and while we’re there if anyone wants us to pitch a Tableau Introduction and / or Intro to Data Visualisation we’d be happy to facilitate a discussion around that too.

Would love your thoughts

Chris and Rob

Using Inspect / Javascript to scrape data from visualisations online

My last post talked about making over this visualisation from The Guardian:

[Image: the Guardian gay rights visualisation]

What I haven’t explained is how I found the data. That is what I intend to outline in this post. Learning these skills is very useful if you need to find data for re-visualising data visualisations / tables found online.

The first step in trying to download data for any visualisation online is to check how it is made. It may simply be a graphic (in which case extraction may be hard, unless it's a chart you can unplot using WebPlotDigitizer), but interactive visualisations are typically built with JavaScript, unless they use a bespoke product such as Tableau.

Assuming it is interactive, you can start to explore by right-clicking on the image and choosing Inspect (in Chrome; other browsers have similar developer tools).

[Screenshot: the Inspect option in Chrome's right-click menu]

I was treated with this view:

[Screenshot: the Elements panel, showing the view built from paths]

I don’t know much about coding but this looking like the view is being built by a series of paths. I wonder how it might be doing this? We can find out by digging deeper, let’s visit the Sources tab:

[Screenshot: the Sources tab]

Our job on this tab is to look for anything unusual outside the typical JavaScript libraries (you learn to spot these by being curious and looking at lots of sites). The first file, gay-rights-united-states, looks suspect, but as can be seen from the image above it is empty.

Scrolling down (see below) we find an embedded file/folder (flat.html), and in it something new: all.js and main.js…

[Screenshot: flat.html containing all.js and main.js]

Investigating all.js reveals nothing much, but main.js shows us something very interesting on line 8. JACKPOT! A Google Sheet containing the full dataset.

[Screenshot: main.js line 8, revealing the Google Sheet URL]

And we can start vizzing! (By the way, I transposed this for my visualisation to get a column per right.)

Advanced Interrogation using Javascript

Now, part way through my visualisation, I realised I needed to show the text items the Guardian had on their site, but these weren't included in the dataset.

[Screenshot: the Guardian's hover box text]

I decided to check the JavaScript code to see where this text was created and whether I could decipher it. Looking through main.js I found this snippet:

function populateHoverBox(type, position) {

  var overviewObj = {
    'state': stateData[position].state
  }
  // .....
  if (stateData[position]['marriage'] != '') {
    overviewObj.marriage = 'key-marriage'
    overviewObj.marriagetext = 'Allows same-sex marriage.'
  } else if (stateData[position]['union'] != '' && stateData[position]['marriageban'] != '') {
    overviewObj.marriage = 'key-marriage-ban'
    overviewObj.marriagetext = 'Allows civil unions; does not allow same-sex marriage.'
  } else if (stateData[position]['union'] != '') {
    overviewObj.marriage = 'key-union'
    overviewObj.marriagetext = 'Allows civil unions.'
  } else if (stateData[position]['dpartnership'] != '' && stateData[position]['marriageban'] != '') {
    overviewObj.marriage = 'key-marriage-ban'
    overviewObj.marriagetext = 'Allows domestic partnerships; does not allow same-sex marriage.'
  } else if (stateData[position]['dpartnership'] != '') {
    overviewObj.marriage = 'key-union'
    overviewObj.marriagetext = 'Allows domestic partnerships.'
  } else if (stateData[position]['marriageban'] != '') {
    overviewObj.marriage = 'key-ban'
    overviewObj.marriagetext = 'Same-sex marriage is illegal or banned.'
  } else {
    overviewObj.marriagetext = 'No action taken.'
    overviewObj.marriage = 'key-none'
  }

…and it continued for another 100-odd lines of code. This wasn't going to be as easy as I'd hoped. Any other options? Well, what if I could extract the contents of the overviewObj? Could I write it out to a file?

I tried a "watch" using the developer tools, but the variable went out of scope each time I hovered, so that wouldn't be useful. I'd therefore try saving flat.html locally and outputting a file with the contents to my local drive…

As I say, I'm no coder (though perhaps more comfortable than some), so I googled (and googled) and eventually stumbled on this post:

http://stackoverflow.com/questions/16376161/javascript-set-file-in-download

I therefore added the download function from that post to my local main.js and added a couple of lines to the populateHoverBox function…okay, so maybe I can code a tiny bit…

var str = JSON.stringify(overviewObj);
 
download(str, stateData[position].state + '.txt', 'text/plain');

In theory this should serialise the overviewObj to a string (according to Google!) and then download the resulting data to a file called <State>.txt.

Now for the test…..

[Animation: the files downloading one by one]

BOOM, BOOM and BOOM again!

Each file is a JSON file

[Screenshot: the JSON contents of one state's file]

Now to copy the files out of the downloads folder, remove any duplicates, and combine them using Alteryx.

[Screenshot: the Alteryx workflow]

As you can see, using a wildcard input on the resulting JSON files plus a transpose was simple.

[Screenshot: the transposed data]

Finally, to combine with the Google Sheet (called "Extract" below) and the hexmap data (Sheet 1) in Tableau…

[Screenshot: the Tableau data source combining Extract and Sheet 1]

Not the most straightforward data extract I've done, but I thought it was worth blogging about so others can see that extracting data from online visualisations is possible.

You can see the resulting visualisation in my previous post.

Conclusion

No one taught me this method, and I have never been taught how to code. The techniques described here are simply the result of continuous curiosity and exploration of how interactive tables and visualisations are built.

I have used similar techniques in other places to extract data from visualisations, but no two methods are the same, nor can a generic tutorial be written. Simply have curiosity and patience, and explore everything.

 

Combining Multiple Hexmaps using Segments

After my #Data16 talk, Chad Skelton challenged me to do a simple remake of the Guardian sunburst-type visualisation that I critiqued in my Sealed with a KISS talk (which you can now watch at this link).

The original visualisation is shown below:

[Image: the original Guardian visualisation]

While initially engaging, I find this view complex to read; extracting any useful information involves several round trips to the legend. The circular format makes the visualisation appealing while sacrificing simple comprehension. Could I do better, though?

Chad suggested small multiple maps and I agreed this might be the simplest approach, but I was not happy with the resulting maps:

[Screenshot: small multiple state maps]

 

Alaska and Hawaii, why do you ruin my maps? The Data Duo have several solutions, and my favourite is the tile map.

Thankfully, Zen Master Matt Chambers has made tile maps very easy in this post, so I followed the instructions, joining the Excel file he provided onto my data to give a much more visually appealing and informative result. The resulting visualisation is below (click for an interactive version):

[Image: the tile map visualisation]

However I still wasn’t satisfied with this visualisation, it has several problems:

  • it separates out the variables per state, meaning the viewer still has a lot of work to do to compare each state's full rights.
  • it still requires the use of the legend to fully understand
  • the hover action reveals extra info, meaning the user has to drag around to reveal the story
  • the legend is squashed due to space

How to solve these issues? I spent a while pondering and eventually found a possible answer: I could use a single map but split each hexagon into segments (ignoring marriage, as it is allowed in all states – another solution would have been to cut out a dot in the middle for a seventh segment).

To do this I'd need to split each hexagon into segments, so I took out my drawing package and created six shapes:

These six shapes have transparent backgrounds and, importantly, when combined create a single hexagon.
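I drew the shapes by hand, but for the curious the geometry behind them is straightforward – each segment is just the triangle between the hexagon's centre and two adjacent vertices. A sketch, assuming a pointy-top hexagon of radius r centred at the origin (the function names are mine, not from any tool):

```javascript
// Sketch of the geometry behind the six segment shapes (mine were drawn
// by hand in a drawing package): segment i is the triangle formed by
// the hexagon's centre and vertices i and i + 1.
// Assumes a pointy-top hexagon of radius r centred at (0, 0).
function hexVertex(i, r) {
  const angle = (Math.PI / 3) * i - Math.PI / 2; // vertex 0 on the vertical axis
  return [r * Math.cos(angle), r * Math.sin(angle)];
}

function segment(i, r) {
  return [[0, 0], hexVertex(i, r), hexVertex((i + 1) % 6, r)];
}
```

Because adjacent segments share an edge, the six triangles tile the hexagon exactly, which is why the six shapes combine into one.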

Now with these shapes I can use a dimension (such as Group below) on shape, and then use colour to combine the hexagons into different segment colours on the map (using Matt's method and data for hex positions).

[Screenshot: the Group dimension on shape]

Using this technique I therefore created the visualisation below (click for interactive version):

[Image: the segmented hexmap]

Using this method it would be possible to combine 3, 6, 9 or 12 (or possibly more) dimensions on a single map by segmenting the hexagons. Similarly using a circle in the middle would allow 4 or 7 dimensions.

I’m not sure how applicable this type of method is to other visualisations but please let me know if you use it as I’d love to see some more examples.

MM Week 44: Scottish Index of Multiple Deprivation

This week's Makeover Monday (week 44) focuses on the Scottish Index of Multiple Deprivation.

[Image: the original barcode chart]

Barcode charts like this can be useful for seeing small patterns in data, but the visualisation has some issues.

What works well

  • It shows all the data in a single view with no clicking / interaction
  • Density of lines shows where most areas lie e.g. Glasgow and North Lanarkshire can quickly be seen as having lots of areas among the most deprived
  • It is simple and eye catching

What doesn't work as well

  • No indication of population in each area
  • Areas tend to blur together
  • It may be overly simple for the audience

In my first attempt to solve these problems I addressed the second issue above using a jitter (via the random() function):

[Screenshot: the jittered barcode chart]
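The jitter itself is nothing more than a random offset on the shared axis – in Tableau, just random() on a shelf. Conceptually, in JavaScript (the field names are illustrative):

```javascript
// Conceptual sketch of the jitter (done in Tableau with random()):
// keep each area's rank and add a small random offset on the other
// axis so overlapping marks separate.
function jitter(ranks, spread) {
  return ranks.map(rank => ({
    rank,
    offset: (Math.random() - 0.5) * spread, // uniform in [-spread/2, spread/2)
  }));
}
```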

However it still didn't address the population issue, and given that the vast majority of points had similar populations with a few outliers (see below), I wondered whether to address it at all.

[Screenshot: the population distribution]

Then I realised I could perhaps go back to the original and simply expand on it with a box plot (adding a sort for clarity):

[Image: the barcode chart with box plot added]

Voila, a simple makeover that improves the original and adds meaning and understanding while staying true to the aims of the original. Time for dinner.

Done and dusted…wasn't I? If I had any sense I would have been, but I wanted to find out more about the population of each area. Were the more populated areas also the more deprived?

There have been multiple discussions on Twitter this week about people stepping beyond what Makeover Monday was intended to be. However, there was a story to tell here, and I dwelled on it over dinner, the recent debates about the aims of Makeover Monday (and data visualisation generally) swirling in my head as I wondered what to do.

I wondered about the rights and wrongs of continuing with a more complex visualisation. Should I finish here and show how simple Makeover Monday can be? Or should I satisfy my natural curiosity and investigate a chart that, while perhaps more complex, might show ways of presenting data that others hadn't considered…

I had the data bug and I wanted to tell a story, even if it meant diving a bit deeper, perhaps breaking the "rules" of Makeover Monday and spending longer on the visualisation. I caved in and went beyond a simple makeover…sorry Andy K.

Perhaps a scatter plot might work best, focusing on the median deprivation of a given area (most deprived at the top, by reversing the Rank axis):

[Screenshot: the scatter plot]

 

Meh, it’s simple but hides a lot of the detail. I added each Data Area and it got too messy as a scatter – but how about a Pareto type chart…

[Screenshot: the Pareto chart]

From the running sum of population (ordered by most deprived areas first) we can see that lots of people live in deprived areas of Glasgow, but the shape of the other lines is lost because so many people live in Glasgow.

So I added a secondary percent-of-total axis – not too complex…this is still within the Desktop II course for Tableau.
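The calculation behind the chart, sketched in JavaScript (in Tableau it's a running-sum table calculation with a percent-of-total; the zone field names here are illustrative):

```javascript
// Sketch of the Pareto calculation: order data zones most-deprived
// first (rank 1 = most deprived), take the running sum of population
// and express it as a percent of the total. Field names illustrative.
function paretoCurve(zones) {
  const ordered = [...zones].sort((a, b) => a.rank - b.rank);
  const total = ordered.reduce((sum, z) => sum + z.population, 0);
  let running = 0;
  return ordered.map(z => {
    running += z.population;
    return { rank: z.rank, pctOfTotal: (100 * running) / total };
  });
}
```

A line that climbs steeply at the left means a large share of an area's people live in its most deprived zones.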

[Screenshot: the Pareto chart as percent of total]

Now we were getting somewhere. I could see from the shape of each line whether an area has a high proportion of more or less deprived people. Time to add some annotation and explanation…as well as focus on the 15% most deprived, as in the original.

Click on the image below to go to the interactive version. This took me around three hours to build, following some experimenting with commenting and drop lines that took me down blind (but fun) alleys before I wound back to this.

[Image: the final visualisation]

Conclusion

Makeover Monday is good fun. I happened to have a bit more time tonight and I got the data bug. I could have produced the slightly improved visualisation and stuck with it, but that's not how storytelling goes. We see different angles and viewpoints, and constraining myself to too narrow a viewpoint felt like ignoring an itch that just needed scratching.

I’m glad I scratched it. I’m happy with my visualisation but I offer the following critique:

What works well:

  • it’s more engaging than the original, while it is more complex I hope the annotations offer enough detail to help draw the viewer in and get them exploring.
  • the purple labels show the user the legend at the same time as describing the data.
  • there is a story for the user to explore as they click, pop-up text adds extra details.
  • it adds context about population within areas.

What doesn’t work well:

  • the user is required to explore with clicks rather than simply scanning the image – a small concession given the improvement in engagement I hope I have made.
  • the visualisation takes some understanding; percent of total cumulative population is a hard concept that many of the public simply won't understand. The audience for this visualisation is therefore slightly more academic than the original's. Would I say it's suitable for publishing on the original site? On balance, I probably would. The original website is text- and table-heavy and clearly intended for researchers rather than the public, so the audience can be expected to be willing to take longer to understand the detail.

Comment and critique are welcomed and encouraged.