Gartner BI & Analytics Conference – Modern Architecture

In an excellent session this afternoon, we were taken on the journey towards a best-practice implementation of BI/Analytics architecture. For the last two years Gartner has recommended that the three key areas are represented by the Information Portal, Analytics Workbench and Data Science Laboratory, highlighted in the rather shaky picture below:

3 Tiers BI Arc

Each tier offers different benefits to the business, which can be summarised by the key roles and processes that a modern architecture needs to accommodate. These are summarised in the images below:



Finally, by following this you could end up with the holy grail – if this is your business’s holy grail:


It was interesting in this session how they talked around vendors and again reiterated that no single tool or vendor can deliver this whole picture. However, they also pointed out that the magic quadrants should not be looked at in isolation, as niche and smaller vendors in the lower left, or not even quite on the board, may well suit your business’s needs really well. Understanding your business needs, values and potential outcomes AGAIN seems the ultimate place to “bet your house” when it comes to delivering a successful BI/Analytics program. I would also quote Neil Chandler for businesses just venturing out into BI: this will be a core competency of your business going forward and will require “indefinite investment”. Don’t let this scare the financiers, but do make sure people realise that the delivery of the information portal, or of a specific tool to do data discovery, is not the end of the BI/Analytics journey. I much prefer the comments of Bibby’s Lead Architect, Richard Smith, which have our internal program focussed on creating “an enduring BI capability”; to be enduring you must react to change, both within the business and within the marketplace.

Gartner BI & Analytics Summit London – Day 1

What a great day – very exciting, and lots learnt from industry experts, with very little bias towards a vendor or specific technology. This post is the first of two in which I attempt to mind-dump some of the key takeaways I have picked up and found useful here at the conference.

BICC – Business Intelligence Competency Centre

Is dead… long live the ACE – Analytics Community of Excellence. In the keynote Neil Chandler suggested four things wrong with the BICC: business, intelligence, competency and centre! Although it is still the driver behind successful, versus non-successful, BI programs, it does not encompass enough of the modern world of BI and Analytics. A key benefit of the approach is the attempt to drive BI programs from business outcomes, but mostly it fails and still becomes an efficiency drive, not linked to delivering actual business value. It also does not encompass a new wave of change in the business – self-service – which is near impossible to centrally manage. Finally, it has not been driven around the new future of analytical applications, which are focussed around algorithms and a scientific approach to the running of businesses, and our lives!

Whilst there is a lot in words, the evolution from BICC to ACE is not just about words: it is about evolving and improving a good concept and bringing it up to date and in line with business needs. Analytics now seems to embrace BI and gives us a larger maturity scale for businesses in our ever-changing world. Community takes away the need to centrally control and helps with the, already in-place, self-service. Finally, excellence is about striving towards something we perceive as the ultimate goal, not just a list of competencies based around technology.

One thing that hasn’t changed but cannot be ignored is that you MUST FOCUS around BUSINESS OUTCOMES to achieve excellence in your BI or Analytics program.

Algorithms are KEY

An important theme of the keynote centred around algorithms and their use in BI and Analytics, as well as in day-to-day life. Guessing which classical piece of music was generated by a computer and which by Bach highlighted how far compute power and science have progressed, and there is a real feeling from Gartner that through the use of algorithms, and the tools that support them, we can automate, improve and gain valuable insight into our businesses. Algorithms are used all over your business today; take some time to document them and look to find tooling to support their automation and improvement. Citizen data science communities may spring up around this.

Other key notes:

  • IoT algorithms are set to generate $15 billion by 2018
  • Over half of organisations will be leveraging algorithms by 2018
  • By 2020 50% of Analytics leaders will be able to link their programs to real business value.
  • The best analytics leaders can formulate new questions as well as answer existing business ones. They also fail, learn and push the envelope; I would add they fail fast – the time for multi-year BI programs is gone.
  • Data Management and Data Integration tools are converging, but not as fast as you may imagine.
  • BI and Analytics tools are also converging but it was noted NO one vendor can support all BI and Analytics needs and the buyers of each are currently different.
  • Quadrant analysts highlight IBM’s Watson Analytics as a good example of an analytical application.
  • Microsoft’s Power BI v1 (that horrible O365, SharePoint online linked tool) failed, but v2 has gained traction and is having a negative impact on Tableau’s performance.

Still a lot of learning to go, even on day 1 but I wanted to share this for those not fortunate enough to attend this amazing event!


Choosing the Right BI Partner

Having worked as a Senior Consultant for Thorogood Associates and a Principal Consultant at Coeo, I feel well qualified to write this guide to help you when selecting a new BI partner. I worked as a consultant for about six years in total and have worked in and around BI for over 15 years. During my consultancy time I worked with over 100 companies, including around 20 green-field projects. The majority of these companies selected to work with us because of a vendor recommendation or a personal connection with existing staff, and most engaged before ever meeting the consultants that would be working with them, or investigating the suitability of the partner. This is not to say that the engagements didn’t work – they often did – but I have often questioned customers’ selection criteria and wondered if there were not cheaper or better ways for them to approach their BI needs. For example, would you buy a new car without a test drive first? Would you employ a BI developer or architect without an interview? If the answer is no to both, then this blog may be of some use to you. I now sit on the “other side of the fence” and I think there are many things worth considering before signing up with a BI partner – and a test drive might well be one of those!

Top 5 things to consider when selecting your BI partner:

1. Are they a certified partner to your chosen, or preferred technology vendor(s)?

There is little point talking with non-certified partners, but understanding the partner process your potential vendor uses is also important. For example, anyone and everyone can become a Microsoft partner, easily, but becoming a silver or gold BI partner requires a number of important steps to be completed by the partner. Question the website graphics and check the vendor’s partner lists for up-to-date information and the steps required to achieve the advertised status.

If you haven’t decided on a technology or vendor, then make sure your potential BI partner is certified with multiple vendors, or with none at all but with experience of vendor selection processes. Alternatively, and ideally, select two potential partners per potential vendor and make this part of your vendor selection process.

Beware vendor recommendations, as the engagement with vendors at this point is sales-led. Are they the best people to recommend a partner to you?

2. What are their staff credentials, experience (ideally in your industry), and certifications?

These should be current – an MCDBA in SQL 2005 is not relevant. They should also be relevant to your requirements: Prince 2 certifications matter little when you will provide your own project management. Utilise LinkedIn, as it is a fantastic way to validate some of the key people at your potential partner. Very early on in the process ask about the BI team and do some research; later in the process, meet these people! Talking to industry experts in BI, often in open groups on Twitter or LinkedIn, can help you get recommendations or confirm your thoughts on potential partners. Look to relevant industry events for sponsorship and for lead technical experts sharing their knowledge with everyone. Finally, asking to speak to similar companies they have worked with, and/or researching their case studies, is also a valid and useful approach.

Ultimately, treat this review as you would an interview for BI specialists. Most companies I have worked with – around 90% – ultimately want to create their own internal BI competency. To that end, your choice of partner is about extending, or even starting, your own BI team. Finding a partner that has experience of doing this, and has approaches for aiding it, is also important. Meeting with, interviewing and validating these people is just common sense.

3. Does their engagement model fit to your requirements?

In my opinion it should be flexible, not cookie-cutter BS that you could probably find on SlideShare. Critically, you should know your requirements and, if knowledge transfer is important, understand how the partner manages this or builds it into their process. Does your business have ever-changing requirements and priorities? Can this partner support agile – and I don’t mean that they do stand-ups and split work into sprints; true agile is not just about this. Validating their agile certifications is also important, if agile is important to your business or department.

Do they force project management upon you? Do they charge for this, or is it wrapped up as part of the price, day rate, etc.? Not that PM is bad – I think it is hugely important and useful on projects, and all partners should be quizzed about how they deliver the capabilities a normal PM function would bring. BUT this should be in line with your expectations and preferred way of working.

4. Do they have recent, reference-able and relevant examples of working on these kind of projects before?

This should include any key areas important to your business, e.g. master data or data quality, and perhaps similar experience of helping transition you to your own BI team in the fullness of time. Can you talk with their previous customers? If yes, then I like to talk to the business analyst or BI team lead, rather than a PM. Talking to a key business user is also interesting if your choice is linked to a specific vendor.

5. How do they manage risk?

This is a broad area and difficult to gauge, but there are some key areas to think about during your selection process. Key-man dependency – how does the partner make sure that, if their star consultant leaves, your project and deadlines are not affected? Changing requirements – what are the processes, and how do they manage the reality of changing requirements? No BI project EVER fits a static, waterfall approach. If your partners talk about being able to manage this but at extra cost, close the conversation immediately. These are best-practice areas that all partners should include as part of their general way of working.

Apart from these areas to consider, what’s absolutely key is understanding what you want to deliver for your business and what the potential return or value to that business is. If either of these things is not clear, then it is not time to engage with a third party – unless, perhaps, they have good experience with these stages of work and can provide management consultancy to help build that initial business case; in my experience not many can do this from scratch, and you probably need a less technical partner than the typical BI partners. I don’t mean you have to have a completed business case or a full technical specification, but you need some idea, and definitely buy-in from your business stakeholders.


Up front it is more than OK to ask for rate cards. If the partner doesn’t work that way, then ask for example costs of previous projects with similar requirements and deadlines, or ask for their average rate in their last financial year or last two BI projects – trust me, they ALL know this information. It is critical to make sure, for both you and the partner, that you are in the right ballpark right from the off. At an early stage it is completely OK for these costs to be indicative, and you shouldn’t start budgeting around these numbers; all good proposal stages require a more accurate assessment of need to identify more accurate costs. At an early stage it is also important to think about the cost approach: time and materials, or a fixed price. In my experience an indicative/umbrella budget with a T&M proposal gives you the best price and is the most flexible way to work with a partner. Fixed prices can work, but due to the risk and nature of these engagements expect to pay for a hefty upfront design stage, with the partner (rightly) adding a percentage on top of normal costs to mitigate potential slippage/risk.

Top 3 Actions to complete whilst selecting the right partner:

1. Meet the team – and not just you, include key members of your internal tech stakeholders and potential internal delivery team members. DON’T just meet sales, or technical pre-sales.

2. See working examples of their work, relevant to yours – ideally they may want to run a mini POC with your data (perhaps with vendor supported funding)

3. Constantly review your requirements, for the project and business, against what you are learning about the partner.  Have your internal stakeholders be part of these reviews.


This process should take as long as it needs, but match the length of time to the size of the project and the complexity of your requirements. I have found that even for a small piece of work this process can take as long as for a multi-month green-field project. Critically, the time this process takes is up to you and should NEVER be dictated by your potential partners. Certainly question the lead time required by the potential partner to be available to start initiation or development kick-off, BUT don’t be driven by this. Your project can wait a month for the right partner – it has probably waited a lot longer prior to this!

In Summary…

This process is a difficult one, filled with the usual mix of technical review and emotional, empathetic feelings. The more people you get involved in this process, at relevant times, the better. If you are forming a BICC, have your most important business stakeholders be part of the process. If you, or someone in your IT team, knows a good BI person (validate this), then have them talk about their experiences and recommendations for BI partners. Finally, I have seen that matching your company size, or project size, with the partner’s size works well: IMG Group (now Hitachi) always seemed to work well with larger companies, more mature in BI, than say Coeo, who worked much better with less BI-mature companies, often of a smaller size.

I do have a spreadsheet that I used recently for vendor selection which I am more than happy to share – please comment with your details and I will email it over. I am also happy to help, where I can, with my experience. Good BI people are a close-knit bunch in the UK and I am more than happy to share my experience and recommendations, for what it is worth to you!🙂

And as always thank you for reading, I hope this is of some help!

Microsoft BI – 2015 Highlights

It’s been a great year for BI! Power BI coming of age, exciting SQL Server 2016 CTP releases and a maturity in the cloud for analytics, data science and big data.

For me, Power BI is the biggest news of 2015. POCs run in H1 of 2015 found it wanting: basic functionality was missing, and the confusion of wrapping it in Office 365 made it too much for businesses to consider. However, with the GA release and the numerous updates since, it has finally delivered on its vision and given Microsoft an end-to-end, enterprise solution for the first time in its history – including multidimensional connectivity!

Microsoft also made some great tactical manoeuvres, including the purchases of Datazen and Revolution R, as well as their excellent Data Culture series. Datazen is a good tool in its own right, with great dashboard-creation capability and impressive mobile delivery on all devices/platforms. It will integrate nicely with SSRS to deliver a modern reporting experience via mobile in SQL 2016. R is the buzz of 2015 – a great statistical analysis tool that will really enhance SQL Server as the platform of choice for analytics as well as RDBMS. In fact, you can already leverage its capability in Power BI today!

Cloud. So Microsoft finally realised that trying to drag businesses into the cloud was not the correct strategy. A hybrid approach is what is required: give businesses the best of both worlds, allowing them to benefit from their existing investments but “burst” into the cloud either for scale or for new, as yet untested, capability. SQL 2014’s ability to store some data files in Azure – perhaps old data kept purely for compliance – is a great example of this. ExpressRoute’s ability to offer a fast way to connect on-premises with cloud is brilliant. Or go and experiment with Machine Learning, made simple by Microsoft’s Azure offering.
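As a rough illustration of that hybrid pattern, SQL Server 2014 can host a database’s files directly in Azure blob storage while the instance runs on-premises. This is only a sketch – the storage account, container, SAS token and database names below are all placeholders, not a real configuration:

```sql
-- Hedged sketch: storage account, container and SAS token are placeholders.
-- SQL Server 2014 "data files in Azure" - keep database files in blob
-- storage, e.g. for old data kept purely for compliance.

-- The credential name must match the container URL.
CREATE CREDENTIAL [https://myaccount.blob.core.windows.net/data]
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
    SECRET = '<SAS token for the container>';

CREATE DATABASE ComplianceArchive
ON (NAME = ComplianceArchive_data,
    FILENAME = 'https://myaccount.blob.core.windows.net/data/ComplianceArchive.mdf')
LOG ON (NAME = ComplianceArchive_log,
    FILENAME = 'https://myaccount.blob.core.windows.net/data/ComplianceArchive.ldf');
```

The instance and compute stay on-premises; only the storage “bursts” into the cloud.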

I was also stoked to see the PDW hit the cloud with Azure SQL Data Warehouse. An MPP platform is the closest my customers have needed to get to big data, but the initial outlay of circa half a million quid was a bit steep. With the cloud offering, companies get all the benefits with a minimal investment and an infinite ability to scale. But do consider the speed of making data available, as it could be limited by Internet connections.

So in summary an awesome year for Microsoft BI with the future looking great! I still feel Microsoft lack SSAS in the cloud but perhaps Power BI will gain that scale in 2016. Overall I envisage seeing Microsoft as a strong leader in the next Gartner quadrant release for BI and I can’t wait for SQL 2016’s full release!

The future (2016 at least) is bright, the future is hybrid cloud…


MS BI Current World

Minimum Viable Nap Time – Agile Parenting

Anyone involved in Scrum will have heard of the minimum viable product (MVP). It is basically the most important features of your deliverable – the key features required to make your output a success!

Since becoming a Scrum Master I have often found great ways of applying Scrum, or Agile, to real life, not just work. For example, instead of planning our holiday in a waterfall fashion, we break the whole process into “sprints” and accept that things, like our requirements or budget, will change throughout the year.

However, I hadn’t thought about it in terms of parenting until now. But yesterday, as we were hightailing across to my sister’s house for my nephew’s first birthday party, I realised that my little one, 15 months old, runs in sprint cycles, requires sprint planning and definitely has an MVP.

Let’s take the sprints – and now she is racing around, they truly are sprints! We plan the sprints as best we can. The morning sprint is agreed the night before and can last up to 8 hours. Our planning – “our” refers to the delivery team, me and the wife – lasts just 15 minutes. We agree that if little one wakes earlier than 7 we will let her play in her cot until 7. We plan breakfast: what she will have and when she will have it. Part of our planning is a sprint review of the previous morning’s sprint – what did she enjoy eating, did she play in her cot, etc. On a daily basis we know we need to meet her MVP: food, nappy change, learning, naps and love. Naps are very important, and in our house we believe sleep begets sleep!

As we headed out yesterday, our sprint goal for the morning’s sprint had changed: we still had to meet little one’s MVP but also get across to Milton Keynes for 11am for the party. To do this, the first nap of the day had to be in the car! This means playing George Ezra’s Budapest on repeat until she dozes off. Then it is about no talking, and me trying to avoid potholes and take roundabouts at 10 miles an hour! This is maintained until the MVNT is met: 30 minutes. Any disturbances – mum sneezing, daddy overtaking – are met with more Budapest replays; I really hate this song!

For all parents naps are key, and we all know our little one’s MVNT – Minimum Viable Nap Time. If you already know Scrum, I definitely suggest implementing a bit of it into your parenting; it definitely works. But my advice is not to do the retrospective when the wife has done three straight nights of no sleep whilst you slept like a baby in the Crowne Plaza in Leeds! If you don’t know Scrum but do want to get more Agile, then I heartily recommend it for work and home! The best training I have had is from a company called Agil8 and a chap called David Hicks.

Datazen and Windows 7

Are you using Windows 7? Stuck on it for the foreseeable? IT will NOT let you be part of the test group for Windows 8.1 or 10? Then consider carefully any decision to utilise Microsoft Datazen.

Datazen is a great, simple, dashboarding and visualisation tool that is available as part of your SQL Server Enterprise, Software Assurance, agreement. It is a relatively simple tool which offers brilliant mobile delivery via iOS, Android and Windows. Datazen has connectors for lots of sources including Analysis Services. Client access is FREE and there is no cloud involvement, unless you host the Datazen server in Azure, but even then you could configure it so no data is persisted in the cloud!

My first customer who is using Datazen and Windows 7 called me in last week to help troubleshoot some potential show stopping issues they are having with the tool. The issues are to do with the Windows 7 Publisher application and creating, publishing and editing dashboards with Windows Authentication to a standard SQL Server.

The Windows 7 application is in preview and available to download from Microsoft. The “preview” tag is an interesting one, as you may think that this would have been done ages ago…

However, after digging around the history of Datazen, it is clear that this release was an afterthought following the product’s acquisition by Microsoft in April 2015. The original product was only released in 2013 and was the baby of the team that previously gave us ComponentArt. According to their background, over 40k people have been using the tool since its release, and both Gartner and Forrester have mentioned the product. However, I was not alone in the BI community in never having heard of it.

Being released in 2013 means they definitely didn’t think about designing a front end to work with Windows 7. In fact, by Datazen’s published release date, Windows 8 had been in general release for over a year, so there is no way this product was built to work with Windows 7.

But back to the issue. My customer was really excited about the ability to have a BI tool that would enable them to create rich visualisations that could be accessed by up to 1,000 users for free – if you ignore the cost of a SQL Server Enterprise licence and the appropriate SA! They sensibly set up a test server (two actually, but the server design can be discussed another time) and got the Windows 7 client installed on the MI team’s laptops. Their data source is a very simple SQL Server data set storing hundreds of rows of summarised data.

The MI team got about the relatively easy business of creating dashboards. However, the next day, when they tried to edit the dashboards (from the server, not the local copies), or when other users tried simply to view them using the Windows 7 application, the dashboards wouldn’t open. Even local dashboards couldn’t be re-published to the server.

This led to some serious concerns about whether they had made the right decision in using this tool. It seems the Datazen Windows 7 application loses the connection to the Datazen server sporadically. This can be spotted by the failure to publish, or by a small icon (apologies, no screenshots, as I don’t have a Windows 7 VM to recreate this locally) under the connection name that shows the BI hub on that Datazen server. By removing and then re-adding the connection, the MI team are able to publish again.

We also found that this wasn’t a network or security issue, because if the same users browsed (using Chrome or IE) to the Datazen server, they were able to view the dashboards that simply wouldn’t open in the Windows 7 application.

So we have a temporary workaround: keep re-creating the connection in the Windows 7 application and use the browser to actually view dashboards. Luckily there are NO end-user issues, as end users will NOT be using the Windows 7 application to view dashboards – they will be using the iOS app or going direct through browsers.

Finally, we did manage to test some scenarios using a spare Microsoft Surface that was running Windows 8.1. There were no issues!

In summary, you should be wary of using this product with Windows 7. Be mindful of the fact that the product wasn’t built for Windows 7, and for the best experience you do need to be on a later version of Windows. This shouldn’t detract from Datazen being a fantastic option as a BI tool: it is free and doesn’t touch the cloud, something that a lot of my customers are very excited about!

Just to note, we are raising these issues with Microsoft, but at the moment it is not clear if there are plans to do a full, non-preview release of the Windows 7 application; given that Windows 7 is still the most-used OS, I hope so! I will keep you updated.

Building a Star Schema in Power BI

So it has been a while since the GA release of Power BI and the Power BI Desktop edition (for those that remember the ProClarity and Panorama Desktop tools, this choice of name still makes me giggle!) and I thought it about time I put the modelling capabilities of Power BI Desktop to the test. I am a strong believer that the data model is the MOST important part of any BI delivery. Without it we will get different answers to the same question and not have a consistent experience for end users or analysts.

Power BI purports to be a one-stop shop for data blending, modelling and presentation. I have no doubt about its presentation capabilities: the dashboards look and feel great, and with the GA release the ability to do basic things such as add images and customise colour schemes makes it a good competitor at the front end of your BI stack. I am also confident that, with Power Query’s connectors and transformation capabilities integrated into the tool, basic ETL and data blending is going to be no problem. In fact, I would say it is the best way available – certainly the most cost-effective way – to bring together cloud and on-premises data in a single place. However, modelling just feels wrong in a front-end tool! To put it to the test, I had some basic Coeo sales data that I wanted to play around with and try to model in a way that would let me deliver a monthly sales dashboard.

I found many useful things that really helped with modelling, including adding a column as a new named query, de-duping the list and then converting it to a table. Features like Group By, or replacing values (e.g. nulls) with “No Service” or “Unknown”, allow us to achieve a basic star schema! However, when I needed to generate a fact table at a different grain, I am not ashamed to say I had to go back into SQL Server and, using tables and T-SQL, transform, match and load the data into my new schema. This gave me the quickest and easiest way to then build my sales dashboard!
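For anyone curious what that fallback looked like, here is a minimal T-SQL sketch of re-graining a flat extract into a monthly fact table. The table and column names are hypothetical, not the actual Coeo schema:

```sql
-- Hedged sketch with hypothetical table/column names.
-- Aggregate the flat CRM extract to a monthly-grain fact table,
-- matching each row back to the company dimension.
SELECT
    c.CompanyKey,
    DATEFROMPARTS(YEAR(e.InvoiceDate), MONTH(e.InvoiceDate), 1) AS MonthStart,
    SUM(e.InvoiceAmount) AS SalesAmount,
    COUNT(*)             AS InvoiceCount
INTO dbo.FactMonthlySales
FROM dbo.CrmExtract AS e
JOIN dbo.DimCompany AS c
    ON c.Company = e.CompanyName   -- match back to the dimension member
GROUP BY
    c.CompanyKey,
    DATEFROMPARTS(YEAR(e.InvoiceDate), MONTH(e.InvoiceDate), 1);
```

The resulting table then imports into Power BI Desktop like any other source.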

I am going to blog in more detail about some of the fun features I found along the way around modelling, DAX measures and date dimension funnies, and also about how we are going to build the architecture to support the Coeo Data Warehouse in the cloud utilising Azure Data Factory, SQL DWH, Power BI and Datazen! So watch this space for more! In the meantime, below are the steps I used to create my basic star schema!

1. Starting in Power BI Desktop with a flat extract of data from our CRM system:



2. Now create a dimension, e.g. Company:


3. Remove Duplicates, as it is a dimension table and we only want unique values:


4. Now convert it to a table, we can do more with it then!


5. Such as rename the column to something meaningful!


6. Now simply close and load the queries to the model:

close and load

And we have a mini star schema! Simply repeat for all the dimensions you want to generate. It also works nicely for date dimensions – just remember to set the correct data types; you can even add columns to form a mini date dimension!
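For comparison, the dimension-building steps above (2–5) can also be sketched in T-SQL, should you prefer to materialise the dimension on the server. Again, the table and column names here are hypothetical:

```sql
-- Hedged sketch with hypothetical names: the equivalent of steps 2-5,
-- a de-duplicated company dimension with a surrogate key added.
SELECT
    ROW_NUMBER() OVER (ORDER BY CompanyName) AS CompanyKey,
    CompanyName AS Company
INTO dbo.DimCompany
FROM (
    SELECT DISTINCT CompanyName
    FROM dbo.CrmExtract
    WHERE CompanyName IS NOT NULL   -- or replace nulls with 'Unknown'
) AS uniq;
```

Either route gets you the same unique list of dimension members; the choice really comes down to where you want the modelling logic to live.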