Can Microsoft democratise AI? The new code-free Power BI integrations with Azure Cognitive Services and Azure Machine Learning are a big step along that road.

Modern businesses run on information, but we’re drowning in data from line-of-business systems, company databases, industrial IoT systems, and all manner of external sources. So how can we stay on top of the data we have, and get the business insights we need, when good data scientists are rare and expensive to employ?

Modern data analysis tools like Tableau and Power BI go a long way to solving some of these problems, with graphical tools that simplify building queries and displaying results. Building on the analytical tooling delivered in Excel, Microsoft’s Power BI can work across data sets, constructing and testing queries, and comes with a customisable set of visualisations.

Introducing ‘pragmatic AI’

Microsoft’s Steve Guggenheimer, in a blog post on the subject of what he calls real-world or ‘pragmatic AI’, suggests that data is the foundation for AI, and without data there is no AI. He goes on to note that there’s a need for insight before intelligence, raising the concept of ‘BI before AI’. That’s where Microsoft’s business application platform comes in, centred around its Common Data Model (CDM) and a shared graph for business data entities.

[Image] Microsoft’s Common Data Model (CDM) is a standardized, modular and extensible collection of data schemas. It consists of entities, attributes, semantic metadata and relationships. (Image: Microsoft)

Building on the Dynamics CRM and ERP data model, the CDM mixes horizontal data, covering common business concepts, with vertical, industry-specific data. This approach works well with analysis tooling like Power BI, letting you explore your data for insights that can feed new machine-learning (ML) models for use in your applications. With an interactive approach, and access to prebuilt ML tooling, Power BI can become a way of building ML models without having to write complex code.
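
To make the CDM’s structure a little more concrete, here is a loose sketch in Python of the shape an entity takes: attributes, semantic metadata and relationships. The entity and field names are hypothetical, and this is not the CDM’s actual schema file format, just an illustration of the concepts described above.

```python
# A loose, hypothetical illustration of a CDM-style entity (not the CDM's real
# schema file format): attributes, semantic metadata and a relationship.
account_entity = {
    "name": "Account",
    "description": "An organisation the business sells to",  # semantic metadata
    "attributes": [
        {"name": "accountId", "dataType": "guid", "isPrimaryKey": True},
        {"name": "accountName", "dataType": "string"},
        {"name": "primaryContactId", "dataType": "guid"},
        {"name": "annualRevenue", "dataType": "currency"},
        {"name": "industryCode", "dataType": "string"},  # vertical, industry-specific data
    ],
    "relationships": [
        {"fromAttribute": "Account.primaryContactId", "toAttribute": "Contact.contactId"},
    ],
}
```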


After all, not everyone can program in R or Python — the two main analytical programming languages used by most ML systems. However, there’s one concept that they have in common with Power BI: the use of shared notebooks to explore data and display results. Power BI’s reporting tools can be compared to data science’s Jupyter Notebooks — a shared sandbox where teams can explore data and tweak models before using them in larger-scale applications.

Giving Power BI access to large amounts of data

A typical modern business intelligence (BI) system takes data from a data lake and feeds it into a self-service BI tool like Power BI. However, for more responsive operations, Power BI includes a more flexible option in its dataflows. The typical setup requires development time, with Extract-Transform-Load (ETL) tooling built to populate the data lake. With dataflows, you use familiar query-building techniques to construct reusable data entities without needing to know anything about the underlying technologies.

[Image] Power BI’s dataflows help organizations unify data from disparate sources and prepare it for modelling. Dataflows can be created using familiar self-service tools and are used to ingest, transform, integrate and enrich big data. (Image: Microsoft)

Self-service data preparation without having to program ETL systems takes much of the work out of business analysis. There’s no need to wait for ETL specialists to build and test an ETL pipeline; all you need to do is define your dataflow and test the resulting entities. If it doesn’t work, you go back and build a new one. You can also share built and tested dataflows with colleagues, democratising the development of business analysis tools. You don’t need to write any code, as it’s all handled by familiar Power BI tools.
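
For a sense of what a dataflow saves you from writing, here is a minimal Python and pandas sketch of the kind of hand-coded extract-transform-load step a single dataflow entity replaces. The file names and columns are invented for illustration.

```python
import pandas as pd

# Extract: read raw exports from two hypothetical line-of-business systems.
orders = pd.read_csv("crm_orders_export.csv")        # invented file name
customers = pd.read_csv("erp_customers_export.csv")  # invented file name

# Transform: fix types, join the sources and derive a reporting column.
orders["order_date"] = pd.to_datetime(orders["order_date"])
enriched = orders.merge(customers, on="customer_id", how="left")
enriched["order_month"] = enriched["order_date"].dt.to_period("M").astype(str)

# Load: write the reusable 'entity' somewhere downstream reports can consume it.
enriched.to_csv("sales_orders_entity.csv", index=False)
```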

Large-scale queries can take advantage of Azure Data Explorer, which now offers Power BI integration. Data Explorer is designed for working with large volumes of data in near real time, so you can use it to explore log files and other high-volume sources. For example, a sample Power BI analysis of public GitHub data demonstrates working with over a billion records.
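
As a rough illustration of what a Data Explorer query looks like when issued from code rather than through the Power BI connector, here is a sketch using the azure-kusto-data Python client against Microsoft’s public sample cluster. The cluster, database and table are Microsoft’s published samples; the query is just an example aggregation, not the GitHub analysis itself.

```python
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# Microsoft's public sample cluster; device-code sign-in keeps the sketch simple.
cluster = "https://help.kusto.windows.net"
kcsb = KustoConnectionStringBuilder.with_aad_device_authentication(cluster)
client = KustoClient(kcsb)

# A simple aggregation in Kusto Query Language (KQL), the same language the
# Power BI connector issues behind the scenes.
query = "StormEvents | summarize events = count() by State | top 5 by events"

response = client.execute("Samples", query)
for row in response.primary_results[0]:
    print(row["State"], row["events"])
```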

Adding machine learning to Power BI

Microsoft has recently added the option of using Azure Cognitive Services in Power BI. Instead of only writing queries to visualise and explore data, you can call on Cognitive Services’ pre-built machine-learning models to enrich it. By using one of Microsoft’s growing number of machine-learning services, you can quickly extract relevant data from your data lake and from external sources. Perhaps you’re sampling Twitter in real time to look for references to your business; by running that data through a sentiment analysis model from Azure Cognitive Services, you can detect positive or negative sentiment and display it in a Power BI dashboard.
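
Behind the Power BI integration, that sentiment scoring is the Text Analytics service in Azure Cognitive Services. Here is a rough sketch of the equivalent direct call from Python, using the Azure Text Analytics client library; the endpoint, key and example tweets are placeholders you would replace with your own.

```python
from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient

# Placeholders: take the endpoint and key from your own Cognitive Services resource.
endpoint = "https://<your-resource>.cognitiveservices.azure.com/"
client = TextAnalyticsClient(endpoint, AzureKeyCredential("<your-key>"))

# A couple of invented example tweets mentioning the business.
tweets = [
    "Great experience with their support team today!",
    "Still waiting on my order after two weeks. Not impressed.",
]

# Score each document and print the overall sentiment with its confidence scores.
for doc in client.analyze_sentiment(tweets):
    print(doc.sentiment, doc.confidence_scores)
```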


By combining this approach with either dataflows or Azure Data Explorer, you can quickly build intelligent dashboards across your business. Mixing BI with AI makes a lot of sense, as BI is where you already have the large amounts of data needed both to train models and to get significant results. Using tools like Azure Cognitive Services normally requires building applications and working with Azure APIs; by integrating them with a desktop business analytics tool, Microsoft is removing the developer from the equation and putting its machine-learning tools in the hands of business users.

[Image] Now you can create your own machine-learning models in Power BI. (Image: Microsoft)

The next step is to go beyond pre-trained, pre-built models, and use Power BI and your data warehouses to build and train your own machine-learning models. Microsoft has added a new Power BI workflow that helps you choose an appropriate model for your application, before selecting training data and then training the model to handle your specific business issues. The resulting model will be available through Azure Machine Learning, and can be shared with colleagues and built into either desktop or cloud applications.
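
Power BI’s model building here draws on Azure Machine Learning’s automated ML capabilities. For comparison, here is a sketch of the code route the wizard spares you, using the Azure Machine Learning Python SDK (v1); the workspace configuration, dataset name, label column and compute target are all assumptions for illustration.

```python
from azureml.core import Workspace, Dataset, Experiment
from azureml.core.compute import ComputeTarget
from azureml.train.automl import AutoMLConfig

# Assumptions: a workspace config file, a registered dataset named "customer_churn",
# a label column called "churned" and a compute cluster called "cpu-cluster".
ws = Workspace.from_config()
training_data = Dataset.get_by_name(ws, name="customer_churn")
compute = ComputeTarget(workspace=ws, name="cpu-cluster")

automl_config = AutoMLConfig(
    task="classification",
    training_data=training_data,
    label_column_name="churned",
    primary_metric="AUC_weighted",
    compute_target=compute,
    experiment_timeout_hours=1,
)

# Submit the automated ML experiment and retrieve the best model it finds.
run = Experiment(ws, "powerbi-automl-sketch").submit(automl_config, show_output=True)
best_run, fitted_model = run.get_output()
```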

Businesses can quickly build libraries of machine-learning models on Azure, and Power BI offers an alternative, code-free way of consuming them. Power BI scans the available models and automatically generates a user interface that lets you use them as drag-and-drop components in your BI applications. Again, there’s no need to write any code, and no need to bring in data science expertise: if you have access to a model, you can use it in your reports or your dashboards.
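
The drag-and-drop route replaces what would otherwise be a call to the model’s REST scoring endpoint. Here is a minimal sketch of that direct call; the endpoint URL, key and input schema are hypothetical and depend on how the model was deployed.

```python
import requests

# Hypothetical scoring endpoint and key for a model deployed from Azure Machine Learning.
scoring_uri = "https://<your-endpoint>.azureml.net/score"  # placeholder
api_key = "<your-api-key>"                                 # placeholder

# Example input rows; the expected schema depends on the model's scoring script.
rows = [{"tenure_months": 14, "monthly_spend": 82.5}]

response = requests.post(
    scoring_uri,
    json={"data": rows},
    headers={"Authorization": f"Bearer {api_key}", "Content-Type": "application/json"},
)
print(response.json())
```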

Microsoft has done a lot to make it easy to use machine learning in business applications, with Windows libraries and with RESTful APIs. Integrating it into Power BI takes things a lot further, dropping code in favour of drag-and-drop and wizards. If we’re to make the most of AI in businesses, then making it available to the users who need to query their data has to be the way to go. Power BI has become an important desktop business tool; adding intelligence should make it essential.

Source: No-code machine learning in Power BI
