The smartest way to make a big world seem small

Business Intelligence
Analytics

Part 1 of 4

There isn't a problem that doesn't affect someone somewhere, right?

As the closing will show in more detail, these problems can be bundled together by moving between a micro and a macro view.

Problems within an organizational unit won't resolve themselves until either:

(a) the organization chooses to address the problem for an external stakeholder, or

(b) the effects of the problem outweigh the external factors the organization is experiencing.

The seemingly impossible situation that occurs when an organization tries to quantify something that “doesn’t need fixing” is an outlandish inflection point. It’s a means to an end, and gazing at the stars can feel more productive than trying to explain it to a skeptic. The business intelligence tip that lets you jump ahead is this: high-quality data needs to be generated, and responsibility for clean data doesn’t fall on any one person’s shoulders. It is trivial for a computer to identify human error; it is non-trivial for a computer to identify human factors with certainty.
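
As a minimal sketch of that distinction, assume a hypothetical expense log held in a pandas DataFrame (the column names, values, and thresholds below are invented for illustration): simple rules flag the human error instantly, but they can only hint at the human factors behind it.

```python
# Minimal sketch: rule-based checks catch human error in a hypothetical
# expense log; they say nothing certain about the human factors behind it.
import pandas as pd

expenses = pd.DataFrame({
    "employee": ["Ana", "Ben", "Ana", "Cal"],
    "amount":   [42.50, -13.00, 42.50, 98000.00],         # negative value and outlier
    "category": ["travel", "travel", "travel", "trvel"],  # typo in category
})

VALID_CATEGORIES = {"travel", "meals", "supplies"}

# Trivial for a computer: anything outside the rules is flagged instantly.
human_error = expenses[
    (expenses["amount"] <= 0)
    | (expenses["amount"] > 10_000)
    | (~expenses["category"].isin(VALID_CATEGORIES))
]
print(human_error)

# Non-trivial: a duplicated row *might* be a double entry (a human factor)
# or a legitimate repeat purchase -- a rule can only raise the question.
possible_double_entry = expenses[expenses.duplicated(keep=False)]
print(possible_double_entry)
```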

The ideal BI end-user is an enterprise employee pushing for data-driven insights. When a CFO, CTO, or department head is materially involved in the products or services the company provides, they should have direction toward some quarterly or annual goal tied to a benchmark, such as growing headcount by a ratio pegged to an upcoming event. Example events include a product launch, a new client contract, or a strategy outlined in a long-forgotten memo. In each case, unforeseen considerations are rarely accounted for without backup. Let’s take a closer look at the list of business forecasting methods as an indication of the complexity involved.

Qualitative forecasting methods include:

  • Market Research Techniques

  • Past Performance Technique

  • Internal Forecast

  • Direct vs. Indirect Methods

  • Jury of Executive Opinion

  • Delphi Technique

  • Market Survey

  • User’s Expectation Method

  • Brainstorming

These qualitative forecasting methods hark back to the days when good old elbow grease was used to account for the variance caused by human factors. Remember not to confuse human factors with human error.

Now let’s look at quantitative methods:

  • Business Barometers

  • Trend Analysis Method

  • Extrapolation Method

  • Regression Analysis Method

  • Economic Input-Output Analysis

  • Econometric Model

  • Consumer Expectation Method

These methods are dry and boring but full of backup for data-driven decisions.
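
To make two of the listed methods concrete, here is a minimal sketch of trend analysis and extrapolation on invented monthly revenue figures (every number below is hypothetical, and a real forecast would need far more care).

```python
# Minimal sketch of trend analysis and naive extrapolation on
# hypothetical monthly revenue figures.
import numpy as np

revenue = np.array([110, 115, 112, 120, 126, 125, 131, 138, 136, 142, 149, 151], dtype=float)

# Trend analysis: a 3-month rolling average smooths month-to-month noise.
window = 3
trend = np.convolve(revenue, np.ones(window) / window, mode="valid")
print("smoothed trend:", np.round(trend, 1))

# Extrapolation: fit a straight line and project the next quarter.
months = np.arange(len(revenue))
slope, intercept = np.polyfit(months, revenue, deg=1)
projection = slope * np.arange(len(revenue), len(revenue) + 3) + intercept
print("next 3 months (projected):", np.round(projection, 1))
```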

Why is forecasting “backup”?

The regression analysis method is conducted like any other number-crunching activity. Thanks to the scholastic pursuits of research scientists, there are widely accepted statistical standards for conducting studies and verifying the validity of the data. In other words, data points can be used to accurately model only a limited subset of functions.
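
As a minimal sketch of that number crunching, assume a hypothetical relationship between ad spend and sales: an ordinary least squares fit hands back the diagnostics (R-squared, p-values) that those widely accepted standards lean on when judging whether the data actually supports the model.

```python
# Minimal sketch of the regression analysis method on hypothetical data:
# the fit is easy; the diagnostics say whether the data backs it up.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
ad_spend = rng.uniform(10, 100, size=50)               # hypothetical monthly ad spend
sales = 3.2 * ad_spend + rng.normal(0, 25, size=50)    # hypothetical noisy response

X = sm.add_constant(ad_spend)      # add an intercept term
model = sm.OLS(sales, X).fit()

print(model.params)      # estimated intercept and slope
print(model.rsquared)    # how much of the variance the line explains
print(model.pvalues)     # whether the slope is statistically meaningful
```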

Everyone in an organization should be under the impression that data is essential to the business. A good place to start is with semi-skilled employees who already work within the area of influence of each data source. Data transformation can also happen on a rolling basis, in stages; during the transformation, the easiest transitions depend on who within the organization participates. Any special events, coupled with the existing stream of data, also signal progress in your digital transformation.
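
As a minimal sketch of what “in stages” can look like, assume a hypothetical orders extract: each stage below is a small, reviewable step that a domain expert can sanity-check before the next one runs (the column names and rules are invented for illustration).

```python
# Minimal sketch of a staged (rolling) data transformation on a
# hypothetical orders extract; every stage is small enough to review.
import pandas as pd

def stage_1_clean(raw: pd.DataFrame) -> pd.DataFrame:
    """Standardize names and drop rows that are unusable for analytics."""
    cleaned = raw.rename(columns=str.lower)
    cleaned["order_date"] = pd.to_datetime(cleaned["order_date"], errors="coerce")
    return cleaned.dropna(subset=["order_date", "amount"])

def stage_2_enrich(cleaned: pd.DataFrame) -> pd.DataFrame:
    """Add the fields end-users actually ask about."""
    enriched = cleaned.copy()
    enriched["month"] = enriched["order_date"].dt.to_period("M")
    return enriched

def stage_3_aggregate(enriched: pd.DataFrame) -> pd.DataFrame:
    """Produce the table a dashboard would sit on top of."""
    return enriched.groupby("month", as_index=False)["amount"].sum()

raw = pd.DataFrame({
    "Order_Date": ["2024-01-05", "2024-01-19", "not a date", "2024-02-02"],
    "Amount": [120.0, 80.0, 55.0, 200.0],
})
monthly = stage_3_aggregate(stage_2_enrich(stage_1_clean(raw)))
print(monthly)
```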

The BI software ecosystem is mature. Regulated industries like insurance, healthcare, real estate, and financial securities have data policies that reach beyond the practical uses of data described so far. With data, AI, and machine learning at end-users’ fingertips, and cloud solutions everywhere, BI can start to look like a buzzword. The general consensus is that each organization’s data is only valuable to that specific organization.

How much value does getting semi-skilled end-users on the bandwagon hold within a business organization?

With no data science knowledge, a novice turns into a data-driven superhero once BI tools are deployed in their area of domain expertise. Often the business data is already stored somewhere, but there may be no way to effectively manipulate it for the purposes of analytics. The semi-skilled employee brings value that a good business analyst can build on. Avoid slowing your data transformation: allocate resources to retaining the domain-specific expertise held by employees, while making sure to break down silos.

Look at Microsoft Power BI & Azure as the inexpensive leader compared to the upstart new kids on the block. The price estimator ranges from $200 to $800 per month for a starter package, on a flexible cloud pricing model that scales with customer needs. Enterprise-class customers of Power BI and Azure will need, at minimum, a data scientist, a business analyst, and a few data-driven superheroes available to maximize their ROI.

Clean data won’t come out of nowhere, even with a ton of automation: employees are the cornerstone of high-quality data. Of course, the analytics team needs to be there to build the services that automate the flow and ensure the data gets where it needs to be.

The modern BI stage has options with high usability scores, like Looker.com. The cost can run to thousands of dollars per month, which is steep and only within reach for enterprises and service providers that plan to manage an end-to-end solution with Looker. What gets people talking about Looker.com is one of the best embedded visualization and dashboard interfaces around.

This article opens the floor to the perspective of BI for end-users of all skill levels. The rest of the series will explore basic tips and tricks revolving around data-driven insights, and each article can be read out of sequence. The purpose is to close the gaps in understanding that stem from BI & analytics being presented as a perfectly natural progression. The BI & analytics market is large enough that distinct career paths exist, and competition is strong enough that the companies offering SaaS analytics systems cater to skilled professionals.

We covered high-quality data and looked at the employees who are well suited to play a pivotal role in digital transformation with business intelligence. We briefly touched on the awful truth that relying on data alone is not wise, acknowledging that operating without an alternative to data is like applying a band-aid. The cold hard fact is that the competitive enterprise landscape creates an ecosystem of software that keeps outgrowing itself, and the practice of data analytics is specialized enough as it is. Tackling new problems within the same organizational context should surface data artifacts that are easily addressed by your digital transformation.

Semi-skilled end-users who share in the adoption of data-driven insights during an organization’s digital transformation were the basis of this article. The series covers the various moving parts of BI & analytics with low-level tips and tricks in every article, giving anyone pamphlet-like how-to guides for removing the roadblocks that come up when adopting the in-house or outsourced data services prescribed by enterprise-grade analytics.

The next articles in the series release weekly. Sign up for the newsletter to get them sent directly to your inbox.