Thursday, March 31, 2011

Business Analytics and Business Intelligence ground reality

Business intelligence and business analytics aren’t new concepts. The idea of understanding the relationships between bits and bytes of data extends back to the late 1950s, and BI has been around in earnest since the late 1980s. However, today, the ability to aggregate, store, mine and analyze data can make or break an enterprise. As a result, BI and BA have emerged as core tools guiding decisions and strategies for areas as diverse as marketing, credit, research and development, customer care and inventory management.

As CIO.com reports, BI and BA are evolving rapidly and meshing to meet business challenges and create new opportunities. Although nearly all Global 5000 organizations already use these tools, 35 percent of them fail to make insightful decisions about significant changes in business and market conditions, according to IT consulting firm Gartner. What's more, the task isn't getting any easier as data streams become more intertwined and Web 2.0 environments pull data from multiple sources in a single instance.

I believe business intelligence and business analytics are on the cusp of a major change. There is a shift toward providing deeper insight into business information. And there is a growing emphasis on better tools and putting more powerful and better software in the hands of business decision makers today.

Business intelligence and business analytics are quite disconnected in the real world, at least that's what I have seen over many years of experience. BI has evolved as a platform or a bundle of tools with an architecture that, albeit enterprise-wide in approach, lacks deep analytical and predictive capabilities. Traditionally, this is where the work of IT ends and business analytics starts, with statistical, quantitative and predictive work conducted outside of the framework.

This unfortunate reality has contributed to the myth that BA is something totally different from BI. The vision of BI always includes analytics, and BA is merely a subset of BI focused on its analytical parts. Because the traditional BI architecture doesn't lend itself to advanced analytics capabilities, such as statistical modeling and data mining, it's not surprising that business users collect data and reports from BI systems and then run their own analytics in spreadsheets they control. This approach is not a viable solution, however, because uncontrolled processes and questionable data will seriously hamper a BA effort. Research studies estimate that roughly 94 percent of spreadsheets deployed in the field contain errors, and 5+ percent of cells in unaudited spreadsheets contain errors.

What we need is an analytics-oriented BI architecture that incorporates advanced analytics and analytic modeling capabilities into the current BI framework. Traditional BI vendors need to build more advanced analytical functionality into their BI offerings. Many major BI tools don't support advanced statistical and quantitative modeling. Some support limited analytics and require highly technical skills (such as SQL) for use, which most business users don't possess. BI vendors need to provide more user-friendly analytics tools with much broader capabilities that statisticians and business analysts can use without lots of IT support. These new capabilities should include predictive analytics, data mining, text analytics, simulation, decision analysis and advanced modeling.

Second, traditional analytics software vendors need to embed powerful analytical capabilities into the BI platform and make integration much easier for customers. Most BI applications and BA applications operate on very different platforms. Every company needs to reckon with integration and ROI before investment. BI and BA vendors should work together to make the integration much less painful and help customers unleash the best of both worlds.

An integrated solution combines advanced analytics with powerful data visualization and advanced reporting capabilities to support fact-based and data-driven decision-making. Under this new architecture, advanced analytics will be an integral part of BI. Analytics process and technology could be managed under one unified BI framework and strategy that ultimately should align with a company's business strategy. Initiatives such as data management and governance could benefit both BI and BA programs.

Companies that have high quality information that is well-defined and understood across the enterprise already have a solid foundation for BA. In terms of implementation, there could be different deployment approaches based on the conceptual architecture. For instance, analytic models might be built into a database or data warehouse to leverage its processing power.

In-database analytics has lots of advantages - analyzing data where it resides avoids data movement and duplication. However, in-database analytics can be costly when analytics processes, which are volatile and adaptive in nature (old models need to be updated or rebuilt with the latest data input), hinder other mission-critical OLTP or OLAP operations. This may call for a separate environment for the development and deployment of an analytic model. Meanwhile, advanced analytics capabilities are better built within existing BI tools for better compatibility and integration with existing BI features. Analytics could also be built into operational systems when less data integration is needed - analyzing data while capturing it. Organizations should choose the deployment model that best fits their business analytical needs.
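As a minimal sketch of the in-database idea, the following uses SQLite to compute a model score inside the database, so only the flagged rows ever leave it. The table, the linear model weights and the 0.6 threshold are all invented for illustration, not a real churn model:

```python
import sqlite3

# In-memory stand-in for an operational database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, visits INTEGER, complaints INTEGER)")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                 [(1, 12, 0), (2, 2, 5), (3, 1, 7)])

# The (assumed, illustrative) model's linear score is evaluated in SQL,
# where the data resides - no extract to a spreadsheet or analytic server.
score_sql = """
    SELECT id, 0.5 - 0.04 * visits + 0.1 * complaints AS churn_score
    FROM customers
    WHERE 0.5 - 0.04 * visits + 0.1 * complaints > 0.6
"""
at_risk = conn.execute(score_sql).fetchall()  # only high-risk rows come back
```

Only the scored exceptions cross the wire; the raw customer rows stay put, which is exactly the data-movement saving the approach promises.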

Lastly, BA needs to be integrated and embedded in business process to be effective & efficient. One such example is to create a closed-loop style repeatable process in the normal workflow of business operations to feed the results back into the operational system where the data for analytics is sourced. This kind of decision automation is used in cases where decisions tend to be high volume. For instance, an online retailer can use an analytical model that predicts high probability of a customer buying a certain new product to attempt cross-selling by dynamically displaying ad banners when the customer visits the online store. An online bank can approve or reject loan applications automatically based on the criteria defined by the application processing rules engine using predictive analytics. Only the exceptions (rejected applications) will be sent to loan officers for review and follow-up. The model significantly reduces the cost and decision time for the bank and customers, a win/win for both.
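A closed-loop decision flow like the bank example can be sketched as a small rules engine; the risk function and the approval thresholds below are illustrative stand-ins for a real predictive model:

```python
# Toy auto-decision loop. The risk score and thresholds are invented
# placeholders, not a real credit model.
def predicted_default_risk(application):
    # Stand-in for the output of a trained predictive model.
    return application["debt"] / max(application["income"], 1)

def decide(application, approve_below=0.3, reject_above=0.6):
    risk = predicted_default_risk(application)
    if risk < approve_below:
        return "approved"        # fully automated decision
    if risk > reject_above:
        return "rejected"        # fully automated decision
    return "manual_review"       # exception routed to a loan officer

applications = [
    {"id": 1, "income": 90000, "debt": 9000},
    {"id": 2, "income": 40000, "debt": 30000},
    {"id": 3, "income": 50000, "debt": 20000},
]
decisions = {a["id"]: decide(a) for a in applications}
```

The high-volume cases are decided automatically and fed straight back into the operational system, while only the exceptions consume a human reviewer's time.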

According to my observations, the key characteristics of an analytics-oriented BI architecture are:

  • Integrated (data, reporting, analytics)
  • Robust and flexible (for rapid changes)
  • Evolving and adaptive
  • Consistent (standards in process and data)
  • Controlled
  • Transparent (versus black-box approach) and
  • Embedded (analytics as part of business process)

With the burgeoning demand for advanced analytics and emerging analytical technologies, we will see the convergence of BI and BA in the marketplace. BI megavendors will likely acquire smaller BA players and integrate advanced analytical tools and capabilities into their BI portfolios. At the same time, traditional analytics software vendors will likely push further into BI platform territory. The reciprocal penetration will accelerate the consolidation, standardization and adoption of analytics while moving toward an analytics-oriented BI architecture.

Historically, this market has been served by vendors such as Business Objects and Cognos. But the competitive landscape is changing. Microsoft has now shrewdly entered the market by driving the placement of SQL Server into the space in order to broadly deploy and deliver its BI suite and reporting services in volume. Oracle has seen the effect of companies moving data out of the database to stage it for analysis. The resulting data warehouses have provided a degree of utility in housing, manipulating and delivering "strategic" information across the organization.

Also, every top-level boss wants an effective dashboard. To the extent that all of us are CIOs/CTOs/CEOs of our own business discipline, we want a simple measurement display of how we are doing and an alert mechanism for when something goes wrong. Additionally, dashboards address the growing urgency around Sarbanes-Oxley: monitoring planning assumptions and key performance metrics has now become mission critical from a regulatory and compliance standpoint.

As we all know, BI reporting ends with the dashboard, which is sufficient only for some business planning, and BA picks up the rest for the go-to guys. Simply put, this group must interact with data in a much different way from what traditional BI allows. The requirement of a BI system has been to monitor data based on pre-configured questions, requiring only a thin-client environment to inform the user. In the operating world, users need to engage with the information, requiring a richer client to support interactivity and the ability to ask and answer their own questions without having to go back to IT. Let us make one thing clear: you don't get business analytics when you buy business intelligence. The requirements are different and the benefits are different. The return on information and expertise achieved by arming your resources and operating managers with analytics will supercharge your existing BI investment.

Do let me know your views and suggestions. These thoughts I have collected from CIO.com, LinkedIn discussions and various conversations with co-workers and PMI Mumbai members. Thanks to all for sharing your inputs so freely. I am available at ravindrapande at gmail.com

Thursday, March 10, 2011

Estimation: Basic Need and Options, Part 1

As someone correctly pointed out, you can't control what you can't measure. Very true in its logical sense. In our day-to-day software work we need to measure our efforts in order to track, monitor and control them. These thoughts led me to create this write-up for everyone in the IT age who wants more control over their day-to-day professional life.

A software product or project is typically controlled by four major variables: time, requirements, resources (people, infrastructure/materials, and money), and risks. Unexpected changes in any of these variables will have an impact on execution. Hence, making good estimates of the time and resources required for a project is crucial. Underestimating project needs can cause major problems because there may not be enough time, money, infrastructure/materials, or people to complete the project. Overestimating needs can be very expensive for the organization because the project may be deferred as too expensive, or it may be approved while other projects are "starved" because there is less to go around.

In my experience, making estimates of the time and resources required for a project is usually a challenge for most project teams and project managers. It could be because they do not have experience doing estimates, they are unfamiliar with the technology being used or the business domain, requirements are unclear, there are dependencies on work being done by others, and so on. These factors can result in a situation akin to analysis paralysis, as the team delays providing any estimates while it tries to get a good handle on the requirements, dependencies, and issues. Alternatively, the team produces estimates that are usually highly optimistic because items that need to be dealt with have been ignored. How does one handle situations such as these?

Useful Estimation Techniques

Before we begin, we need to understand & categorize what types of estimates we can provide. Estimates can be roughly divided into these types:

Initial estimates/ Ballpark or order of magnitude: Here the estimate is probably an order of magnitude from the final figure. This can be within two or three times the actual value.

Rough estimates: Here the estimate is closer to the actual value. Ideally it will be about 50% to 100% off the actual value.

Fair estimates: This is a very good estimate. Ideally it will be about 25% to 50% off the actual value.

Deciding which of these three different estimates you can provide is crucial. Fair estimates are possible when you are very familiar with what needs to be done and you have done it many times before. This sort of estimate is possible when doing maintenance-type work where the fixes are known, or when adding well-understood functionality that has been done before. Rough estimates are possible when working with well-understood needs and one is familiar with domain and technology issues. In all other cases, the best we can hope for before we begin is order-of-magnitude estimates. Some may quibble that order-of-magnitude estimates are close to no estimate at all! However, they are very valuable because they give the organization and project team some idea of what the project is going to need in terms of time, resources, and money. It is better to know that something is going to take between two and six months to do rather than have no idea how much time it will take. In many cases, we may be able to give more detailed estimates for some items than for others. For example, we may be able to provide a rough estimate of the infrastructure we need but only an order-of-magnitude estimate of the people and time needed.
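The three accuracy bands above can be captured in a small helper that turns a point estimate into a range; the multipliers simply encode the ranges described in the text:

```python
# Accuracy bands from the text, expressed as (low, high) multipliers
# on the point estimate.
ESTIMATE_BANDS = {
    "ballpark": (1.0, 3.0),  # within two to three times the actual value
    "rough": (1.0, 2.0),     # about 50% to 100% off
    "fair": (1.0, 1.5),      # about 25% to 50% off
}

def estimate_range(point_estimate, kind):
    """Return the (low, high) range implied by an estimate of the given kind."""
    low_mult, high_mult = ESTIMATE_BANDS[kind]
    return point_estimate * low_mult, point_estimate * high_mult

# A two-month ballpark figure really means "between two and six months".
low, high = estimate_range(2.0, "ballpark")
```

Communicating the range rather than the bare number makes explicit how much confidence the estimate actually carries.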

Now let's think this through from the perspective of the people involved. Whether you are a well-educated developer, a project manager planning for the smooth implementation of a plan, or a project sponsor on whose decisions a project depends, you cannot escape the fact that project estimation is essential to success. In the first place, there are three basic requirements that a project must satisfy: schedule, budget, and quality. The need to work within these essential project boundaries poses a huge challenge to everyone in the central management team.

There are various aspects that affect project estimates, such as team skills and experience levels, available technology, use of full-time or part-time resources, project quality management, risks, iteration, development environment, requirements, and most of all, the level of commitment of all project members.

Moreover, project estimations do not need to be too complicated. There are tools, methodologies, and best practices that can help project management teams, from sponsors to project managers, agree on estimates and push development efforts forward. Some of these include the following:

Project estimates must be based on the application’s solution, scope and architecture. Making estimates based on an application’s architecture should give you a clear idea of the length of the entire development project phase. Moreover, an architecture-based estimation provides you a macro-level view of the resources needed to complete the project.

Project estimations should also come from the ground up. All estimates must add up, and estimating the collective efforts of the production teams that work on the application's modules helps identify the number of in-house and outsourced consultants you need to hire for the entire project, and gives a clear idea of the collective man-hours required to code modules or finish all features of the application. Ground-up estimates are provided by project team members and do not necessarily match top-level estimates exactly. In this case, it is best to add a percentage to the architecture-based estimates to give room for possible rework, risks, and other events that may or may not be within the control of the project staff.
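A ground-up roll-up with a contingency buffer on top might look like this; the module names, figures and the 20 percent buffer are invented for illustration:

```python
# Per-module effort estimates collected from the teams (invented figures).
module_estimates_hours = {
    "auth": 120,
    "reporting": 200,
    "billing": 160,
}

def bottom_up_total(estimates, contingency=0.20):
    """Sum the module estimates, then add a contingency buffer
    for rework, risks, and events outside the team's control."""
    base = sum(estimates.values())
    return base * (1 + contingency)

total_hours = bottom_up_total(module_estimates_hours)  # 480 hours plus 20%
```

Keeping the buffer explicit, rather than padding each module silently, makes the roll-up easy to reconcile with the top-level architecture-based estimate.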

Do not forget modular estimates. Once you have a clear idea of the architecture, it becomes easier to identify the modules that make up the entirety of the application. Knowing the nature of these modules should help you identify which can be done in-house or onshore, or by an offshore development team. Moreover, given the location and team composition of each development team that works on a module, it becomes easier to identify the technical and financial resources needed to work on the codes.

Development language matters. Whether the development language is Java, .NET, C++ or any other popular language used by software engineers, the team hired for the project must be knowledgeable in it. Some development efforts require advanced skills in these languages, while others only need basic functional knowledge, and the levels of specialization in any of these languages have corresponding rates. Most of the time, the chosen development language depends on the chosen platform, and certain platforms run on specialized hardware.

You cannot promise upper management dramatic cost savings from offshoring. While there are real savings in having development work done by offshore teams whose rates are significantly lower than those of onshore staff, you must include communication, knowledge transfer, technical set-up, and software installation costs in your financial estimates. Estimating costs is often more about managing expectations, but as the project matures, it should become clearer whether the money spent on it was spent well.

Project estimation software and tools help identify “what-if” scenarios. Over the years, project managers have devised ways to automate project schedule, framework, cost, and staffing estimates. Some estimation applications also have sample historical data or models based on real-world examples. If your business has a lot in common with the samples in the estimation tool, it can help you identify what-if scenarios and in turn include risks, buffers, and iteration estimates.
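A what-if scenario tool can be approximated with a tiny Monte Carlo simulation; the triangular task distributions below are invented for illustration:

```python
import random

# Each task: (optimistic, most likely, pessimistic) duration in days.
# Figures are invented for illustration.
tasks = {
    "design": (5, 8, 15),
    "build": (20, 30, 50),
    "test": (10, 14, 25),
}

def simulate_schedule(tasks, trials=10000, seed=42):
    """Return the median and 80th-percentile total duration in days."""
    random.seed(seed)
    totals = sorted(
        sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks.values())
        for _ in range(trials)
    )
    return totals[trials // 2], totals[int(trials * 0.8)]

median_days, p80_days = simulate_schedule(tasks)
```

Changing a task's three-point figures and re-running the simulation is exactly the kind of what-if exercise the commercial tools automate, with risk buffers falling out of the percentile you choose to commit to.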

Price breakdown helps in prioritization. Breaking down the total cost of the project helps management decide which parts of a system should be prioritized, delayed, or even cancelled. Estimating costs for a new project may not be easy, but project sponsors and managers must be able to know and agree on the breakdown of costs for development, technical requirements, and overhead.

These guidelines are meant to convey the need for estimation and to encourage making it part of daily life, so as to gain more control over the whole software process.

Feel free to reach me at ravindrapande@gmail.com to share thoughts & suggestions.