Data continues to grow out of control. In mid-2010 the digital universe held 1.2 zettabytes, and projections for 2020 put it at roughly 35 zettabytes, about 44 times the 2009 total. All this big data doesn't necessarily mean big insights for business. Turning data into insight requires powerful analytics and interactive reporting tools that deliver high performance by keeping pace with modern chip technology.
Last year showed just how fast this market is taking off. A slew of deals, including IBM's acquisition of Netezza and EMC's acquisition of Greenplum, validated that this is a high-growth market in which traditional relational database approaches cannot meet businesses' demand for insight, which may be why IBM considered $1.6 billion a reasonable price. Despite its vast portfolio of data management solutions and solid scale-out experience, IBM still needed to integrate a more powerful analytic database into its offerings, because that is what businesses want.
In 2010, we also saw the stack wars heating up. More players moved into vertical integration, and business intelligence (BI) appliances were in the spotlight. Oracle, for instance, announced new BI and OLTP appliances, including Exalogic and Exadata. There was also a flurry of activity around Hadoop and its spin-offs for working with unstructured and semi-structured data. We saw many companies, both open source vendors and large proprietary software players, adopt the hybrid open source business model.
In 2011, we suspect we'll hear more about business intelligence, operational business intelligence, the agile enterprise, and the need for more powerful insight tools. However, one of the most important trends connected to all business intelligence will be the continuation of Moore's Law and the chip-level scale-out it enables. Yes, chip scale-out, not hardware scale-out.
Moore's Law continues to live up to its original prediction: every two years, the number of transistors that can be placed on a single integrated circuit doubles. Business software, however, hasn't kept up with this powerful trend; video games continue to do far better here. This century's business applications also haven't kept pace with the explosive growth of business data. The software is often blocked from the speed benefits of new chips because it can't feed them fast enough as data volumes grow. As a result of this gap between chips and business software, many critical business decisions are made on inadequate data sets simply because today's software can't effectively process more. Many companies keep throwing hardware (especially more servers) at the problem, while the chip industry's enormous investment in compute performance sits idle. In 2011, expect more solutions that enable business software to exploit the capabilities of modern chips, rather than forcing chip customization or blindly throwing hardware at the problem.
There is strong evidence that we will be able to run 256 or 512 cores on a single chip. That will shrink the amount of hardware required to run big data workloads: the cluster essentially begins living on the chip rather than on racks of networked servers. A core count like that is equivalent to an entire massively parallel processing (MPP) scale-out cluster of 20 servers or more. Imagine all that power on one chip, and the resulting energy savings in data centers everywhere. In 2011, we will see greater core density per chip.
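The 256- or 512-core figure is just Moore's Law doubling arithmetic applied to cores. A minimal Python sketch, assuming a hypothetical 8-core chip in 2011 as the baseline (the start year and core count are illustrative assumptions, not figures from this article):

```python
def projected_cores(start_year: int, start_cores: int, target_year: int) -> int:
    """Project per-chip core count, doubling once every two years."""
    doublings = (target_year - start_year) // 2
    return start_cores * 2 ** doublings

# From an assumed 8-core chip in 2011, doubling every two years:
for year in range(2011, 2024, 2):
    print(year, projected_cores(2011, 8, year))
```

Under these assumed starting conditions, the doubling curve reaches 256 cores around 2021 and 512 around 2023, which is the scale the paragraph anticipates.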
Memory is going to become a bigger bottleneck, too, as the performance bottleneck continues to move from disk to RAM (random access memory). With the core density of chips and RAM capacity both rising dramatically, fully in-memory data warehouses are now feasible. In the meantime, albeit at a slightly slower rate, solid state drives will also become more widespread in 2011.
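Why in-memory feasibility matters can be shown with back-of-the-envelope scan arithmetic. A minimal sketch, where the bandwidth figures are rough order-of-magnitude assumptions (roughly a single spinning disk versus aggregate RAM bandwidth), not measurements from this article:

```python
TB = 10 ** 12  # bytes

def scan_seconds(data_bytes: int, bandwidth_bytes_per_sec: float) -> float:
    """Time to stream a data set once at a given sequential bandwidth."""
    return data_bytes / bandwidth_bytes_per_sec

disk_bw = 100 * 10 ** 6  # assumed ~100 MB/s for a single spinning disk
ram_bw = 10 * 10 ** 9    # assumed ~10 GB/s aggregate RAM bandwidth

warehouse = 1 * TB
print(f"disk scan: {scan_seconds(warehouse, disk_bw) / 3600:.1f} h")
print(f"RAM  scan: {scan_seconds(warehouse, ram_bw):.0f} s")
```

Under these assumptions, a full scan of a 1 TB warehouse takes hours from disk but under two minutes from RAM, which is the gap in-memory analytics exploits.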
Chip companies will continue to make huge investments in technology, and business software and analytics will begin to take advantage of those investments. It will become more commonplace for engineers to stay in tune with chip technology advances as they build solutions to manage and leverage huge volumes of business data.
Some know this as operational BI, and it is closely connected to agility. Agility is a concept that brings flexibility, balance, adaptability, and coordination under one umbrella in business. According to a Harvard Business Review article, the agile enterprise strives to make change a routine part of organizational life, reducing or eliminating the organizational trauma that paralyzes many businesses attempting to adapt to new markets and environments.
Analytics technologies will help businesses become more agile and will be a key business differentiator in 2011 and beyond. We are already seeing this. IBM, for instance, continues to promote the actionable insights data can reveal under its Smarter Planet manifesto, which acknowledges that there are systems within systems and highlights chaos and complexity. Technology companies that help customers get a full, intelligent grip on their data inside these complex systems will win. At Ingres, we remind our customers daily of their ability to take the data sets of their choice and get fast, insightful answers from them. There are also companies such as Pentaho and QlikView. All of these solutions are designed to offer powerful insights from data, and in our opinion they are ahead of the game in doing so.
This trend also relates to the agile enterprise. Functional analytics implementations will be embedded in customer- and employee-facing business applications. Application vendors will want to work with technologies that let their customers build the right analytics platform for their big data environment.
Operational business intelligence extends into e-commerce and social networking applications. Analytics gets embedded into the back end of business systems, which differs from the traditional model in which BI is a front-end tool a human uses. Analytics is being embedded literally into the business, but because intelligence is going inside small and inexpensive transactions, it must carry a reasonable total cost of ownership (TCO).
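The idea of analytics living in the back end rather than in a separate reporting tool can be sketched in a few lines. This is a hedged illustration, not any vendor's implementation; the order fields, class name, and metrics are hypothetical:

```python
from collections import defaultdict

class EmbeddedAnalytics:
    """Per-product running revenue and order counts, updated inline
    as each transaction is processed (constant cost per order)."""

    def __init__(self):
        self.revenue = defaultdict(float)
        self.orders = defaultdict(int)

    def record(self, product: str, amount: float) -> None:
        self.revenue[product] += amount
        self.orders[product] += 1

    def top_product(self) -> str:
        return max(self.revenue, key=self.revenue.get)

analytics = EmbeddedAnalytics()

def process_order(product: str, amount: float) -> None:
    # ... normal transactional work (payment, inventory) would go here ...
    analytics.record(product, amount)  # intelligence embedded in the flow

for product, amount in [("widget", 9.99), ("gadget", 24.50), ("widget", 9.99)]:
    process_order(product, amount)

print(analytics.top_product())
```

The point of the sketch is the TCO argument: the per-transaction analytics work is tiny and incremental, so the insight is always current without a separate, expensive reporting pass.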
With social networking, for example, companies can't afford to buy a Teradata machine just to learn what is happening in the Twitter universe that affects their bottom line; the return on investment in social networking is often too weak for that. Enterprises need improved TCO, which takes us back to agility and helping companies get there. Companies need to respond rapidly to competitors' new products and fine-tune their own offerings with analytics technology. This is not a new idea, but it will grow more important this year.
Hybrid models are being used by companies including JasperSoft, SugarCRM, and Ingres (with our VectorWise database). In the hybrid model, part of the product is available under a fully open source license, while the proprietary "secret sauce" is available only for a fee. There are actually two trends here. First, the open source players continue to move to open core. Second, the proprietary players leverage open source.
Take Android, for instance. Google is a great example of a company taking advantage of open source while its core product remains proprietary: it leverages open source in its search business and Android in its mobile business, but its overall business model is not an open source one. We can say the same about SaaS players such as Taleo and SuccessFactors. Oracle is a relatively new entrant to the hybrid model as well. In 2011, well-known open source players will continue to move to the hybrid model, leaving few, if any, 100 percent pure-play open source companies in their wake.
The subscription model, where businesses can easily get the technology they need and pay for what they use, is a definite mainstay in the big data world. This year, the model will become more granular: businesses that used to buy a year-long subscription will now pay by the hour.
Self-service BI directly hits home for business users on the front lines who count on the information and insights in the reports they create for managers and board members, among others. Dependence on IT intervention still delays getting data to these users, and many enterprises must run a myriad of customized reports that create huge backlogs. Users are frustrated. In 2011, companies will have to accelerate report delivery. When we launched Ingres VectorWise in 2010, it was to deliver these results faster. Users want insights in moments, not the hours it takes with the traditional BI solutions currently available.
In 2009, TDWI published Next Generation Data Warehouse Platforms. One of the interesting points in the report is the number of organizations that want data insights to be constantly available: ninety percent of participants said they want real-time data insights (not just once or a few times a day) within the next three years.
When it comes to traditional database, data warehousing, and BI solutions, especially those sitting atop column-store technology, this just can't happen; most solutions on the market today will require a much more powerful in-memory analytics approach. Look for the companies in 2011 that are committed to giving businesses 24x7 actionable insights. Incumbent technologies are going to struggle to meet this demand.
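Storage layout is one of the levers an in-memory analytics engine pulls. A minimal sketch contrasting row- and column-oriented layouts, with a hypothetical three-row table; both compute the same aggregate, but the columnar layout touches only the one attribute it needs, which is what vectorized in-memory engines exploit:

```python
rows = [  # row store: one record per dict, all fields together
    {"region": "east", "units": 10, "price": 2.0},
    {"region": "west", "units": 5, "price": 3.0},
    {"region": "east", "units": 8, "price": 2.5},
]

columns = {  # column store: one contiguous list per attribute
    "region": ["east", "west", "east"],
    "units": [10, 5, 8],
    "price": [2.0, 3.0, 2.5],
}

# Row-store aggregation scans whole records to read one field.
row_total = sum(r["units"] for r in rows)

# Column-store aggregation scans a single contiguous column.
col_total = sum(columns["units"])

print(row_total, col_total)  # same answer, very different data touched
```

Both totals are identical; the difference is how much data each layout drags through memory to get there, which is why layout plus an in-memory approach matters for real-time analytics.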
(Source: TDWI – Ketan Karia)