How to Gear Up for Big Data
Big data can be big trouble if it isn’t handled with care. Here are some tips to make the most of that treasure trove of information.
You will need to think about big data.
Big data analysis got its start with large Web service providers such as Google, Yahoo and Twitter, all of which needed to make the most of their user-generated data. But enterprises, too, will use big data analysis to stay competitive and relevant.
Even a really small company can have a lot of data: a small hedge fund may have terabytes of it, says Jo Maitland, GigaOm research director for big data. Over the next couple of years, a wide range of industries—including healthcare, the public sector, retail and manufacturing—will benefit financially from analyzing more of their data, consulting firm McKinsey & Company anticipated in a recent report.
There is an air of inevitability with Hadoop and big data implementations, says Eric Baldeschwieler, chief technology officer of Hortonworks, a Yahoo spinoff company that offers a Hadoop distribution. It’s applicable to a huge variety of customers. Collecting and analyzing transactional data will give organizations more insight into their customers’ preferences. It can be used to better inform the creation of new products and services, and allow organizations to remedy emerging problems more quickly.
Useful data can come from anywhere (and everywhere).
You may not think you have petabytes of data worth analyzing, but you will, if you don’t already. Big data is collected data that used to be “dropped on the floor,” says Baldeschwieler.
Big data could be your server’s log files, for instance. A server keeps track of everyone who visits a site, and which pages they view while they are there. Tracking this data can offer insights into what your customers are looking for. While log data analysis is nothing new, it can now be done at dizzying new levels of granularity.
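As a minimal sketch of that kind of log analysis, the snippet below counts page views from Web server access logs. The log format and the sample lines are illustrative assumptions, not taken from any particular server.

```python
import re
from collections import Counter

# Sketch: count the most-requested pages in Apache-style access log lines.
# The log format shown here is an assumption for illustration.
LOG_PATTERN = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+"')

def top_pages(log_lines, n=5):
    """Return the n most-requested paths found in the given log lines."""
    counts = Counter()
    for line in log_lines:
        match = LOG_PATTERN.search(line)
        if match:
            counts[match.group("path")] += 1
    return counts.most_common(n)

sample = [
    '10.0.0.1 - - [10/Oct/2011:13:55:36] "GET /products HTTP/1.1" 200 2326',
    '10.0.0.2 - - [10/Oct/2011:13:55:40] "GET /products HTTP/1.1" 200 2326',
    '10.0.0.3 - - [10/Oct/2011:13:55:41] "GET /pricing HTTP/1.1" 200 1204',
]
print(top_pages(sample))  # most-visited paths first
```

At big data scale the same counting job would be sharded across many machines (a classic Hadoop word-count-style workload), but the analysis itself is this simple.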
Another source of data will be sensor data. For years now, analysts have been speaking of the Internet of Things, in which cheap sensors are connected to the Internet, offering continual streams of data about their usage. They could come from cars, or bridges, or soda machines. “The real value around the devices is their ability to capture the data, analyze that information and drive business efficiencies,” says Microsoft Windows Embedded General Manager Kevin Dallas.
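To make the sensor-data idea concrete, here is a small sketch of the kind of lightweight aggregation an Internet of Things pipeline might run over a continual stream of readings before forwarding them for deeper analysis. The window size and temperature values are made-up assumptions.

```python
from collections import deque

# Sketch: a rolling average over a continual stream of sensor readings,
# smoothing noise while still surfacing sudden changes.
class RollingAverage:
    def __init__(self, window):
        # deque with maxlen keeps only the most recent `window` readings
        self.window = deque(maxlen=window)

    def add(self, reading):
        """Record a reading and return the average of the current window."""
        self.window.append(reading)
        return sum(self.window) / len(self.window)

temps = RollingAverage(window=3)
for reading in [20.0, 22.0, 21.0, 35.0]:  # the 35.0 spike might flag a problem
    avg = temps.add(reading)
print(avg)  # average of the last three readings
```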
You will need new expertise for big data.
When setting up a big data analysis system, your biggest hurdle will be finding people who know how to work the tools that analyze the data, according to Forrester Research analyst James Kobielus.
Big data relies on solid data modeling. Organizations will have to focus on data science, says Kobielus. They will have to hire statistical modelers, text mining professionals and specialists in sentiment analysis. These are not necessarily skills that today’s analysts, versed in business intelligence tools, already have.
Such people may be in short supply. By 2018, the United States alone could face a shortage of 140,000 to 190,000 people with deep analytical skills, as well as 1.5 million managers and analysts with the know-how to use big data analysis to make effective decisions, McKinsey & Company estimates.
Another skill you will need on hand is the ability to wrangle the large amounts of hardware required to store and parse the data. Managing 100 servers is a fundamentally different problem from managing 10, Maitland points out. You may need to hire a few supercomputer administrators away from the local university or research lab.
Big data doesn’t require organization beforehand.
CIOs who are used to rigorously planning out every sort of data that would go into an Enterprise Data Warehouse (EDW) can breathe a little easier with big data setups. Here, the rule is, collect the data first, and then worry about how you will use it later.
With a data warehouse, you have to lay out the data schema before you can start laying in the data itself. “This basically means you have to know what you are looking for beforehand,” says Jack Norris, vice president of marketing for MapR. As a result, “you are flattening the data and losing some of the granularity,” he says. “Later on, if you change your mind, or want to do a historical analysis, you’ve limited yourself.”
“You can use a [big data repository] as a dumping ground, and run the analysis on top of it, and discover the relationships later,” says Norris. Many organizations may not know what they are looking for until after they’ve culled the data, so this kind of freedom “is kind of a big deal,” he says.
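The contrast Norris describes is often called schema-on-read versus schema-on-write. The sketch below illustrates the schema-on-read side: raw events are appended as-is, and structure is imposed only at query time. The store, field names and events are all illustrative assumptions, standing in for a real big data repository.

```python
import json

# Sketch of "collect first, analyze later" (schema-on-read): events are
# stored raw, with no schema enforced up front, so no granularity is lost.
raw_store = []

def ingest(event):
    """Append the raw event without flattening it or enforcing a schema."""
    raw_store.append(json.dumps(event))

def query(field):
    """Impose structure at read time: pull one field from whichever
    records happen to contain it."""
    return [json.loads(r)[field] for r in raw_store
            if field in json.loads(r)]

ingest({"user": "alice", "amount": 42.0, "items": ["a", "b"]})
ingest({"user": "bob", "clicked": "/pricing"})  # a different shape is fine

print(query("user"))     # ['alice', 'bob']
print(query("clicked"))  # ['/pricing']
```

A data warehouse would have rejected the second event, or forced both into one flattened table up front; here the decision about which relationships matter is deferred until analysis time.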
Big data is not only about Hadoop.
When people talk about big data, most of the time they are referring to the Hadoop data analysis platform. “Hadoop is a hot button initiative, with budgets and people being assigned to it,” in many organizations, Kobielus points out. Ultimately, however, you may go with other software.
Recently, legal research giant LexisNexis, no slouch at big data analysis itself, open-sourced its own analysis platform, HPCC Systems. MarkLogic has also outfitted its database for unstructured data, MarkLogic Server, for big data-style jobs. Another tool gaining favor in the US is the Splunk search engine, which can be used to search and analyze machine-generated data, such as the log files from a server. “Whatever data you can extract from your logs, there is a good chance that Splunk can help,” notes Curt Monash of Monash Research.