The key to any effective marketing campaign is to provide customers with the right information at the right time. At each stage in the buying cycle, marketers must determine what information customers need and what type of content and communication would be most effective in delivering that information to them.
With video, marketers now have a powerful tool that integrates text, sound, and still and moving images, and that customers can consume at the time, in the place, and on the device of their choosing. The effect is to expand marketing time – marketing now works on a 24×7 cycle, not a 9-to-5 cycle.
Figuring out how to adapt your content marketing tactics and campaigns to take advantage of video, mobile, and devices like the iPad is now a priority for marketers seeking to expand their marketing time.
Continuing from part 1 of this blog series, we will focus on answering the question “Who are some of the folks working to make Hadoop more accessible to BI users?”
There are multiple Apache Hadoop projects, in various stages of maturity, aimed at making access to data easier and better. They include MapReduce, Hive, Pig, and an alphabet soup of others.
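To make the accessibility point concrete, here is a minimal sketch of what querying data in HDFS can look like through Hive's SQL-like interface, submitted from Java over JDBC. The host, port, table, and columns are hypothetical, and the details vary by Hive version:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Hypothetical sketch: a top-ten-pages report over web logs stored in HDFS.
// Hive compiles the SQL into MapReduce jobs behind the scenes, so no
// hand-written mapper or reducer code is needed.
public class HiveQueryExample {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        Connection conn = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "", "");
        Statement stmt = conn.createStatement();
        ResultSet rs = stmt.executeQuery(
                "SELECT page, COUNT(*) AS hits FROM web_logs " +
                "GROUP BY page ORDER BY hits DESC LIMIT 10");
        while (rs.next()) {
            System.out.println(rs.getString("page") + "\t" + rs.getLong("hits"));
        }
        conn.close();
    }
}
```

That, in a nutshell, is the pitch: a few lines of familiar SQL instead of a hand-coded Java MapReduce program, which is exactly the kind of on-ramp BI users need.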
At the same time, Hadoop distribution vendors are each working on their own enhancements. Cloudera’s Impala is designed to enable real-time, high-performance queries against data stored in HDFS or HBase using SQL syntax. Hortonworks’ Stinger project is designed to dramatically speed up Hive performance and provide more functionality. EMC has integrated its Greenplum MPP database with its Hadoop distribution (called Pivotal HD) to provide high-performance SQL queries with Hadoop. I’m sure the other Hadoop distro vendors are doing similar things as well.
All of the more established BI vendors, including IBM, Microsoft, MicroStrategy, Oracle, and SAP, are deepening their Hadoop integration solutions, while a new generation of market entrants such as Actian, Alteryx, Datameer, Karmasphere, Platfora and others is introducing a range of diverse and innovative Hadoop-based analytics tools.
As in any early-stage IT market segment, there’s currently a great deal of complex technical jargon, nuanced functionality, and rapid-fire advancement that customers will need to keep up with when selecting the right solutions for their needs. BI and analytics tool vendors introducing new or enhanced solutions for Hadoop need to market to their audience using language that is simple to understand, technically credible, and clearly differentiated.
Ask any business intelligence or analytics vendor to give a few customer examples of how their product is being used with data stored in Hadoop, and you’re likely to get some blank stares. That’s not a slam on the BI vendors. It’s partially an indicator of the state of maturity in the Hadoop market, and partially a reflection of just how hard BI is, regardless of the data source.
Organizations that have deployed and standardized on specific BI tools can’t just point those tools at a schema-less Hadoop cluster and start analyzing. Data first needs to be given structure – aggregated, indexed, or described by a schema that SQL-oriented tools can query against. And it took years for SQL-based relational and multidimensional BI technologies to reach a state of maturity where they could deliver enterprise-class scalability, functionality and performance. It’s going to take some time before a new generation of analytic tools will meet the full range of requirements for Hadoop-based analytics.
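To illustrate one common bridging step: a Hive external table can be registered over raw files already sitting in HDFS, giving SQL-oriented tools a schema to point at without moving the data. The sketch below is hypothetical – the path, delimiter, and columns are assumptions, not a recipe:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Hypothetical sketch: overlaying a schema on schema-less HDFS files.
// The files stay where they are; Hive merely records how to read them.
public class ExternalTableExample {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        Connection conn = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "", "");
        Statement stmt = conn.createStatement();
        stmt.execute(
                "CREATE EXTERNAL TABLE IF NOT EXISTS web_logs (" +
                "  ts STRING, user_id STRING, page STRING) " +
                "ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t' " +
                "LOCATION '/data/raw/web_logs'");
        conn.close();
    }
}
```

Steps like this get a BI tool connected, but as noted above, they don’t by themselves deliver warehouse-class performance.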
Forrester analyst Boris Evelson wrote a great blog last September illustrating the issue. He noted that BI vendors are quickly responding to customer interest in Hadoop by announcing Hadoop integration. But he goes on to caution that when it comes to BI tools, not all approaches to Hadoop integration are created equal. In fact, he offers twelve questions customers should ask their BI vendors in order to get clarity on exactly what the vendor’s product can or cannot do.
The good news is that everybody is working on the problem. More to come! Also, be sure to check out the first blog in this series, Differentiation Across The Apache Hadoop Distribution Vendor Landscape, and the second, Will Hadoop Replace the Data Warehouse?
Bootstrap’s new offices are just 3 or 4 weeks away from completion! Thanks to a mix of the crazy Peninsula commercial real estate market and our growth, we’ve had 3 different offices over the last 3 years, but we’re now headed to a permanent home by Cordilleras Creek in San Carlos.
Holly has been getting creative with colors, artwork and furniture, and the paint should be hitting walls next week. The Cat5 wiring is installed and we should be moving in over the weekend of May 17-20. Once we have a definite date, we will let all our customers know, as we will need to close the office while Comcast gets us back online.
This is a very exciting time for us. As well as moving into the new office, we’ll also be moving to Microsoft Office 365 (with the help of our friends at Coyote Creek) and putting new infrastructure and processes in place to ensure higher quality project delivery and visibility. Stay tuned!
Many people still believe that video is an expensive option and, if you’re not careful, it can be expensive. However, the reality is that the software tools that are now available and the rapidly decreasing costs of video hardware (particularly storage) have driven production costs down to where video should be an affordable option for companies of any size.
Of course, while technology drives costs down, labor rates and the demand for video are driving costs up. This makes it all the more essential that you engage the services of someone like Bootstrap Marketing to be your video “general contractor.” Video has a lot of moving parts and, believe us, there are very few marketing departments that can – or should – become experts in video. In fact, we often get business from companies that have an in-house video department because their internal costs are prohibitively expensive and they can’t deliver on time (sounds just like IT, huh?).
So, there’s no reason to be put off by the cost of video, but every reason to talk to a company like Bootstrap that can produce a top-quality product at a reasonable price and insulate you from dealing with videographers, actors, presenters, artists, narrators, project managers, cameras, audio recorders, lighting, props, etc., not to mention concepts, scripts, storyboards and graphics.
Bootstrap Marketing, your video general contractor!
In the early days of data warehousing, there was a raging debate between two architectural approaches. There was a camp that advocated Ralph Kimball’s federated data mart architecture, and a camp that advocated Bill Inmon’s enterprise data warehouse architecture.
The old “Kimballite” vs. “Inmonite” discussions of the 1990s are reminiscent of a similar discussion going on today about the relative merits and promise of Hadoop versus conventional data warehouses built on relational databases. And I suspect the issue will get resolved in a similar fashion. People will get tired of discussing it, and both architectures will co-exist in perfect harmony. Each will find its appropriate place in the corporate IT landscape.
There are compelling arguments on each side of the question. Hadoop’s free open source distributions run on low-cost commodity hardware and provide virtually unlimited storage of structured and unstructured data. However, few organizations have stable, production-ready Hadoop deployments. And the tools and technologies currently available for accessing and analyzing Hadoop data are in early stages of maturity. There are issues associated with query performance, the ability to perform real-time analytics, and the preference of business analysts and developers to leverage existing SQL skills.
In spite of these to-be-expected early-stage challenges, I am coming across some real-world use cases for Hadoop-based analytics. At a recent Silicon Valley Forum on Big Data, Pandora’s director of software engineering explained how they have migrated their relational data warehouse to an analytic infrastructure built on Hadoop, using Tableau as the front end to Hive for visualization and analysis.
Data warehouses represent the established technology, and they aren’t likely to go away. Nearly all medium to large scale enterprises have data warehouses and marts in place that took years to build, and they are delivering unquestioned business value. The old axiom “if it ain’t broke, don’t fix it” is hard to argue with. However, data warehouses are not designed to accommodate the increasing volumes of unstructured data from web logs, social media, mobile devices, sensors, medical equipment, industrial machines, and other sources. And there are both economic and performance limitations on the amount of data that can be stored and accessed.
The current industry debate about the relative merits of Hadoop and data warehouses is as lively as the data warehouse architecture debates of the 1990s, but perhaps a bit less controversial and passionate. Co-existence seems to be the prevailing sentiment among most practitioners, as well as the vendors of both Hadoop distributions and traditional data warehousing technologies. Cloudera, Hortonworks, MapR, and more recent Hadoop distro vendors ranging from Intel to WanDisco are promoting side-by-side use case scenarios, while IBM, Oracle, and Teradata are incorporating Hadoop into their core offerings.
So what’s it going to take to ignite more controversy and passion in the debate? In my view, new innovations that make Hadoop data more accessible, more usable, and more relevant to business users will erode the distinctions between Hadoop and the traditional data warehouse. As the lines blur, the debate will intensify. Those innovations are coming to market at a fast and furious pace, forcing organizations to make architectural decisions that will fundamentally determine how effectively they can exploit Big Data. More on that in the third and final installment of this blog series. And make sure to read the first blog of this series, “Differentiation Across the Apache Hadoop Distribution Vendor Landscape.”
I got the quirky barista at Peet’s this morning! There I was, getting bombarded with emails from ISPs and our web developers about the latest round of brute-force cyber attacks targeting WordPress and Joomla websites, when my barista handed me this prime example of coffee froth art.
It totally cracked me up but luckily I didn’t spill my latte. It’s great when you’re on the receiving end of a random act of kindness so I thought I’d share this. If you like it, please share with someone you think could use a smile!
As many of you know, Bootstrap is very active in the community, particularly in the area of education. As part of our commitment, I’ve been participating in a mentoring program at Sequoia High School, a school that serves a very diverse economic and ethnic population. On Friday, April 5th, I had the privilege of being part of a program hosted by Cisco called Conexion, during which I got to interview eight seventeen-year-old kids.
These kids spanned the social, economic and confidence spectrum. One kid wanted to open a comic book store, one wanted to be an obstetrician, one wanted to be the next Steve Jobs, another wanted/needed to join the Marines to ensure his access to a college education, and so it went – each kid had a fascinating and unique story. Interestingly, all but one of them had worked at least two jobs, and one of them had borrowed $400 to start a company that assembles and sells shoe cleaning kits. Pretty impressive…and it looks like we’ll hire one of them to be Bootstrap’s first intern.
So, while the media grinds us down with stories about MTV reality stars and tries to present them as representative of today’s youth, let me make a suggestion – try volunteering at the local high school. It will restore your faith in human nature and our future.
We recently managed a very successful user event for one of our customers at a top San Francisco hotel. Everything went well – attendance, speakers, catering and timekeeping. Everything except the event video.
We were forced to use the “preferred” (i.e. required) AV company of the hotel – a very common practice at many venues. The AV crew turned up with antiquated equipment – for example, standard-definition, tape-based video cameras. But it wasn’t just an equipment problem. The AV crew was happy just to turn up, do a “meets minimum” job and head home. As a result, the audio was poor, the lighting was bad, and the video lacked any dynamics because the cameras were set up on a “point and forget” basis – that is, left unattended for some or all of the presentation, so if a speaker moved out of the frame, there was no camera operator to follow them.
Here are a few best practices we’ve learned over years of shooting video at live events:
- Check on equipment – you don’t need to be a video geek to make sure that the AV crew has HD equipment
- Think about lighting – a darkened room may be good for a PowerPoint presentation but it’s not good for video
- Use two cameras if possible – there’s nothing worse than watching a speaker from a single camera angle for 45 minutes
- Make sure the speaker knows where to stand and how to stay on mic – the one thing worse than a video shot from a single camera is a speaker who disappears off camera or off mic
- If a session features Q&A, make sure that you have a roving mic to capture the question
- Record video and audio separately
- Keep accurate records of each presentation and speaker, and make sure you’ve got their slide decks for transitions and captions – or, worst case, to match the audio track to the slides
Of course, like any other marketing tool, the key to great video is great content but following these best practices will help ensure that your video is of professional quality and can showcase your content in the best possible way.
Hadoop Spring Roundup - With the Strata Conference, the Gartner BI Summit, and the Hadoop Summit all occurring in the month of March, there is a mountain of new Hadoop-related information and a flurry of new product announcements to absorb. To share our observations and perspectives on the “Spring 2013 Hadoopalooza,” here is the first of a three-part Bootstrap blog series.
The flurry of announcements from Apache Hadoop distribution vendors during the Strata conference last month might lead one to believe that the market is getting a bit overcrowded. A closer inspection, however, suggests that there may indeed be a number of discrete sub-segments to be served by multiple vendors within the Hadoop market. My conversations with product managers and solution consultants working in a sampling of vendor booths at Strata revealed some very clear distinctions in terms of how each company describes its unique value proposition.
I’ve summarized my interpretation of each vendor’s top-line message in italics below. My two-to-four-word summaries may not accurately reflect the vendors’ intended messages, but they do reflect what I walked away with.
- Cloudera – Big Data Platform. While Apache Hadoop is clearly the centerpiece of Cloudera Enterprise, the company positions itself as a broader Big Data solution provider with adjacent products such as Cloudera Manager, Cloudera Navigator, and Cloudera Impala.
- Hortonworks – 100% Apache Hadoop Distribution. By emphasizing its commitment to community-driven open source, Hortonworks is tapping into a common concern many enterprise software buyers have about vendor lock-in.
- MapR – Enterprise-ready. The majority of Hadoop deployments today are relatively small-scale test-and-development or departmental implementations. MapR is aiming its message at customers putting Hadoop to work in mission-critical production applications, where scalability, availability, reliability, failover and security are essential.
- Intel – Optimized for the processor. A major appeal of Hadoop is that it utilizes low cost commodity hardware for storage and processing. Since the vast majority of those commodity servers happen to be powered by Intel, the company’s message promises the best of both worlds – low cost and great performance.
- EMC Greenplum – Hadoop with SQL. By integrating the Greenplum MPP Advanced SQL platform with Hadoop’s HDFS in its recently announced Pivotal HD product offering, EMC is centering its message on accelerated adoption and higher performance. Customers can leverage mainstream SQL skills and bypass the performance limitations of Apache Hive.
Only time will tell how these differentiated messages resonate with software buyers. The Apache Hadoop distribution market is still in its infancy and still evolving. We’re watching it closely and will continue to share our observations and conclusions as they unfold.