Director, TDWI Research, Business Intelligence

Posted in BI and Analytics, Information Management on March 12th, 2011 by DStodder

That’s my new title. On March 1, I began working with TDWI (The Data Warehousing Institute), an organization well known to business intelligence (BI) and data warehousing professionals for education, training, certification, news and research. Its World Conferences, Executive Summits, Solution Summits and other events have long been critical educational gatherings for professionals engaged in BI, analytics, data warehousing, data integration and related disciplines. The next event on the TDWI calendar is in Washington, D.C., and I will be there.

In my new position, I will be heading up research reports; writing blogs, articles and columns; contributing to the conference programs; judging the annual Best Practices Awards; consulting with TDWI members; and more. I will have the pleasure of working closely with Philip Russom, TDWI’s Senior Manager of Research and Services, who will focus more on the data management side. I’ve known Philip since my Intelligent Enterprise days, when he wrote an excellent column for us. He is an important thought leader in this industry and has a real skill for educating professionals. And, he’s a pianist and singer with a fondness for The Great American Songbook. I also look forward to working with all the great people associated with TDWI as speakers, educators and vendors, many of whom I have known and respected throughout my career.

While my focus will be on my work at TDWI, this doesn’t spell the end of this blog. I will definitely be busy, but I hope to keep this spot active with commentary on areas of interest that may not fit into my work at TDWI. I will also have a new blog on the TDWI Web site; its first installment will appear next week.

Meanwhile, Opening Day, Giants vs. Dodgers, is just around the corner!

From Data Virtualization to Data Services

Posted in BI and Analytics, Data Governance & Policy, Information Management, Virtualization on January 19th, 2011 by DStodder

With margins for transactional operations getting thinner, organizations in many industries are focused on leveraging more value from their data. Nowhere is this truer than in the financial services industry, where the onrushing spread of algorithmic trading is changing…well, everything, including the role of information in guiding trading and investment decisions. This rather unnerving article in Wired by Felix Salmon and Jon Stokes captures what’s happening quite well. The speed with which organizations need to turn data into both better business insight and marketable data services has many looking closely at data virtualization.

As an analyst, I am fortunate to be a (remote) participant in the Boulder BI Brain Trust (BBBT), the Boulder, Colorado-based brainchild of Claudia Imhoff. A couple of times a month, this illustrious group of experts gathers for half-day briefings with vendors in the business intelligence, analytics and data warehousing space; the briefings are always highly informative (on Twitter, watch for the hashtag #BBBT). On Friday, January 14, the topic was data virtualization: the BBBT met with Composite Software, a leading vendor of tools and solutions for data virtualization. Composite brought along a customer – and no small customer, either – NYSE Euronext. Emile Werr, NYSE Euronext’s VP of Global Data Services and head of Enterprise Data Architecture, gave us a briefing on how the company is developing a data virtualization layer using Composite’s products.

Wikipedia has a good definition of data virtualization: “to integrate data from multiple, disparate sources – anywhere across the extended enterprise – in a unified, logically virtualized manner for consumption by nearly any front-end business solution, including portals, reports, applications, search and more.” As the Wiki entry notes, data virtualization (or “data federation”) is an alternative to data consolidation or replication into data warehouses and data marts. When this concept was first introduced, it was the cause of fiery debates at TDWI events and elsewhere. Now, it has settled in as a complement to the other more entrenched approaches.
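For readers who want to see the shape of the idea, here is a minimal Python sketch of query-time federation. The source names and schemas are invented for illustration; this is a toy, not Composite’s technology, which adds optimization, caching and security on top.

```python
import sqlite3

# Two stand-in sources: a CRM database and a trades database.
# In a real deployment these would be entirely separate systems.
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (cust_id INTEGER, name TEXT)")
crm.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Acme Fund"), (2, "Bolt Capital")])

trades = sqlite3.connect(":memory:")
trades.execute("CREATE TABLE trades (cust_id INTEGER, symbol TEXT, qty INTEGER)")
trades.executemany("INSERT INTO trades VALUES (?, ?, ?)",
                   [(1, "IBM", 500), (2, "ORCL", 200), (1, "HPQ", 300)])

def virtual_trades_by_customer():
    """A 'virtual view': federate both sources at query time.

    Nothing is consolidated or replicated into a warehouse; each
    request reaches back to the live sources and the join happens
    in the virtual layer.
    """
    names = dict(crm.execute("SELECT cust_id, name FROM customers"))
    for cust_id, symbol, qty in trades.execute(
            "SELECT cust_id, symbol, qty FROM trades"):
        yield {"customer": names.get(cust_id, "unknown"),
               "symbol": symbol, "qty": qty}

for row in virtual_trades_by_customer():
    print(row)
```

The point is simply the shape: the consumer sees one logical view, while the data stays where it lives.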

Virtualization helps when organizations can’t really wait for standard data integration and loading into a data warehouse. That is very much the challenge facing NYSE Euronext, which is using Composite tools to develop a virtual layer to improve data access for internal executives and to establish a platform for creating data services. “We have so many companies trying to connect into us, and we want to serve standardized information out to companies around the world,” Werr said. NYSE Euronext is moving away from its old method of dumping transaction data into a warehouse; it wants to put more intelligence into the virtual layer. And to help build this layer, it is hiring people with business skills who understand processes and how to derive business value from data. “[These professionals] are the most productive people on my team right now,” he said.

The BBBT session featured an interesting debate about how data governance fits with data virtualization. Can data quality and governance rules be managed and invoked from the virtual layer? Should they be managed at the source, or as part of extract, transform and load (ETL) processes, as many organizations do now? The discussion began to turn toward master data management and the option of creating a central hub or registry to implement governance for access to multiple sources. Highly regulated industries such as financial services and healthcare should consider this approach because of the need to invoke regulatory provisions for data access and sharing. Werr discussed these requirements and how his organization hopes to use the Composite virtual layer to support metadata governance and access from multiple BI tools.
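As a thought experiment on that debate, here is a hedged sketch of what invoking a governance rule from the virtual layer might look like, with hypothetical fields and roles rather than Composite’s actual mechanism. The rule runs at request time, between the sources and the consumer:

```python
# Hypothetical governance rule enforced in the virtual layer:
# mask sensitive identifiers unless the requesting role is cleared.
MASKABLE_FIELDS = {"ssn", "account_no"}

def governed_view(rows, role):
    """Apply a masking policy at query time, in the virtual layer.

    The alternative designs debated at the BBBT would enforce the
    same rule at the source or inside ETL jobs instead.
    """
    for row in rows:
        if role != "compliance":
            row = {k: ("***" if k in MASKABLE_FIELDS else v)
                   for k, v in row.items()}
        yield row

records = [{"name": "A. Trader", "ssn": "123-45-6789", "balance": 1000}]
print(list(governed_view(records, role="analyst")))
print(list(governed_view(records, role="compliance")))
```

Managing the same rule at the source or in ETL jobs, as many organizations do now, simply moves this logic upstream; the debate is about which layer should own it.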

Putting intelligence into a virtual layer fits with the IT infrastructure trend toward virtualization and cloud computing, and may become even more important because of this trend. Service-oriented applications running on cloud and virtual platforms frequently require access to multiple, disparate data sources. From a business standpoint, data virtualization is going to be critical to moving data quickly from the back office outward, to where it can be packaged into “information-as-a-service” offerings that customers will buy – and that will improve the seller’s profit margins.

2011: A Year for Balancing Priorities

Posted in BI and Analytics, Information Management, IT Industry, Virtualization on January 11th, 2011 by DStodder

Auld Lang Syne has been sung, but since it is still January, there remains time for New Year’s resolutions. Losing weight, sleeping more, eating vegetables and working less on weekends are perennial favorites; to those I now add “blogging more often.” For some, this resolution would be a cinch. Not for me. In 2010, with one project due after another, I struggled to carve out time to blog. I hope to correct that in 2011, even as I anticipate a new year full of interesting and demanding projects.

Competing with this one blog, of course, are all the social media options that have many of us splayed across the Web, including Facebook and Twitter. These need care and feeding, too. In my business, despite its high inanity quotient, Twitter has become essential for communicating news briefs and quick-take analyses. Facebook is many things to many people, but for me it is just a “fun” forum for sharing observations and artifacts along life’s journey. Maybe someday it will be more. Can’t forget LinkedIn. Finally, the social network experience has to include commenting on other people’s blogs, at major news source sites, on Yelp, on Amazon and so on. Have to give the text analytics engines something to chew on!

Most industry pundits have already published their predictions and prognostications for 2011. Rather than add to the pile, I would like to offer a few quick information management “resolutions”: priorities that I believe will shape what happens in 2011.

Integrate the integration. In 2010, the business value of information integration hit home to many organizations. Improved integration can lower the cost of information management and help eliminate downstream business problems caused by poor data quality and inconsistency. Yet across enterprise departments and business functions, there are usually numerous separate data, application and process integration steps. With vendors such as IBM, Informatica, Kalido, Oracle and Talend beginning to provide better tools for developing and governing the use of data through master data management and “semantic” metadata layers, organizations have the opportunity to work toward comprehensive, end-to-end visions of information integration.

Don’t be blinded by automated analytics. The good news coming out of 2010 is that advanced analytics involving large (OK, “big”) data sources are becoming mainstream. More organizations than ever before will be able to afford analytics, especially as they deploy data appliances and use services, templates and other tools to shortcut the development of models, variable selection and other steps that are difficult and time-consuming. However, organizations need to “keep it real”: this is important stuff, involving critical decisions about customers, patients, pricing, demand chains, fraud prevention and other factors that are differentiators. Despite the hype, automated analytics are not entirely ready to replace wetware, gut feel or moments of irrational inspiration.

Respect “keeping the lights on.” It’s fashionable these days to dismiss non-strategic IT tasks as merely “keeping the lights on.” Yet I found in 2010 that some of the most complicated and important decisions organizations are making have to do with IT infrastructure. Virtualization and cloud computing are completely remaking the map, which means that IT needs next-generation tools and analysis for network optimization, application performance management, dependency mapping and more in order to make the right infrastructure decisions.

Encourage vendor “coopetition.” The sensational story of Mark Hurd’s departure from HP and resurfacing at Oracle dominated headlines in 2010. The story is not over; InformationWeek’s Bob Evans offers an insightful blog about the continuing friction between HP and Oracle. In the wake of Oracle’s Sun acquisition, the two companies are in the midst of a tectonic shift away from what had been a longstanding partnership. Organizations should remind vendors such as HP and Oracle that, despite competitive antagonism, customers expect them to work together effectively on their customers’ behalf. Customers have that kind of clout. Fasten your seatbelts, though, because I’m sure we’re in for more M&A activity, possibly involving HP and Oracle, which will further reshape the competitive landscape.

Be smart about being “dynamic.” A major watchword this year is “dynamic.” Cloud computing, virtualization, appliances, workload optimization, workforce optimization and other technologies are helping organizations flex to meet changing business requirements without the usual steps of adding people, software and hardware resources that sit idle when not needed. To be more just-in-time requires knowledge; otherwise, organizations could be caught short. In 2011, before going fully dynamic, organizations need to evaluate whether they have adequate knowledge about user and business requirements. If not, it may be time to evaluate tools and practices for understanding workloads, process requirements, dependencies and more. Old ways – and divisions between business and IT – have to change.

That’s all for now. Happy New Year to all!

This Time, It’s Not Just Personal

Posted in BI and Analytics, Information Management on October 30th, 2010 by DStodder

Down in the land of razzle-dazzle, IBM drew on the spirit of Las Vegas by “unleashing” Cognos 10 onstage on October 25, with fireworks, explosions and a lot of light fantastic. The Cognos 10 release was the headline of IBM’s Information on Demand (IOD) 2010 Global Conference. Rob Ashe, CEO of Cognos before it was acquired by IBM and now general manager of Business Analytics for the IBM Software Group, said the release, six years in the making, was about “eliminating the boundaries that keep you from gaining and sharing insights.”

Cindy Howson of BIScorecard nicely summarizes the high points in her Intelligent Enterprise blog. She writes: “In theory, though, it is the ability for users to collaborate around a task, with supporting BI dashboards, wikis, blogs, instant messages and emails, where I see the biggest potential.” Cindy is not alone; many analysts and writers are excited about the possibilities surrounding the integration of business intelligence tools and collaboration platforms. Cognos 10 is a big step for IBM in this direction; it builds on Cognos’ track record of innovation in performance management and dashboards to push the collaborative user experience beyond basic capabilities.

Collaborative BI has attracted other big BI vendors and has energized innovative new players. Lyza, for example, is an impressive product in this vein; it offers a breakthrough that radically changes how people use business intelligence – or, looking at it the other way, how they flow through information in the course of collaboration. Cognos 10 may not induce the same intense reaction as Lyza, but it is definitely a meaningful development, both for the market and for IBM internally.

The collaboration thrust joins Cognos with Lotus through the inclusion of a free license of Lotus Connections. IBM bought Lotus in 1995 for about $3.5 billion, which made it at the time the biggest acquisition in the history of the software industry and signified IBM’s strategic move into applications. Twelve years later, IBM picked up Cognos for nearly $5 billion. With Cognos 10, IBM could be on the road to higher return on investment from these enormous acquisitions as organizations try to use collaboration and social networks to improve employee productivity.

Threading collaborative interaction with business intelligence – or “business insight,” the term Cognos seems to prefer – holds tremendous potential for employee productivity. At IOD, I met with Jeff Shick, vice president of Social Software in the IBM Software Group. We discussed many of the promising ideas from the realm of knowledge management: capturing tacit knowledge, finding repeatable patterns in how people use and share information artifacts, and the potential of recommendation engines for finding the most relevant information and sources, including people. Knowledge management, once a kind of blue-sky topic, never seemed so grounded in the fact-based decisions and actions that people must perform.

Organizations will discover more about their knowledge assets by integrating BI and collaborative tools and practices. The combo could free users from their traditional BI silos that serve single users or communities and allow information insights to flow across organizational divisions and out into the external world.

Of course, combining collaboration and BI is not without challenges. For one thing, many organizations are very careful about who gets BI and how much they get to use it. The combination will change user demand for BI and challenge organizations to rethink security, user requirements, data availability and more. But this kind of catalyst for change is good; organizations should never stop innovating with BI.

The First Night of October Baseball

Posted in Baseball, Life on October 1st, 2010 by admin

It’s Friday, October 1st, and it’s tough to concentrate on work today. Those who are not baseball fans, please indulge me a moment to blog about what’s going on with the San Francisco Giants.

Tonight could be the night. At 7:15 Pacific at Third and King, Matt Cain, the Giants pitcher who subdued the Colorado Rockies in Denver last Sunday, is going to lead his teammates onto the field, on his 26th birthday, to face the only team left in the way of a National League West title: the San Diego Padres. The Chicago Cubs did the Giants a favor yesterday, shutting out the Padres in San Diego to reduce the “magic number” to one. With a victory, the Giants clinch the division outright and head into October baseball. You can bet that beyond just normal fans, the ballclub’s entire ownership group and assembled San Francisco notables will be on hand.

“Baseball is a crazy game,” players always say, so nothing is done until it’s done. The Padres cast a spell on the Giants earlier this season, so the Friars can’t be taken for granted. However, it’s going to be an amazing night at AT&T Park. This is the 30th anniversary of the Willie Mac Award, which the players and coaches give to the player who “best exemplifies the spirit and leadership” of Willie McCovey, the Giants legendary first baseman. McCovey, though still recovering from major back surgery, will be there. The Giants have invited all of the previous winners of the award, including Jack Clark, the Ripper from the late 1970s and early 1980s; Bob Brenly, the popular catcher from the mid-1980s team; plus Jeff Kent, Robbie Thompson, Mike Krukow, Darrell Evans and the others.

The San Francisco Giants have never won a World Series and have only been there three times, in 1962, 1989 and 2002. The franchise has not won one since 1954, which is among the longest droughts of any Major League Baseball team outside of the Cubs. Around the stadium and out on McCovey Cove, where Barry Bonds landed so many splash hits (Pablo Sandoval got one yesterday), there are plaques and monuments everywhere. I love how the Giants management embraces the storied history of this team, increasingly to include its New York era. Monte Irvin had his number retired this season in a very emotional ceremony.

But that means there are a lot of ghosts out there, living and dead, 50-plus years’ worth, who are watching this particular combination of players to see if it is the one that can push the Giants over the top. I’m one of those who became a Giants fan during the great summer of 1978, while a student at UC Berkeley. After years in the doldrums, the Giants put together an exciting team at Candlestick Park that included Clark, Evans, McCovey, Bill Madlock, Johnnie LeMaster and a very good pitching staff led by Vida Blue. That team faded, as alas many have, but I got hooked on this passionate but ultimately frustrating team. There were the Roger Craig teams of the late 1980s and early 1990s, with Thompson, Matt Williams, Will Clark and Kevin Mitchell. And then there were the Dusty Baker-managed Barry Bonds teams, culminating in the bitter 2002 loss to the Angels. Could this current team end the frustration?

Yesterday, I played hooky and went out to the ballpark on a gorgeous, warm afternoon to see Madison Bumgarner, Buster Posey, Andres Torres and the rest take on the Diamondbacks. Even now, I can still see the bowed back of the 21-year-old Bumgarner, looking no more than his age, his eyes peering over his glove as Posey flashed the signs. I could feel the intensity of Bumgarner’s concentration even from where I was in the stands. Then he would whip his body in motion and deal – getting himself out of jam after jam and enabling his suddenly “Bye, Bye Baby!” home-run-happy teammates to take control.

Tonight, it’ll be Matt Cain’s turn. If he’s successful, San Francisco is going to go full throttle, spilling over with exultant baseball fans. It’ll be wild, noisy and crazy. This is a hungry town. But, as the great Yogi said, “It ain’t over ‘til it’s over,” and so I make no plans. We live in the moment tonight.

Exadata, Exalogic: The iOracle?

Posted in Cloud computing, Information Management, IT Industry on September 28th, 2010 by DStodder

“Steve Jobs is my best friend,” said Oracle CEO Larry Ellison, early in his Oracle OpenWorld keynote on September 23. “I love him dearly, and he’s someone I watch very closely, watch what he does over at Apple.” While it’s unlikely that we’ll see iPod or iPad-like billboards featuring silhouettes of IT managers rocking out to their Oracle Exadatas, Ellison is unabashed in expressing his desire to emulate what Jobs has done. “We believe that if you engineer the hardware and software pieces to work together, the overall user experience – the product – is better than if you do just part of the solution.”

Exadata and the new Exalogic Elastic Cloud, which Oracle announced at OpenWorld: Do they represent commoditization or innovation? The answer is both. For obvious reasons, vendors tend not to market their ability to provide commodity products. IT buyers are consumers, too, and “commodity” has that word “common” in it – not too inspiring. In the consumer market, however, Apple has been highly effective at turning mass-produced machines into objects that symbolize innovation. While it’s certainly possible to imagine devices engineered to produce sound superior to an iPod’s, Apple was the first to create the right package at the right time for the broader market.

Perhaps someday, just like an iPod, you’ll be able to buy an Exadata or Exalogic box from a dispensing machine at the airport. OK, perhaps not. But Oracle seems convinced that mainstream IT buyers are similarly ready for a “commodity” package. They are weary from the pressure of supporting complex and costly existing systems that leave them little room for discretionary spending. Configuration and change management are two of the most time-consuming tasks; it has also been difficult for IT developers to adapt software applications to parallel hardware platforms that make it easier to scale up and out. Plus, having boxes hanging off intelligent networks and living out in the cloud appears to be the approach taken by architects of many newer systems.

Successful but relatively exotic products exist, such as Netezza (just recently acquired by IBM) and Teradata. In fact, the database market is teeming with more exotic products than at any time in the past 20 years or so. But Oracle is betting that the mainstream market will only adopt the advanced and esoteric stuff if it is inside the box. “The car industry delivers complete cars,” Ellison observed in his keynote. “There are a lot of computer systems involved in running a modern car, but it’s all been engineered and tested to work together – whether it’s a Prius or my favorite commuting car, the Bugatti.”

Given the number of software suppliers that Oracle has acquired in recent years, it’s no surprise that Oracle would be obsessed with product consolidation and integration. However, it was interesting to see how this objective is affecting strategies throughout the company. Business intelligence, for example, “should not be separate,” Ellison said. “The core of the system should be intelligence; it should be everywhere.”

This sort of consolidation and integration can be innovative, in that it can give the majority of IT buyers confidence that they are enabling innovation in their businesses by buying packaged machines such as Exadata and Exalogic. But time will tell, and competitors will respond. The early adopter sessions I attended at OpenWorld made migration to Exadata sound straightforward, with considerable benefits accruing from reductions in cost and time-to-deployment compared to traditional systems. It won’t be on the bleeding edge that Oracle’s success unfolds; it will be in the mainstream.

Oracle Speaks, Hurd Listens

Posted in BI and Analytics, Enterprise Applications, Information Management, IT Industry on September 7th, 2010 by DStodder

Former Hewlett-Packard CEO Mark Hurd’s severance vacation and search through the help-wanted listings proved to be short. Oracle announced that Hurd will be joining the company’s board of directors and will assume the position of president. Current co-president Safra Catz is staying, but Charles Phillips has departed. Phillips deserves credit for improving Oracle’s customer relationships during his era; it’s easy to forget now how raw those relationships were when he came aboard. However, like the end of Hurd’s reign at HP, Phillips’ more extended finale came with embarrassing personal distractions.

HP is not accepting Hurd’s new position peaceably. The company has filed a lawsuit, charging that by joining Oracle, Hurd is violating a confidentiality agreement and “[putting] HP’s trade secrets and confidential information in peril.” Given that Oracle is increasingly more of a competitor than a partner to HP, it’s easy to see HP’s difficulty with this turn of events. I don’t know whether HP’s lawsuit will succeed in stopping Hurd’s appointment, but a quick read of experts quoted in the business press suggests that it will not.

The arrival of Mark Hurd at Oracle could be a huge development that will send shock waves across the industry. As evidenced by recent trading, investors seem bullish about Oracle’s prospects with Hurd’s arrival. Here are six points that I see as significant with regard to his hiring:

- Hurd will bring leadership to Oracle’s hardware, systems and data center push. This is new: Oracle will now have a top executive with experience in the hardware and systems business. Oracle’s steady core has always been database software, with services and other types of software subject to shifts in strategy and restructuring. Will Hurd, after looking at the business prospects for the Sun offerings, try to change Oracle’s balance to go strong after HP and IBM? Or will he conclude that Oracle needs to mend those relationships instead?

- Hurd is Oracle’s first strong #2 since Ray Lane. Oracle experienced significant growth during Lane’s years as president and COO, but his relationship with Oracle CEO Larry Ellison soured, and he was gone after eight years. Hurd is no shrinking violet, especially after having been the CEO of two different companies. What will happen when Hurd and Ellison disagree? And will they agree on the possible acquisitions that everyone expects will happen?

- Hurd will be tempted to “cut the fat” inside Oracle. No doubt, part of Wall Street’s positive reaction to Hurd’s appointment reflects the view that he could make Oracle even more profitable by taking an axe to internal costs, as he did at HP, and by holding managers more accountable. Oracle is run more efficiently than HP was when Hurd took over, but it is an established company with over 100,000 employees. Surely an outsider with the practiced eye of Hurd will find areas to trim. How will this affect Oracle’s internal psyche?

- Hurd could invigorate Oracle in the channel. Hurd has a reputation for valuing channel partner relationships – an area where Oracle has had trouble. Hurd could make improvement here a focus and help Oracle counter stronger competition from IBM and Microsoft.

- Hurd’s hiring will be a catalyst for change in the industry. If Hurd does make Oracle a more serious player in the server, storage and data center business, competitors will have to react. If HP’s stock price continues to fall, would a Cisco Systems, EMC or even Microsoft make a play for the company? Wild thoughts, but then the IT industry has not really had to deal with Oracle as the sort of broad-based player it could now become.

- Hurd could give more oomph to Oracle’s BI, analytics and data warehousing. With the 11g releases, Oracle is in the early stages of an important product cycle for business intelligence, analytics and data warehousing. Hurd was clearly a believer in the power of BI at HP; I would expect him to give Oracle’s offerings an even higher profile.

Larry Ellison, whether fired up by a friend’s misfortune or the opportunity to bruise a competitor, has succeeded in planting a couple of banderillas in the HP bull, angering and distracting it. But now, by bringing Hurd on board, he could be about to change Oracle itself.

Good Data Equals Good Health – and Lower Costs

Posted in BI and Analytics, Healthcare, Information Management on August 22nd, 2010 by DStodder

I arrived at my first meeting at TDWI in San Diego late, still hyperventilating from legging out a long hot walk from my hotel, where I had dumped my bags after a gnarly drive down to “America’s Finest City” from Los Angeles. So, perhaps appropriately, my first meeting was with a company in the healthcare industry: Blue Cross and Blue Shield of Kansas City, co-winner of TDWI’s Enterprise Data Warehouse Best Practices Award. The company won the award as a customer of HP Business Intelligence Solutions.

The healthcare industry is obviously undergoing tremendous change, with new government policies and economics challenging the business models of insurance and managed care providers such as “Blue KC.” The transition is away from controlling benefits – that is, denying benefits – and increasing rates as the primary tools for managing costs. The future is “wellness,” or helping members get or stay healthy. “We want to improve the health of our members so that they don’t become ‘patients,’” said Darren Taylor, vice president, Information Access Division with Blue KC. “If we can do that, then we can take care of costs from within the context of that objective.”

Wellness requires knowing more about members, which means that these companies need vastly improved data management and analysis. Connecting to disparate data systems and establishing a single enterprise data warehouse (EDW) are proving critical to accomplishing Blue KC’s objectives with its membership. Previously, Blue KC had outsourced diabetes and other disease management programs “to good companies,” Taylor said, “but we did not have enough insight into these proprietary systems.” The company could not integrate or analyze multiple sources of data about one member to understand how, for example, that member’s heart condition, asthma or other issues were related. Gaining this single view is essential. With the EDW in place, the company is able to bring these disparate data sources in-house.
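To illustrate the single-view idea, here is a tiny Python sketch (invented member IDs and measures, not Blue KC’s actual schema) of folding formerly separate disease-management feeds into one member-level record once the data is in-house:

```python
# Hypothetical extracts from formerly outsourced programs,
# keyed by a common member ID once loaded into the EDW.
cardiac = {"M1001": {"ejection_fraction": 45}}
asthma = {"M1001": {"inhaler_refills_90d": 4}}
claims = {"M1001": {"er_visits_12mo": 2}}

def member_360(member_id, *sources):
    """Fold every source's facts about one member into a single view,
    which is what a consolidated EDW makes possible and separate,
    proprietary systems did not."""
    view = {"member_id": member_id}
    for source in sources:
        view.update(source.get(member_id, {}))
    return view

# One record relating heart, asthma and utilization signals.
print(member_360("M1001", cardiac, asthma, claims))
```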

Taylor was VP of the IT group, but his group now reports to Blue KC’s CFO. “There’s more accountability. IT is usually about waiting for requirements. We’re now about anticipating needs, and bringing business and IT together to execute on our data warehouse strategy.”

Hurd at Sea

Posted in BI and Analytics, Information Management on August 20th, 2010 by admin

As we pulled out into San Diego harbor, the dusk was calm. The increasing dark settled slowly over the lingering sunset hues of pink and orange. A gentle breeze filled the sails of the big schooner as we began our cruise with members of Hewlett-Packard’s Business Intelligence Solutions group and customers, who were in town this week to attend The Data Warehousing Institute (TDWI) conference. The captain invited guests to help hoist the sails, which we did. Now it was time for drinks and hors d’oeuvres. Dinner would follow.

The only disconcerting – but at the same time, entertaining – part of the cruise was what appeared to be a U.S. Navy SEALs training session going on all around us. Helicopters swooped low and gunboats raced by; as darkness fell, more copters lit the water with searchlights, putting the bay under close surveillance. San Diego mostly basks in sunny, happy weather, but overhead and out on the water, you see constant reminders of San Diego’s key role in the serious business of hosting naval forces that defend the country and project American power.

As expected, HP personnel kept their lips sealed about the continuing saga of Mark Hurd, HP’s former CEO, the HP Board and actress/marketing consultant Jodie Fisher. With media such as The New York Times, The Wall Street Journal and the blogosphere filling in details daily, it was hard not to bring it up. Some HP folks smiled guardedly in agreement when it was suggested to them that while the mess is not pleasant, HP might be better off replacing Hurd’s fearsome regime of performance accountability and tight spending control with more creative, inventive and inspiring leadership.

I couldn’t help but reflect on the excitement Hurd generated when he joined HP in 2005 after heading up NCR and its Teradata division. When Hurd became CEO, he made it clear that he “got it” when it came to BI, analytics and data warehousing, both with regard to the company’s internal management and his strategy for HP’s software products and services. He hired Randall Mott away from Dell to become HP’s executive VP and CIO; before his time at Dell, Mott was CIO at Wal-Mart, where he led its successful and influential enterprise data warehouse foray. Hurd directed Mott to consolidate HP’s numerous data centers and data marts to reduce costs and improve BI and analytics.

Under Hurd’s leadership, HP increased investment in database technology, particularly Neoview. However, strategies kept shifting, internal cost control became the focus and Neoview did not have the market impact that HP once hoped it might. A big challenge was balancing HP’s database software development strategy with its ongoing partnerships with Oracle, Microsoft and other players entrenched in the BI and data warehousing market. So, rather than software, HP put its BI best foot forward with consulting services and solutions. In late 2006, HP bought the respected Knightsbridge Solutions, which has become the centerpiece of its BI and data warehouse consulting services. An HP BI Solutions customer, Blue Cross and Blue Shield of Kansas City, was selected as co-winner of TDWI’s 2010 Best Practices Award for Enterprise Data Warehousing.

I had the opportunity to talk to Darren Taylor, vice president, Information Access Division with “Blue KC.” I’ll write about this discussion in my next blog, which will offer some quick takes about my meetings at TDWI in San Diego. Closing out this one, it will be interesting to see if HP installs a new CEO with the same kind of vision Hurd seemed to have about the power of BI and data warehousing – but with more soul, and more consistency in terms of the strategic drive and investment needed to succeed in a tough marketplace.

Par for the Workload

Posted in Cloud computing, Information Management, Virtualization, Workload Optimization on June 22nd, 2010 by DStodder

When Graeme McDowell tapped home his putt to seal a championship at the U.S. Open on Sunday (June 20), spectators who packed the stands and stood shoulder-to-shoulder around the green roared their approval. Tiger Woods and Phil Mickelson, the Open’s superstars, were humbled by the Pebble Beach course and its famously changeable weather. The little-known McDowell “survived,” as several commentators put it. But that doesn’t really give him enough credit. He played a smart, safe game that adapted well to course conditions.

(Photo: Graeme McDowell. Credit: Lance Iversen, The Chronicle)

The same might be said about IBM’s technology operations, which in partnership with the U.S. Golf Association’s Digital Media team stood the test of a massive number of virtual fans visiting the online and mobile U.S. Open sites. IBM and the USGA said that over four million visitors came to the U.S. Open’s Web site, about 8 percent more than last year. This was the first big year for the mobile site, which had nearly two million visits. A major attraction was the “Playtracker” application, which enabled users to fly over the course and see, through heat maps based on scoring feeds, how the course was playing. You can imagine the potential for future data-driven visualizations based on historical data about courses, players, pin positions on the greens and much more.
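For flavor, here is a rough Python sketch of how a scoring feed can be rolled up into that kind of heat map. The feed, hole count and styling are all invented for illustration; this is not the real Playtracker application:

```python
import matplotlib.pyplot as plt
import numpy as np

# Invented scoring feed: (hole, strokes relative to par) per player.
feed = [(1, 0), (1, 1), (2, -1), (2, 0), (3, 2), (3, 1),
        (4, 0), (5, 1), (6, -1), (7, 2), (8, 0), (9, 1)]

holes = range(1, 10)
avg_to_par = [np.mean([s for h, s in feed if h == hole] or [0])
              for hole in holes]

# One row of cells, one per hole, colored by scoring difficulty.
fig, ax = plt.subplots(figsize=(8, 1.5))
ax.imshow([avg_to_par], cmap="RdYlGn_r", aspect="auto")
ax.set_xticks(range(len(avg_to_par)))
ax.set_xticklabels([f"Hole {h}" for h in holes])
ax.set_yticks([])
ax.set_title("Average strokes over par by hole (invented data)")
plt.show()
```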

IBM’s technology management of the U.S. Open site offered a case example of how virtualization and workload management are becoming the essential ingredients of scalability, availability and agility, certainly for consumer Web sites like the Open’s. The USGA is no stranger to IBM’s virtualization technology; IBM has a close services partnership with the USGA, which includes running a variety of cloud services for the Association from its data center in North Carolina. When I visited the trailer near the Pebble Beach course where Web site and scoring services technicians were holed up, I couldn’t help but be amazed at the simplicity of the dashboards that offered real-time views of workload performance on a virtual platform of servers located across the country.

As John J. Kent, IBM Program Manager for Worldwide Sponsorship Marketing, explained, virtualization is critical to utilization efficiency, enabling IBM to combine several workloads onto a single platform. “Virtualization basically makes the distributed environment into a mainframe, which has had this virtualization capability forever,” he said. Kent heads up IBM’s technology partnerships with other events, including this week’s Wimbledon Championships tennis event. Kent said that tennis is actually the more data-rich game, with fans already interested in analysis of “all the potential data points – such as unforced errors and rally counts – that can help you understand the strength of a player’s performance.”

In distributed environments, scaling up has always meant adding more hardware; with virtualization and cloud computing, organizations can avoid the long capital-expenditure procurement process and simply request more of what they need, which can be made available rapidly over the network. What’s key, then, is for organizations to understand and monitor their workloads so that capacity can be optimized as demand rises and falls; that way, they don’t have to procure enough servers to match peak workloads only to let them sit idle the rest of the time.
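In code, the “know your workload” point reduces to something like the toy decision loop below. The thresholds and the provisioning call are invented for illustration; this is not IBM’s or anyone’s actual autoscaling policy:

```python
import random

TARGET_UTILIZATION = 0.70  # invented threshold, not a vendor's
servers = 4

def provision(n):
    """Stand-in for a cloud provisioning request; in a virtualized
    environment this is an API call, not a capital purchase."""
    print(f"adjusting pool to {n} servers")

for minute in range(10):
    demand = random.uniform(1.0, 6.0)   # measured workload units
    utilization = demand / servers
    # Scale out when hot, scale back when idle capacity piles up,
    # instead of buying for the peak and idling the rest of the year.
    if utilization > TARGET_UTILIZATION:
        servers += 1
        provision(servers)
    elif utilization < TARGET_UTILIZATION / 2 and servers > 1:
        servers -= 1
        provision(servers)
```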

The other performance throttle IBM needed during the Open was to regulate content flow. Bandwidth is now the chief bottleneck; the explosion of advanced mobile devices in particular has moved users ahead of what networking providers are able to offer. IBM and the USGA’s Digital Media team needed the ability to make dynamic decisions about regulating content flow. “We needed to understand content demand well,” said Kent. “We were able to slow scoring updates, for example, if we were reaching a threshold in demand for content access and live streaming.” Thus, workload intelligence is as critical to managing unstructured content as it is to managing data.
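Kent’s scoring-update example translates into a very simple control: as measured demand approaches a threshold, widen the refresh interval. A minimal sketch, with every number invented:

```python
def scoring_refresh_seconds(requests_per_sec, threshold=5000):
    """Throttle content flow under load, in the spirit of the approach
    Kent described: slow scoring updates as demand nears a threshold.
    All numbers here are made up for illustration."""
    if requests_per_sec < threshold:
        return 5            # normal refresh cadence
    elif requests_per_sec < 2 * threshold:
        return 15           # ease off as demand climbs
    return 60               # protect bandwidth for live streaming

for load in (1200, 6500, 14000):
    print(load, "req/s ->", scoring_refresh_seconds(load), "s refresh")
```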

The USGA needs to provide a rich virtual experience on mobile devices to capture a younger demographic, which is important not only for the continued success of professional golf but also for attracting advertising on its Web site. However, as fans grow more dependent on the experience delivered by their mobile devices, it will be interesting to see whether the USGA responds to pressure to allow those who attend the Open to bring those devices, which they are currently prohibited from doing. While there are good reasons not to have onsite fans working their mobile devices and interrupting the lovely hush before a player takes a swing, I wonder if the USGA will have to bow to the inevitable. Otherwise, fans might prefer to stay away, where they can enjoy a rich, virtual experience.

But in any case, from an IT perspective, the key to victory in the U.S. Open and similar high-performance events is clear: Know the workload and optimize it through the virtualized infrastructure. The victorious Graeme McDowell set a good example.