Posts Tagged ‘business intelligence’

Director, TDWI Research, Business Intelligence

Posted in BI and Analytics, Information Management on March 12th, 2011 by DStodder – Be the first to comment

That’s my new title. On March 1, I began working with TDWI (The Data Warehousing Institute), an organization well known to business intelligence (BI) and data warehousing professionals for education, training, certification, news and research. The World Conferences, Executive Summits, Solution Summits and other events have long been critical educational gatherings for the professional community engaged in BI, analytics, data warehousing, data integration and other topics. The next event on the TDWI calendar is in Washington, D.C., and I will be there. 

In my new position, I will be heading up research reports; writing blogs, articles and columns; contributing to the conference programs; judging the annual Best Practices Awards; consulting with TDWI members; and more. I will have the pleasure of working closely with Philip Russom, TDWI’s Senior Manager of Research and Services, who will focus more on the data management side. I’ve known Philip going back to my Intelligent Enterprise days, where he wrote an excellent column for us. He is an important thought leader in this industry and has a real skill in educating professionals. And, he’s a pianist and singer, with a fondness for The Great American Songbook. I also look forward to working with all the great people associated with TDWI as speakers, educators and vendors, many of whom I have known and respected throughout my career.

While my focus will be on my work at TDWI, this doesn’t spell the end of this blog. I will definitely be busy, but I hope to keep this spot active with commentary on other areas of interest that may not fit into my work at TDWI. I will have a new blog on the TDWI Web site though, which will have its first installment next week.

Meanwhile, Opening Day, Giants vs. Dodgers, is just around the corner!

From Data Virtualization to Data Services

Posted in BI and Analytics, Data Governance & Policy, Information Management, Virtualization on January 19th, 2011 by DStodder – Be the first to comment

With margins for transactional operations getting thinner, organizations in many industries are focused on leveraging more value from their data. This could not be truer than in the financial services industry, where the onrushing spread of algorithmic trading is changing…well, everything, including the role of information in guiding trading and investment decisions. This rather unnerving article in Wired by Felix Salmon and Jon Stokes captures what’s happening quite well. The speed with which organizations need to turn data into both better business insight and marketable data services has many looking closely at data virtualization.

As an analyst, I am fortunate to be a (remote) participant in the Boulder BI Brain Trust (BBBT), the Boulder, Colorado-based brainchild of Claudia Imhoff. A couple of times a month, this illustrious group of experts gathers for half-day briefings with vendors in the business intelligence, analytics and data warehousing space; the briefings are always highly informative (on Twitter, watch for the hashtag #BBBT). On Friday, January 14, the topic was data virtualization: the BBBT met with Composite Software, a leading vendor of tools and solutions for data virtualization. Composite brought along a customer – and no small customer, either – NYSE Euronext. Emile Werr, NYSE Euronext’s VP of Global Data Services and head of Enterprise Data Architecture, gave us a briefing on how the company is developing a data virtualization layer using Composite’s products.

Wikipedia has a good definition of data virtualization: “to integrate data from multiple, disparate sources – anywhere across the extended enterprise – in a unified, logically virtualized manner for consumption by nearly any front-end business solution, including portals, reports, applications, search and more.” As the Wikipedia entry notes, data virtualization (or “data federation”) is an alternative to data consolidation or replication into data warehouses and data marts. When this concept was first introduced, it was the cause of fiery debates at TDWI events and elsewhere. Now, it has settled in as a complement to the other more entrenched approaches.
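To make the federation idea concrete, here is a minimal Python sketch of my own (not Composite’s technology, and all table names and data are invented): two independent “source systems” stay where they are, and a virtual view joins them at query time instead of copying everything into a warehouse first.

```python
import sqlite3

# Two independent "source systems" -- hypothetical stand-ins for, say,
# a trading database and a separate reference-data store.
orders = sqlite3.connect(":memory:")
orders.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER)")
orders.executemany("INSERT INTO trades VALUES (?, ?)",
                   [("IBM", 100), ("HPQ", 250)])

refdata = sqlite3.connect(":memory:")
refdata.execute("CREATE TABLE listings (symbol TEXT, exchange TEXT)")
refdata.executemany("INSERT INTO listings VALUES (?, ?)",
                    [("IBM", "NYSE"), ("HPQ", "NYSE")])

def virtual_view(symbol):
    """Join data from both sources at query time -- nothing is
    consolidated into a central store, mirroring the federated approach."""
    qty = orders.execute(
        "SELECT qty FROM trades WHERE symbol = ?", (symbol,)).fetchone()[0]
    exch = refdata.execute(
        "SELECT exchange FROM listings WHERE symbol = ?", (symbol,)).fetchone()[0]
    return {"symbol": symbol, "qty": qty, "exchange": exch}

print(virtual_view("IBM"))  # {'symbol': 'IBM', 'qty': 100, 'exchange': 'NYSE'}
```

The trade-off the debates centered on is visible even here: the consumer gets one unified answer, but every query pays the cost of reaching into live sources rather than a pre-integrated warehouse.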

Virtualization helps when organizations can’t really wait for standard data integration and loading into a data warehouse. That is very much the challenge facing NYSE Euronext, which is using Composite tools to develop a virtual layer to improve data access for internal executives and to establish a platform for creating data services. “We have so many companies trying to connect into us, and we want to serve standardized information out to companies around the world,” Werr said. NYSE Euronext is moving away from its old method of dumping transaction data into a warehouse; it wants to put more intelligence into the virtual layer. And to help build this layer, it is hiring people with business skills who understand processes and how to derive business value from data. “[These professionals] are the most productive people on my team right now,” he said.

The BBBT session featured an interesting debate about how data governance fits with data virtualization. Can data quality and governance rules be managed and invoked from the virtual layer? Should they be managed at the source or as part of extract, transformation and loading (ETL) processes, as many organizations do now? The discussion began to turn toward master data management and the option of creating a central hub or registry to implement governance for access to multiple sources. Highly regulated industries such as financial services and healthcare should consider this approach because of the need to invoke regulatory provisions for data access and sharing. Werr discussed these requirements and how his organization hopes to use the Composite virtual layer to support metadata governance and access from multiple BI tools.

Putting intelligence into a virtual layer fits with the IT infrastructure trend toward virtualization and cloud computing, and may become even more important because of this trend. Service-oriented applications running on cloud and virtual platforms frequently require access to multiple, disparate data sources. From a business standpoint, data virtualization is going to be critical to moving data quickly from the back office outward, to where it can be packaged into “information-as-a-service” offerings that customers will buy – and that will improve the seller’s profit margins.

2011: A Year for Balancing Priorities

Posted in BI and Analytics, Information Management, IT Industry, Virtualization on January 11th, 2011 by DStodder – 1 Comment

Auld Lang Syne has been sung, but since it is still January, there remains time for New Year’s resolutions. Losing weight, sleeping more, eating vegetables and working less on weekends are perennial favorites; to those I now add “blogging more often.” For some, this resolution would be a cinch. Not for me. In 2010, with one project due after another, I struggled to carve out time to blog. I hope to correct that in 2011, even as I anticipate a new year full of interesting and demanding projects.

Competing with this one blog, of course, are all the social media options that have many of us splayed across the Web, including Facebook and Twitter. These need care and feeding, too. In my business, despite its high inanity quotient, Twitter has become essential for communicating news briefs and quick-take analyses. Facebook is many things to many people, but for me it is just a “fun” forum for sharing observations and artifacts along life’s journey. Maybe someday it will be more. Can’t forget LinkedIn. Finally, the social network experience has to include commenting on other people’s blogs, at major news source sites, on Yelp, on Amazon and so on. Have to give the text analytics engines something to chew on!

Most industry pundits have already published their predictions and prognostications for 2011. Rather than add to the pile, I would like to offer a few quick information management “resolutions”: priorities that I believe will shape what happens in 2011.

Integrate the integration. In 2010, the business value of information integration hit home to many organizations. Improved integration can lower the cost of information management and help eliminate downstream business problems caused by poor data quality and inconsistency. Yet across enterprise departments and business functions, integration usually happens in numerous separate steps for data, applications and processes. With vendors such as IBM, Informatica, Kalido, Oracle and Talend beginning to provide better tools for developing and governing the use of data through master data management and “semantic” metadata layers, organizations have the opportunity to work toward comprehensive, end-to-end visions of information integration.

Don’t be blinded by automated analytics. The good news coming out of 2010 is that advanced analytics involving large (ok, “big”) data sources are becoming mainstream. More organizations than ever before will be able to afford analytics, especially as they deploy data appliances and use services, templates and other tools to shortcut the development of models, variable selection and other steps that are difficult and time-consuming. However, organizations need to “keep it real”: this is important stuff, involving critical decisions about customers, patients, pricing, demand chains, fraud prevention and other factors that are differentiators. Despite the hype, automated analytics are not entirely ready to replace wetware, gut feel or moments of irrational inspiration.

Respect “keeping the lights on.” It’s fashionable these days to dismiss non-strategic IT tasks as merely “keeping the lights on.” I found in 2010 that some of the most complicated and important decisions organizations are making these days have to do with IT infrastructure. Virtualization and cloud computing are completely remaking the map, which means that IT needs next-generation tools and analysis for network optimization, application performance management, dependency mapping and more in order to make the right infrastructure decisions.

Encourage vendor “coopetition.” The sensational story of Mark Hurd’s departure from HP and resurfacing at Oracle dominated headlines in 2010. The story is not over; InformationWeek’s Bob Evans offers an insightful blog about the continuing friction between HP and Oracle. In the wake of Oracle’s Sun acquisition, the two companies are in the midst of a tectonic shift away from what had been a longstanding partnership. Organizations should remind vendors such as HP and Oracle that despite competitive antagonism, they expect them to work together effectively on their customers’ behalf. Customers have that kind of clout. Fasten your seatbelts, though, because I’m sure we’re in for more M&A activity, possibly involving HP and Oracle, which will further reshape the competitive landscape.

Be smart about being “dynamic.” A major watchword this year is “dynamic.” Cloud computing, virtualization, appliances, workload optimization, workforce optimization and other technologies are helping organizations flex to meet changing business requirements without the usual steps of adding people, software and hardware resources that sit idle when not needed. To be more just-in-time requires knowledge; otherwise, organizations could be caught short. In 2011, before going fully dynamic, organizations need to evaluate whether they have adequate knowledge about user and business requirements. If not, it may be time to evaluate tools and practices for understanding workloads, process requirements, dependencies and more. Old ways – and divisions between business and IT – have to change.

That’s all for now. Happy New Year to all!

This Time, It’s Not Just Personal

Posted in BI and Analytics, Information Management on October 30th, 2010 by DStodder – Be the first to comment

Down in the land of razzle-dazzle, IBM drew on the spirit of Las Vegas by “unleashing” Cognos 10 onstage on October 25, with fireworks, explosions and a lot of light fantastic. The Cognos 10 release was the headline of IBM’s Information on Demand (IOD) 2010 Global Conference. Rob Ashe, CEO of Cognos before it was acquired by IBM and now general manager of Business Analytics for the IBM Software Group, said the release, six years in the making, was about “eliminating the boundaries that keep you from gaining and sharing insights.”

Cindy Howson of BIScorecard nicely summarizes the high points in her Intelligent Enterprise blog. She writes: “In theory, though, it is the ability for users to collaborate around a task, with supporting BI dashboards, wikis, blogs, instant messages and emails, where I see the biggest potential.” Cindy is not alone; many analysts and writers are excited about the possibilities surrounding the integration of business intelligence tools and collaboration platforms. Cognos 10 is a big step for IBM in this direction; it builds on Cognos’ record of innovation in performance management and dashboards to push the collaborative user experience beyond basic capabilities.

Collaborative BI has attracted other big BI vendors, and has lit up innovative new players. Lyza, for example, is an impressive product in this vein; it offers a breakthrough that radically changes how people use business intelligence, or looking at it the other way, how they flow through information in the course of collaboration. Cognos 10 may not induce the same intense reaction as Lyza, but it is definitely a meaningful development, both for the market and for IBM internally.

The collaboration thrust joins Cognos with Lotus through the inclusion of a free license of Lotus Connections. IBM bought Lotus in 1995 for about $3.5 billion, which made it at the time the biggest acquisition in the history of the software industry and signified IBM’s strategic move into applications. Twelve years later, IBM picked up Cognos for nearly $5 billion. With Cognos 10, IBM could be on the road to higher return on investment from these enormous acquisitions as organizations try to use collaboration and social networks to improve employee productivity.

Threading collaborative interaction with business intelligence – or “business insight,” the term Cognos seems to prefer – holds tremendous potential for employee productivity. At IOD, I met with Jeff Shick, vice president of Social Software in the IBM Software Group. We discussed many promising ideas from the realm of knowledge management: capturing tacit knowledge, finding repeatable patterns in how people use and share information artifacts, and the potential of recommendation engines for finding the most relevant information and sources, including people. Knowledge management, once a kind of blue sky topic, never seemed so grounded in the fact-based decisions and actions that people must perform.

Organizations will discover more about their knowledge assets by integrating BI and collaborative tools and practices. The combo could free users from their traditional BI silos that serve single users or communities and allow information insights to flow across organizational divisions and out into the external world.

Of course, combining collaboration and BI is not without challenges. For one thing, many organizations are very careful about who gets BI, and how much they get to use it. Thus, the combination will change user demand for BI and challenge organizations to rethink security, user requirements, data availability and more. But this catalyst for change is a good one; organizations should never stop innovating with BI.

Oracle Speaks, Hurd Listens

Posted in BI and Analytics, Enterprise Applications, Information Management, IT Industry on September 7th, 2010 by DStodder – Be the first to comment

Former Hewlett-Packard CEO Mark Hurd’s severance vacation and search of the help-wanted listings proved to be short. Oracle announced that Hurd will be joining the company’s board of directors and will assume the position of president. Current co-president Safra Catz is staying, but Charles Phillips has departed. Phillips deserves credit for improving Oracle’s customer relationships during his tenure; it’s easy to forget now how raw those relationships were at the time he took the helm. However, like the end of Hurd’s reign at HP, Phillips’ more extended finale came with embarrassing personal distractions.

HP is not accepting Hurd’s new position peaceably. The company has filed a lawsuit, noting that by joining Oracle, Hurd is violating a confidentiality agreement and “[putting] HP’s trade secrets and confidential information in peril.” Given that Oracle is increasingly a competitor rather than a partner to HP, it’s easy to see HP’s difficulty with this turn of events. I don’t know whether HP’s lawsuit will succeed in stopping Hurd’s appointment, but a quick read of experts quoted in the business press seems to suggest that it will not.

The arrival of Mark Hurd at Oracle could be a huge development that will send shock waves across the industry. As evidenced by recent trading, investors seem bullish about Oracle’s prospects with Hurd’s arrival. Here are six points that I see as significant with regard to his hiring:

- Hurd will bring leadership to Oracle’s hardware, systems and data center push. This is new: Oracle will now have a top executive with experience in the hardware and systems business. Oracle’s steady core has always been database software, with services and other types of software subject to shifts in strategy and restructuring. Will Hurd, after looking at the business prospects for the Sun offerings, try to change Oracle’s balance to go strong after HP and IBM? Or will Hurd conclude that Oracle needs to mend those relationships instead?

- Hurd is Oracle’s first strong #2 since Ray Lane. Oracle experienced significant growth during Lane’s years as its president and COO, but his relationship with Oracle CEO Larry Ellison soured, and he was gone after eight years. Hurd is no shrinking violet, especially after having been the CEO of two different companies. What will happen when Hurd and Ellison disagree? And will they agree on the possible acquisitions that everyone expects will happen?

- Hurd will be tempted to “cut the fat” inside Oracle. No doubt, part of Wall Street’s positive reaction to Hurd’s appointment reflects the view that he could make Oracle even more profitable by taking an axe to internal costs, as he did at HP, and holding managers more accountable. Oracle is run more efficiently than HP was when Hurd took over, but it is an established company with over 100,000 employees. Surely an outsider with Hurd’s practiced eye will find areas to trim. How will this affect Oracle’s internal psyche?

- Hurd could invigorate Oracle in the channel. Hurd has a reputation for valuing channel partner relationships – an area where Oracle has had trouble. Hurd could make improvement here a focus, and help Oracle counter stronger channel efforts from IBM and Microsoft.

- Hurd’s hiring will be a catalyst for change in the industry. If Hurd does make Oracle a more serious player in the server, storage and data center business, competitors will have to react. If HP’s stock price continues to fall, would a Cisco Systems, EMC or even Microsoft make a play for HP? Wild thoughts, but then the IT industry has not really had to deal with Oracle as the sort of broad-based player it could now become.

- Hurd could give more oomph to Oracle’s BI, analytics and data warehousing. With the 11g releases, Oracle is in the early stages of an important product cycle for business intelligence, analytics and data warehousing. Hurd was clearly a believer in the power of BI at HP; I would expect him to give Oracle’s offerings an even higher profile.

Larry Ellison, whether fired up by a friend’s misfortune or the opportunity to bruise a competitor, has succeeded in planting a couple of banderillas in the HP bull, angering and distracting it. But now, by bringing Hurd on board, he could be about to change Oracle itself.

Good Data Equals Good Health – and Lower Costs

Posted in BI and Analytics, Healthcare, Information Management on August 22nd, 2010 by DStodder – Be the first to comment

I arrived at my first meeting at TDWI in San Diego late, still hyperventilating from legging out a long hot walk from my hotel, where I had dumped my bags after a gnarly drive down to “America’s Finest City” from Los Angeles. So, perhaps appropriately, my first meeting was with a company in the healthcare industry: Blue Cross and Blue Shield of Kansas City, co-winner of TDWI’s Enterprise Data Warehouse Best Practices Award. The company won the award as a customer of HP Business Intelligence Solutions.

The healthcare industry is obviously undergoing tremendous change, with new government policies and economics challenging the business models of insurance and managed care providers such as “Blue KC.” The transition is away from controlling benefits – that is, denying benefits – and increasing rates as the primary tools for managing costs. The future is “wellness,” or helping members get or stay healthy. “We want to improve the health of our members so that they don’t become ‘patients,’” said Darren Taylor, vice president, Information Access Division with Blue KC. “If we can do that, then we can take care of costs from within the context of that objective.”

Wellness requires knowing more about members, which means that the companies need vastly improved data management and analysis. Connecting to disparate data systems and establishing a single enterprise data warehouse (EDW) are proving critical to accomplishing Blue KC’s objectives with its membership. Previously, Blue KC had outsourced diabetes or other disease management programs “to good companies,” Taylor said, “but we did not have enough insight into these proprietary systems.” The company could not integrate or analyze multiple sources of data about one member to understand how, for example, their heart conditions, asthma or other issues were related. Gaining this single view is essential. With the EDW in place, the company is able to bring these disparate data sources in house.

Taylor was VP of the IT group, but his group now reports to Blue KC’s CFO. “There’s more accountability. IT is usually about waiting for requirements. We’re now about anticipating needs, and bringing business and IT together to execute on our data warehouse strategy.”

Hurd at Sea

Posted in BI and Analytics, Information Management on August 20th, 2010 by admin – Be the first to comment

As we pulled out into San Diego harbor, the dusk was calm. The increasing dark settled slowly over the lingering sunset hues of pink and orange. A gentle breeze filled the sails of the big schooner as we began our cruise with members of Hewlett-Packard’s Business Intelligence Solutions group and customers, who were in town this week to attend The Data Warehouse Institute (TDWI) conference. The captain invited guests to help hoist sails, which we did. Now it was time for drinks and hors d’oeuvres. Dinner would follow.

The only disconcerting – but at the same time, entertaining – part of the cruise was what appeared to be a U.S. Navy SEALs training session going on all around us. Helicopters swooped low and gunboats raced by; as darkness fell, more copters lit the water with searchlights, putting the bay under close surveillance. San Diego mostly basks in sunny, happy weather, but overhead and out on the water, you see constant reminders of San Diego’s key role in the serious business of hosting naval forces that defend the country and project American power.

As expected, HP personnel kept their lips sealed about the continuing saga of Mark Hurd, HP’s former CEO, the HP Board and actress/marketing consultant Jodie Fisher. With media such as The New York Times, The Wall Street Journal and the blogosphere filling in details daily, it was hard not to bring it up. Some HP folks smiled guardedly in agreement when it was suggested to them that while the mess is not pleasant, HP might be better off replacing Hurd’s fearsome regime of performance accountability and tight spending control with more creative, inventive and inspiring leadership.

I couldn’t help but reflect on the excitement Hurd generated when he joined HP in 2005 after previously heading up NCR and its then Teradata division. When Hurd became CEO, he made it clear that he “got it” when it came to BI, analytics and data warehousing, both with regard to the company’s internal management and his strategy for HP’s software products and services. He hired Randall Mott away from Dell to become HP’s executive VP and CIO; before his time at Dell, Mott was CIO at Wal-Mart, where he led its successful and influential enterprise data warehouse foray. Hurd directed Mott to consolidate HP’s numerous data centers and data marts to reduce costs and improve BI and analytics.

Under Hurd’s leadership, HP increased investment in database technology, particularly NeoView. However, strategies kept shifting, internal cost control became the focus and NeoView did not have the market impact that HP once hoped it might have. A big challenge was balancing its database software development strategy with its ongoing partnerships with Oracle, Microsoft and other players entrenched in the BI and data warehousing market. So, rather than software, HP’s BI foot forward became consulting services and solutions. In late 2006, HP bought the respected Knightsbridge Solutions, which has become the centerpiece of its BI and data warehouse consulting services. An HP BI Solutions customer, Blue Cross and Blue Shield of Kansas City, was selected as co-winner of TDWI’s 2010 Best Practices Award for Enterprise Data Warehousing.

I had the opportunity to talk to Darren Taylor, vice president, Information Access Division with “Blue KC.” I’ll write about this discussion in my next blog, which will offer some quick takes about my meetings at TDWI in San Diego. Closing out this one, it will be interesting to see if HP installs a new CEO with the same kind of vision Hurd seemed to have about the power of BI and data warehousing – but with more soul, and more consistency in terms of the strategic drive and investment needed to succeed in a tough marketplace.

Informatica and the Identity Opportunity

Posted in BI and Analytics, Data Governance & Policy, Information Management on March 8th, 2010 by admin – Be the first to comment

As we move further into our information-rich age of multiple sales and service channels, social media and surveillance, identity is becoming a hot topic. First, there’s identity theft: According to a recent study by the Ponemon Institute (sponsored by Experian and reported by The Medical News), “nearly 1.5 million Americans have been victims of medical identity theft.” Credit fraud, reputation fraud and more are additional negative results of having sensitive information about ourselves spread across the information ecosphere.

Then, there’s identity surveillance. Law enforcement and intelligence services must deal every day with identity confusion as they try to work within legal constraints to find wanted criminals and potential terrorists. Adding complexity, law enforcement will need to determine identity not just from traditional data but multimedia as well; an example is this current caper reported by the Tallahassee (Florida) Democrat.

Identity surveillance and watch lists are rising as political and policy challenges. Canada and the United States are in the news here and here, tussling over implementation of Secure Flight, the plan to collect more passenger data for watch lists that will be implemented by the Transportation Security Administration of the U.S. Department of Homeland Security. See this Intelligent Enterprise blog from last June by Rajan Chandras for some background.

In the middle of all of this are software providers, primarily IBM InfoSphere Identity Insight Solutions, Infoglide (which is providing software for the DHS) and Informatica. In February, I attended the Informatica Analyst Conference and had a chance to talk to execs there about the Informatica Identity Resolution (IIR) solution and how it fits with other solutions and technologies such as master data management (MDM). I came away with a strong sense of how IIR is opening doors to new business opportunities for Informatica in government, but also potentially in areas where Informatica has greater market strength but where identity recognition and resolution software has not traditionally been applied.

Identity recognition and resolution systems enable organizations to use data matches to gain a better understanding of identity across multiple systems. This could include not just individual identities but also networks and relationships: that is, who people know and how they are connected. The tools generally apply algorithms and rules engines to automate and systematize steps that would obviously take gumshoe detectives far longer as they seek clues, patterns and a risk assessment about possible terrorists, fraudsters, money launderers and regulatory violators.
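As a heavily simplified illustration of those matching steps – my own sketch, not how IIR or any vendor’s engine actually works, with all records and thresholds invented – a resolution rule might combine a fuzzy name-similarity score with a corroborating attribute before declaring two records the same person:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Crude string-similarity score in [0, 1] (real products use far
    more sophisticated, language-aware name matching)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Records of (possibly) the same person scattered across systems.
crm       = {"name": "Jonathan Q. Smith", "city": "Chicago"}
billing   = {"name": "Jon Smith",         "city": "Chicago"}
unrelated = {"name": "Maria Gonzalez",    "city": "Miami"}

def match(rec_a, rec_b, threshold=0.6):
    """Declare a tentative identity match when the name score clears a
    threshold AND a supporting attribute (here, city) agrees."""
    return (similarity(rec_a["name"], rec_b["name"]) >= threshold
            and rec_a["city"] == rec_b["city"])

print(match(crm, billing))    # True
print(match(crm, unrelated))  # False
```

Scaling this from pairwise checks to networks of relationships – who is connected to whom across millions of records, in real time – is precisely where the rules engines and algorithms of commercial tools earn their keep.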

When Informatica acquired Identity Systems from Nokia in the spring of 2008, it looked like simply a smart addition to the company’s data quality toolbox. However, it is clear now that the acquisition was one of a series of decisive steps that have turned Informatica into a more broadly relevant information management (IM) solutions provider. The Identity Systems deal was followed in 2009 by the acquisition of AddressDoctor GmbH, a provider of postal address cleansing and verification software. And of course, Informatica made its biggest move early this year by acquiring Siperian, a provider of MDM tools.

IIR is an important component of Informatica’s complete MDM solution, and will help organizations implementing MDM gain the much-sought single view of identities (customers, patients, criminals and more) across multiple data sources. A key capability to look for in identity recognition and resolution tools is functionality in multiple languages and countries. Combined with AddressDoctor, Informatica has tools for locating and matching identities around the world. And thinking beyond law enforcement uses, global corporations with diverse markets need better tools for identity network analysis to improve marketing, billing, service and more, especially in this age of social media.

IIR can also help internally, given that data is often hidden in applications and obscure databases. A healthcare firm at the Analyst conference described how it is using IIR for operations between its mainframes and users’ 30,000 Microsoft Access databases. Finally, one of the more interesting technology pairings I learned about at the conference was the real-time application of IIR for “identity-aware” event processing using Informatica’s CEP engine Agent Logic. Watch lists and other espionage uses are an obvious application of this combination, but it could also be applied in systems for financial services, healthcare, retail and other industries.

In the olden days, identity might have seemed a simpler, more innocent matter, although viewing film noir and reading detective novels from the ‘40s and ‘50s might make you wonder. Today, however, there’s no question that identity is a complex topic that includes sensitive political and privacy ramifications. Software providers such as Informatica should be in for a wild ride.

You Had to Tweet There

Posted in BI and Analytics, Information Management, Social Media & Behavior on March 5th, 2010 by admin – Be the first to comment

Tweeting has become a fact of life at industry analyst briefings, and at perhaps more social events than I’m even aware of. Cocktail parties have become tweetups. People get married, break up and live together tweeting. If Archie and Edith Bunker were still on television, we would watch them tweeting.

Up until the last two weeks, I had not been much of a tweeter – and felt guilty about it. I had Twitter followers and followed people. I even wrote about Twitter and its impact; but frankly, it had had very little impact on me. With my infrequent participation, I felt unworthy of followship. And when I did go to Twitter, it meant joining conversations already in progress, and I could never quite catch the flow.

My Twitter awakening took place at the Informatica Analyst Conference, held on February 9 and 10. I attended this event – I was there in person, listening to many fine presentations from Informatica executives. As has been my habit for two decades, I opened up my computer and started to take notes, listening and watching carefully. However, it wasn’t until I logged onto Twitter and checked in on the hash tag (#infaanalyst) that I was really there.

Or not there: It was hard to decide whether being involved in the Twitter conversation was a distraction or an enhancement. It was sort of exhilarating, kind of like surfing, with a mass of water moving below your feet, or in this case, my fingertips. Yet, Informatica people were watching the tweets carefully, and while they did not join the stream, they were responding to tweets during their presentations. Sohaib Abbasi, Informatica chairman and CEO, even picked up my tweet about the role of the CIO and offered insights during his remarks.

Convinced that Twitter was important, I made a point of following tweets from the SAS industry analyst conference earlier this week (#sassb), since I was not physically there. Many of the same analysts who were at the Informatica conference were tweeting from this event. Some tweets were matter-of-fact restatements of what SAS was presenting, as if reporting to the outside world. These offered narrative value, but given Twitter’s character limit, they couldn’t provide much beyond headlines. Sometimes the NDA (nondisclosure agreement) curtain would fall and there would be silence. Most other tweets were a combination of opinions, humorous asides, kudos, complaints and half-formed questions. An ensemble narrative it was not; since I was having trouble following the thread, I finally logged off and turned to other matters. My conclusion: You had to be there.

Then today, I had a third type of Twitter experience. I participated in the Boulder BI Brain Trust meeting with Hewlett-Packard’s Business Intelligence Solutions group, represented by John Santaferraro, senior director of Marketing Communications and Industry Marketing. This time, while not physically there, I was dialed in by phone – and was on Twitter (#bbbt). This tweet stream was more like a parallel reality; HP did not really respond to tweets as Informatica had, but the flow seemed more sensible because I was hearing the presentation in real time, alongside the real-time tweet stream. Of course, tweet streams are real time and nothing else; when I go back and review the presentation and my notes later, the stream won’t be there (maybe I could hunt it down, but I won’t).

In the analyst business these days, tweeting is obligatory, as it is in marketing and public relations. I’m initiated now, and will tweet more. But are the tweets of any use to anyone not physically there, or outside the tight community of tweeters? I think the jury is out on that. Can you follow a hash tag and “be there”? No, at least not yet. It’s more like archeology: you assemble disparate fragments and try to form a narrative. In real time.

Pivot Me Up, Scotty

Posted in BI and Analytics, Business and the Economy, Information Management on January 31st, 2010 by DStodder – Be the first to comment

Walking down the street in San Francisco, I passed a newspaper box – something that in our increasingly digital age may someday be found only at the Smithsonian Museum. “Obama Pivots to Job Creation,” the San Francisco Chronicle headline announced. Wow, there it is again, I thought to myself, the word “pivot.”  All week, in vendor briefings, in my research, on TV during talking-head discussions of Obama’s strategy and now on the front page of a newspaper, I had been encountering this word. So, I took a picture of the box.

The verb form of the word came from its use as a noun, which means “a shaft or pin on which something turns” (Merriam-Webster’s). The implicit meaning of the headline, appearing the day after Obama’s State of the Union speech, was that the Obama Administration was going to turn its attention away from the now-stalled healthcare reform effort and toward the economic problem of creating jobs. This “pivot” would be quick and complete, like a machine would do it. No angst or mess: Done.

However, given that journalists and opinion-makers seemed to have picked up the word directly from Obama’s strategists, I wonder whether the strategists’ use of the word comes more from basketball. Obama, as we know, is a major fan, and he plays the game. In basketball, once you set your pivot foot, you can spin around, but you cannot move that foot or else you’ll be called for traveling. I found a good explanation (and coaching tip, in case you need it) on YouTube. So, maybe it means that Obama has set his pivot foot – perhaps he did so on the day he took office – but he has the ability to spin around to pass or put up a (job creation) “shot” when opportunity or necessity presents itself. If he moves that pivot foot, though, the whistle will blow, and he’ll have to turn the ball over.

In the world of business intelligence, online analytical processing (OLAP) and analysis using spreadsheets such as Microsoft Office Excel, “pivot” makes you think of pivot tables, or as Wikipedia defines them:

A pivot table is a data summarization tool found in data visualization programs such as spreadsheets. Among other functions, pivot tables can automatically sort, count, and total the data stored in one table or spreadsheet and create a second table displaying the summarized data. Pivot tables are also useful for creating cross tabs. The user sets up and changes the summary’s structure by dragging and dropping fields graphically. This “rotation” or pivoting of the summary table gives the concept its name. The term pivot table is a generic phrase used by multiple vendors. However, the specific form PivotTable is a trademark of the Microsoft Corporation.

Research surveys often show that spreadsheet users do not make full use of pivot tables because they don’t know how to use them effectively and are afraid of making errors. Yet pivot tables are powerful tools for seeing data from different perspectives and uncovering patterns that may not be obvious when analyzing the data in a more limited fashion. Microsoft has introduced PowerPivot for Excel 2010 (and for SharePoint 2010); I found this blog, which does a great job of explaining PowerPivot, so I won’t go into it here. However, I will chime in to say that it is perhaps the most important BI development so far this year. PowerPivot begins to bring together the worlds of BI and spreadsheets: or, put differently, it enables users to have some of the major benefits of BI while remaining spreadsheet users.

“Pivot” was an important topic during my briefing last week with Visual Mining, the producer of NetCharts, a tool for developing BI dashboards and data visualizations. The focus of the briefing was NetCharts Performance Dashboards V2, which the company says adds “Excel-like” table and reporting functionality. In the demo, I was impressed by the ease and flexibility with which you could work with pivot tables and the data to see different views – and not just simple views but glorious, graphical data visualizations. “We want to enable CFOs to move beyond record keeping to being more in control,” said Tristan Ziegler, president and CEO. While the challenges faced by the office of Finance are a major focus for Visual Mining, the product is useful for other scenarios, such as contact centers, where there are many data sources and users who may be comfortable with spreadsheets but are not expert data analysts.

Could the Obama Administration’s decision makers benefit from having more views of their data? No doubt. It might help them discover correlations between issues such as job creation and health care that weren’t apparent when they were totally focused on one topic or the other. And I don’t think you can be called for traveling.