Information management capabilities are more important than cheap storage capacity
Ease of storage expansion, lower storage costs per TB, and the drive to be more security ‘compliant’ threaten to combine into a perfect data storm. Present conditions encourage regulators and government agencies to insist that public sector institutions and corporations collect and retain ever more data that is not required for operational purposes but might be needed in future, whether for public safety or to aid future issue handling. Corporate governance, risk and compliance (GRC) policies are moving in the same direction. The bottom line: added operational costs. Privacy issues aside, two facts stand out from a cost-benefit perspective: first, some 98% of what is stored is never viewed again; second, information management is way behind the curve. To put it bluntly, garbage in, garbage out (GIGO) is a growing problem because duplication, inconsistencies, randomness and systemic errors lead to massive waste. Policy decisions based on such data risk being flawed and misleading, unlike those based on well-informed analysis of timely and reliable data. Clearly, it’s easier to just add more storage than to create an information management policy and capability that gives some assurance that data used for decision-making is valid to a defined degree. Continue reading “Stop GIGO Data with Better Information Management”→
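The duplication problem mentioned above is one of the few parts of an information management policy that is easy to automate. As a minimal sketch (the record values and normalization rules here are illustrative, not a prescription), exact duplicates can be surfaced by hashing a normalized form of each record; real programs would add fuzzy matching and validation rules on top of this:

```python
import hashlib

def find_duplicates(records):
    """Group text records by a content hash so exact duplicates surface.

    A minimal illustration of one duplication check; a full information
    management capability would also need fuzzy matching, validation
    rules and retention policies.
    """
    seen = {}
    duplicates = []
    for record in records:
        # Normalize before hashing so trivial formatting differences
        # (case, stray whitespace) do not hide an otherwise identical record.
        digest = hashlib.sha256(record.strip().lower().encode()).hexdigest()
        if digest in seen:
            duplicates.append((seen[digest], record))
        else:
            seen[digest] = record
    return duplicates

# Hypothetical records: the first two differ only in case and whitespace.
docs = ["Invoice 1001", "invoice 1001 ", "Invoice 1002"]
print(find_duplicates(docs))  # one duplicate pair reported
```

Even this crude check illustrates the point: the cost of letting duplicates pile up is paid later, at decision time, not at storage time.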
The enterprise’s motivation in driving customers to do more via the mobile channel should be to provide a highly differentiated customer experience, enhance the overall lifetime value of the customer to the enterprise, and reduce operational costs.
Due to the growing popularity of mobile phones as a channel of access to customer service centers and the customer’s natural aversion to an IVR interface, it will be essential to allow customers to shift from a mobile, self-service mode to live agent assistance as simply as possible.
Traditional network SLA metrics do not take into account changing IT needs.
Is your network vendor willing to extend service guarantees to application availability?
There are many ways of looking at network service level agreements (SLAs). For telecom providers and certain clients, they can be a mere commercial agreement whereby network downtime will be compensated. In other cases (for example, when downtime can prove very costly or even disastrous to a business), the enterprise customer will need to pay for extra resiliency in the form of five-nines availability or even 100% availability based on 1+1 back-up and/or a 3G wireless broadband data link. Traditional data WAN SLAs still contain the standard metrics, such as jitter, roundtrip delay, latency, availability and MTTR, and this is a good thing overall, as it keeps the carrier accountable for the network. However, IT managers should also be exploring SLAs all the way to applications running on the desktop. Continue reading “Has the 99.999% Availability SLA Gone the Way of the Dodo?”→
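It is worth translating those "nines" into concrete downtime budgets, because the differences are stark. A quick sketch of the standard arithmetic:

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a non-leap year

def downtime_budget(availability):
    """Minutes of permitted downtime per year for a given availability level."""
    return MINUTES_PER_YEAR * (1 - availability)

for a in (0.999, 0.9999, 0.99999):
    print(f"{a:.3%} availability -> {downtime_budget(a):8.2f} min/year")
# 99.999% ("five nines") allows only about 5.26 minutes of downtime per year
```

Five nines leaves roughly five minutes a year; three nines leaves almost nine hours. That gap is exactly what the extra resiliency spend (1+1 back-up, wireless failover) is buying.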
The human-centricity of collaboration software invites innovation driven from many disparate industries and technologies.
Natural interface design, information foraging techniques and game theory are set to change the collaboration landscape in 2012.
There are few certainties in this world. And yet, as if bound by a cosmological force greater than the pull of gravity itself, at this time every year, those of us in the technology industry feel an unshakable pull, an unrelenting need to prognosticate. I am in no way immune to such innate stipulations, so it is with great pleasure that I bring to you my own forecast for 2012’s enterprise collaboration platform marketplace. But, just to keep things interesting, what follows are the three most unusual yet profoundly influential trends I think we are likely to see play out this coming year. Continue reading “Unusual Suspects Set to Move the Collaboration Space in 2012”→
While most businesses are still experimenting with on-demand services, Current Analysis research shows the vast majority of those using the cloud today are happy with their experience.
Positive cloud experiences bode well for future growth as market acceptance of cloud services rises and solutions mature.
2011 has been a banner year for IT services. As organizations continue to explore their alternatives to traditional outsourcing models, IT solution providers have been evolving their strategies to support the expected rise in on-demand services. Acquisitions such as CenturyLink’s multi-billion dollar deals for Qwest and Savvis, Dimension Data’s OpSource purchase, and Verizon’s Terremark and CloudSwitch buys, among others, are changing the market landscape for cloud services. Continue reading “2011 in the Cloud: The Year of Managing Expectations”→
Vendors’ predictions are often worth what you pay for them.
Take predictions with a grain of salt.
Does any other market lend itself to self-serving predictions quite as readily as the security market? Don’t get me wrong, I like predictions as much as the next guy; in fact, I have been working on some this week with partner in crime Paula Musich. That said, our predictions do not end with an outright recommendation that you buy our products. Security vendors often benefit from having very good threat research teams on staff. These teams see more threats, and see them sooner, than almost anyone else. They are indeed very well positioned to look over the horizon at new attacks that might well go mainstream. However, some security vendors seem to cherry-pick threats that align with their product suites. (Of course, in a perfect world, vendor threat teams are informing product development decisions.) TechTarget’s Rob Westervelt called McAfee/Intel out on its predictions on Twitter this week. Two of McAfee/Intel’s predictions involved more rootkits and the need for more chip-based security. See what they did there? Continue reading “‘Tis the Season for Predictions”→
Enterprises deploying M2M solutions do not necessarily look to cellular; WiFi, Bluetooth, and Zigbee may be more cost-effective inside buildings, and wireline connections still make sense for many fixed devices.
M2M continues to grow, but the ecosystem has a lot of work to do to reach the huge numbers of predicted devices.
The M2M market is growing; there seems to be no doubt about that. In fact, in comparison with other technology areas, the growth rates seem pretty healthy, with CAGRs of 25% typically reported by carriers and module manufacturers. However, while M2M is often tied to the cellular industry, the reality is that only about 30% of the connections that make up today’s worldwide installed base of M2M devices are cellular. Cellular connection numbers are still small, with an installed base of about 80 million mobile connections today. In addition, M2M bandwidth requirements remain low, and the vast majority of cellular M2M applications today are comfortably served by 2G networks. Continue reading “M2M Growth: Reality Check on the Rise of the Machine”→
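It is easy to sanity-check where those figures lead: compounding the cited installed base at the reported growth rate shows how far the market still is from the "huge numbers of predicted devices." A quick sketch using the figures from the paragraph above (the five-year horizon is my own, purely illustrative):

```python
def project(base, cagr, years):
    """Compound a base figure forward at a constant annual growth rate (CAGR)."""
    return base * (1 + cagr) ** years

base_millions = 80   # cellular M2M installed base cited above, in millions
cagr = 0.25          # ~25% CAGR reported by carriers and module manufacturers

for year in range(1, 6):
    print(f"year {year}: ~{project(base_millions, cagr, year):.0f} million connections")
# At 25% CAGR, 80 million grows to roughly 244 million after five years
```

Even a sustained 25% CAGR takes five years just to triple the base, which is why reaching the multi-billion-device forecasts demands much more than organic cellular growth.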
Cloud-based applications and controllers reduce complexity and move costs from CapEx to OpEx.
Cloud-based control and management are likely to provide one of the most secure job roles for the next decade.
Since the beginning of the “cloud era,” new application use cases have emerged nearly overnight, evolving the application hosting market and creating new IT service juggernauts (Salesforce, Amazon, etc.). However, one area seeing increased attention from vendors, start-ups and VCs alike is hosting infrastructure within the cloud. I am referring to wireless LAN controllers, security gateways, and other technologies that were typically appliance-based and located on-premises nearly 100% of the time. By virtualizing the location, IT achieves a range of benefits: fewer assets committed in the data center (if they host offsite), a greatly simplified support model, fewer “truck rolls,” and less hardware required on-premises. Continue reading “How Much IT Can We Host in a Cloud?”→
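The CapEx-to-OpEx shift noted above ultimately comes down to a break-even calculation: a cloud-hosted controller trades an up-front appliance purchase for a higher recurring fee. A minimal sketch of that comparison (all dollar figures here are hypothetical, chosen only to illustrate the arithmetic):

```python
def breakeven_months(appliance_capex, onprem_opex_monthly, cloud_opex_monthly):
    """Months until a cloud-hosted subscription's cumulative cost overtakes
    buying the appliance up front plus its monthly upkeep.

    Returns None when the cloud option is cheaper per month as well as
    up front, i.e., it never costs more.
    """
    monthly_premium = cloud_opex_monthly - onprem_opex_monthly
    if monthly_premium <= 0:
        return None  # cloud stays cheaper indefinitely
    return appliance_capex / monthly_premium

# Hypothetical example: a $12,000 controller appliance with $200/month
# upkeep versus a $500/month cloud-hosted equivalent.
print(breakeven_months(12000, 200, 500))  # 40.0 months
```

The point of the sketch is that the decision is rarely about raw cost alone; the simplified support model and avoided truck rolls shift the break-even point further in the cloud's favor than the hardware numbers suggest.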
Service provider consolidation should be good for business customers as providers compete to develop compelling services.
Businesses should proceed with caution if their providers are moving into new service areas.
2011 was a merger-heavy year that also included Windstream’s acquisition of PAETEC, CenturyLink’s purchase of Qwest and Savvis, EarthLink’s purchase of One Communications and Time Warner Cable’s purchase of NaviSite. While each carrier had its own reasons for making its acquisition – increasing size and footprint, and improving financial conditions were two big drivers – these carriers are all interested in moving upmarket, whether to win mid-size businesses or larger enterprises. Enhancing business services and building up the customer base to support this transition were key elements behind these transactions. In many cases, the acquirers were strong in mass market (i.e., residential and very small business) services, a tough market that can have high levels of churn and pricing pressure. Carriers see increased use of mobile devices in the workplace and interest in cloud-based service models as the way to move away from commodity offers toward business services, creating a revenue stream with strong growth potential. Windstream and CenturyLink are both moving from their rural local exchange carrier (LEC) roots to become national communications providers. Their respective acquisitions add business customers and services, including hosting, managed services and cloud service options. Similarly, Time Warner Cable’s NaviSite purchase added advanced services that are in demand from larger businesses. EarthLink acquired New Edge Networks in 2006; in 2010-2011 the carrier “took it up a notch”, making a string of purchases (One Communications, Deltacom and STS Telecom) to expand the reach and depth of its network footprint, and to develop managed IT services for multi-site businesses and distributed enterprises. Continue reading “Will Recent Telecom Mergers be a Boon to Enterprises?”→
The lack of broadly accepted cloud standards around security and other key operational points calls for more common definitions across the industry.
One potential source for best practices is the vendors themselves who, through efforts such as certification programs, could help set standards for the cloud.
Anticipating the unknown is a challenge in even the most conventional and mature IT environment. So in the wilds of the cloud, the potential for missteps, breaches and other issues that lead to disappointing outcomes is vast. This uncertainty – and more to the point the lack of control that many IT organizations feel they have over a virtualized on-demand service environment – has kept many enterprises from going all-in to the cloud. Continue reading “Vendor Validation: Certified for Cloud Success?”→