As Principal Analyst for Collaboration and Conferencing at Current Analysis, Brad analyzes the rapidly expanding use of collaboration software and services as a means of improving business agility, fostering employee optimization and driving business opportunities.
• Looking to build good artificial intelligence (AI)? Don’t let the speed and availability of open source frameworks, modules, libraries, and languages lull you into a false sense of confidence.
• Good AI needs to start with good data, and good data needs to be ingested, registered, described, validated, and processed well before it reaches the ready hands of AI practitioners.
These are heady times. Enterprises have at their disposal both the raw materials and the necessary tools to achieve great things with AI, be that something as grandiose as self-driving cars or as unassuming as a fraud detection algorithm. The trouble with an abundance of materials (e.g., data) and tools (e.g., open source machine learning models), however, is speed. Speed kills, as they say.
For AI practitioners, this means learning to run before learning to walk by hastily automating decisions via AI models that are built on unsound data. With a few simple open source frameworks, modules, libraries, and languages, seemingly useful but ultimately erroneous predictions and conclusions can be readily drawn from any old data set in very short order. What’s the answer? More or better tools? No. As with most human problems, good old human know-how and understanding are necessary. And that begins with data.
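The discipline the bullets above call for, validating data well before any model sees it, can be sketched as a minimal gate in plain Python. The field names, types, and the non-negative-amount rule below are purely illustrative assumptions, not taken from any particular pipeline:

```python
# Hypothetical record schema; field names and types are illustrative only.
REQUIRED_FIELDS = {"id": int, "amount": float, "country": str}

def validate(records):
    """Split raw records into (clean, rejected) before any model trains on them."""
    clean, rejected = [], []
    for rec in records:
        # Check every required field is present with the expected type.
        ok = all(
            field in rec and isinstance(rec[field], ftype)
            for field, ftype in REQUIRED_FIELDS.items()
        )
        # Reject rows with missing fields, wrong types, or impossible values.
        if ok and rec["amount"] >= 0:
            clean.append(rec)
        else:
            rejected.append(rec)
    return clean, rejected

raw = [
    {"id": 1, "amount": 19.99, "country": "US"},
    {"id": 2, "amount": -5.00, "country": "US"},   # impossible value
    {"id": 3, "country": "DE"},                    # missing field
]
clean, rejected = validate(raw)
```

The point is less the code than the ordering: the two bad rows never reach a model, so whatever the model later learns, it cannot learn from them.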
At its annual user conference, Tableau rolled out several data prep and management capabilities, highlighted by the ability for Tableau Prep Builder users to write out to third-party data stores, such as the popular Snowflake solution.
This, along with several ongoing cloud and database initiatives, marks a significant philosophical shift for the vendor away from pure analytics and toward a more complete solution to help buyers establish a company-wide data culture.
For a company powered by analytics, Tableau put very few on display during its annual user conference in Las Vegas last week. Certainly, there were numerous stats to be found, particularly relating to the adoption of Tableau Online, where there are now 15,000+ active customer accounts. What’s more, Tableau Online is maintaining 100% YoY growth, as reported by CEO Adam Selipsky – a fitting fact given that Tableau and its new, cloud-first parent company, Salesforce, are now free to talk integration and rationalization. Continue reading “Tableau Tackles Analytics at Scale, Not Through Tech Alone but with a ‘Culture of Data’”→
Just one short year after an internal reorganization to more fully meld artificial intelligence (AI) with data and analytics, IBM is back with a new, more accessible vision for IBM Watson.
This time around, the company isn’t focused on game shows or scientific discovery but instead on solving very basic, often human-centric challenges.
When it comes to chasing the market’s heated but somewhat unrequited love affair with AI, IBM has certainly done its part in terms of generating hype for its multi-billion dollar investment in IBM Watson. That hype, which has taken aim at some rather lofty goals such as identifying and diagnosing cancer, has not fully panned out, with some early adopters scaling back or halting operations altogether due to concerns over cost and efficacy. Continue reading “IBM Data and AI Forum: Say Hello to a More Accessible IBM Watson”→
Though it did not acknowledge them directly, Huawei responded to growing global criticisms at its annual Connect Conference in Shanghai by introducing several new products in support of its full-stack, all-scenario AI portfolio.
With several new solutions spanning AI training and algorithm execution hardware, cloud services, open source projects, and ecosystem investments, Huawei intends to build a vibrant, sizable, and influential ecosystem of partners.
Huawei may be facing a global and escalating chorus of scrutiny, criticism, and outright censure right now over whether or not enterprise buyers should trust the Chinese technology giant. But, here in Shanghai, China, at Huawei’s Connect 2019 conference, the skies are blue, the temperature is temperate, and the trees appear ready to don their glorious autumnal colors at any moment. This was the sentiment – only slightly paraphrased – delivered as a response to these challenges by Huawei’s Deputy Chairman and Rotating CEO, Ken Hu, during the opening keynote on Wednesday. Continue reading “Huawei Connect 2019: When It Comes to Combating Global Politics, Huawei Is Taking the Long View with Its AI Portfolio”→
• These days, everyone is doing containerization in a mad, industry-wide rush toward what appears to be true cross-cloud compatibility.
• However, enterprise buyers need to be aware that when it comes to containerization and microservices, there’s a huge difference between compatibility and capability.
Back in 1964, media futurist Marshall McLuhan penned the often repeated but somewhat baffling phrase, “the medium is the message,” in an attempt to highlight the importance of the “where and how” of storytelling. To Mr. McLuhan, a film, a novel, and a comic may all tell the exact same story about a boy and his dragon, but importantly each would do so using very different conventions regarding the unfolding of the story, let’s say the manner in which each handles flashbacks. Those differences in turn shape our understanding of the story in unique ways.
Flash forward to the present, and among technology providers, particularly those endeavoring to make the architectural leap from premises to cloud, Mr. McLuhan’s more than 55-year-old notion seems strangely applicable if not downright prophetic. Let me explain: as a global market trend, the idea of abstraction through containerization technologies like Docker has entirely reshaped the global software landscape, forever altering the way developers create software. In short, abstraction allows developers to write once and run “virtually” anywhere by turning monolithic applications into a series of highly standardized yet extremely malleable microservices. Continue reading “In a Containerized World, Does the Cloud Really Matter Anymore?”→
Starting on September 1, 2019, Microsoft will begin onboarding new Office 365 users directly into Microsoft Teams, in essence removing the option for new customers to run the soon-to-be-retired Skype for Business Online alongside Teams.
Though somewhat extreme, this migration plan has been a long time coming, frankly ever since Microsoft introduced Microsoft Teams in 2017.
• Late last week, AT&T and Samsung together cut the ribbon on a co-developed 5G Innovation Zone that had nothing at all to do with future consumer 5G opportunities.
• Rather, the new facility, housed within Samsung Austin Semiconductor’s Austin, Texas, fabrication plant, showcased several ways high-speed cellular can both modernize and optimize manufacturing processes.
If you travel a few miles northeast of Austin, Texas, you’ll find among the gentle rolling hills an undistinguished 300-acre facility dedicated to the fabrication of semiconductors (aka computer chips) for networking, high performance computing, IoT, and of course mobile devices. And if you look carefully within the foyer of this 20+ year old foundry, you’ll find a somewhat unassuming, highly rectangular room peppered with Ikea-styled demonstration tables and plain black monitors that, when considered together, scream out in all caps: “5G IS VERY REAL, RIGHT NOW!”
• There has been a significant rush among technology providers to make artificial intelligence (AI) a self-service endeavor, to make it available to the broadest possible swath of business users.
• But in so doing, companies are creating unanticipated legal exposure for AI practitioners unprepared to protect AI from human bias.
Salesforce.com has added a new AI learning module to its Trailhead developer education platform with an interesting twist. Rather than teach developers how to build AI outcomes most efficiently, the company’s newest educational module asks that practitioners slow down and focus on creating ethically informed AI solutions.
The new Trailhead educational module entitled, “Responsible Creation of Artificial Intelligence,” calls attention to an often overlooked threat from AI, namely unwitting human biases and intentional human prejudices.
Within these new training materials, Salesforce.com calls on Salesforce.com Einstein developers to adopt its own set of core values of “trust, customer success, innovation, and equality.” The company goes so far as to suggest that developers who fail to adhere to these standards in creating AI algorithms may find themselves in breach of its acceptable use policy.
Why is Salesforce.com referencing an acceptable use policy in conjunction with the ethical use of AI? Surely companies not engaged in outright nefarious endeavors would steer clear of anything overtly illegal in building AI outcomes. Certainly legislative controls such as GDPR and the California Consumer Privacy Act (CCPA) are very clear about what constitutes an unlawful use of consumer data. Companies need only adhere to such policies to avoid potential litigation or censure, right?
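Salesforce.com’s Trailhead module teaches principles rather than code, but the kind of bias check it asks practitioners to slow down for can be illustrated with one widely cited heuristic from U.S. employment law, the “four-fifths rule”: flag disparate impact when any group’s selection rate falls below 80% of the best-off group’s rate. The group labels and decision data below are invented for illustration:

```python
def selection_rates(outcomes):
    """outcomes: list of (group, selected_bool). Returns selection rate per group."""
    totals, hits = {}, {}
    for group, selected in outcomes:
        totals[group] = totals.get(group, 0) + 1
        hits[group] = hits.get(group, 0) + (1 if selected else 0)
    return {g: hits[g] / totals[g] for g in totals}

def passes_four_fifths(outcomes, threshold=0.8):
    """Flag disparate impact when any group's selection rate falls below
    `threshold` times the highest group's rate."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return all(rate >= threshold * best for rate in rates.values())

# Hypothetical model decisions: group "A" selected 80% of the time, "B" only 40%.
decisions = ([("A", True)] * 8 + [("A", False)] * 2
             + [("B", True)] * 4 + [("B", False)] * 6)
```

A screen like this catches only one narrow statistical symptom, of course; the module’s broader argument is that such checks belong in the development loop rather than in post-incident litigation.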
• In order to do AI, IoT, and other big data-dependent projects right, companies are moving beyond the confines of traditional relational databases.
• Two recent, related partnerships between highly specialized “graph” database developers, Neo4J and TigerGraph, and public cloud platform providers, Amazon and Google, underscore the importance of surfacing insights that would otherwise remain hidden within traditional database architectures.
Organizations anxious to put AI to work as a means of driving innovation must first invest in big data. AI algorithms and predictive models are nothing without a constant influx of high quality data. The trouble is that not all data is created equal, at least in terms of its ability to match the demands of a given initiative, be that AI, IoT, mobility, or edge computing.
Such specific demands in turn drive the adoption of highly specialized data architecture, extending down to the database itself. There are traditional relational databases as well as those specializing in key-value storage, document storage, in-memory processing, time-series evaluation, transaction ledgers, and graph analysis. Each in turn solves very specific problems – e.g., self-driving cars won’t work without an underlying database capable of performing time-series analysis. Continue reading “Graph DB Makers Neo4J and TigerGraph Explore Bring Your Own Database Cloud Options”→
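What makes graph analysis a distinct specialty is the multi-hop relationship query, the sort that forces a chain of self-joins in a relational schema. Neither Neo4J’s Cypher nor TigerGraph’s GSQL is shown here; instead, a plain-Python breadth-first traversal over an invented toy graph sketches the shape of the question graph databases are built to answer in a single query:

```python
from collections import deque

# Toy social graph as an adjacency list; names are illustrative only.
GRAPH = {
    "alice": ["bob", "carol"],
    "bob": ["alice", "dave"],
    "carol": ["alice", "dave"],
    "dave": ["bob", "carol", "erin"],
    "erin": ["dave"],
}

def within_hops(graph, start, max_hops):
    """Return every node reachable from `start` in at most `max_hops` edges –
    the multi-hop traversal a graph database answers natively."""
    seen = {start: 0}               # node -> hop distance from start
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if seen[node] == max_hops:  # don't expand past the hop limit
            continue
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen[neighbor] = seen[node] + 1
                queue.append(neighbor)
    return {n for n, d in seen.items() if 0 < d <= max_hops}
```

In a native graph store, edges are first-class and each hop is a pointer dereference rather than a join, which is why such queries stay fast as the hop count grows.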