id (int64) | year (int64) | title (string) | url (string) | text (string) |
---|---|---|---|---|
15667 | 2019 | "Google Cloud announces 7 open source partners, Seoul and Salt Lake City regions | VentureBeat" | "https://venturebeat.com/2019/04/09/google-cloud-announces-7-open-source-partners-seoul-and-salt-lake-city-regions" |
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Google Cloud announces 7 open source partners, Seoul and Salt Lake City regions Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
At Google Cloud Next 2019 today, Google signed strategic partnerships with seven “leading open source-centric companies”: Confluent , DataStax , Elastic , InfluxData , MongoDB , Neo4j , and Redis Labs.
Each partnership has its own timeline, but the company expects them all to roll out “over the next few months.” Google also announced new Google Cloud Platform (GCP) regions opening in 2020: Seoul and Salt Lake City.
Google says it believes in an “ open cloud ” and that “open source is the future of public cloud.” Indeed, the company has made open source contributions via projects like Kubernetes, TensorFlow, Go, and so on. But Google doesn’t want to just open-source its own projects — it also wants to partner with companies that are building open source projects.
Google will be offering fully managed services for these partners that are tightly integrated into GCP. That means a single user interface (including the ability to provision and manage the service from the Google Cloud Console), unified billing (one invoice from Google Cloud that includes the partners’ services), and Google Cloud support (manage and log support tickets in a single window). Google will also work with its partners to build integrations with native GCP services like Stackdriver and IAM, validating for security and optimizing performance.
Data management and analytics partnerships Google hopes that this will make it easier for enterprise customers to build on open source technologies. The seven partnerships are all with data management and analytics companies — here is how Google describes them: Confluent : Founded by the team that built Apache Kafka, Confluent builds an event streaming platform that lets companies easily access data as real-time streams.
DataStax : Powers enterprises with its always-on, distributed cloud database built on Apache Cassandra and designed for hybrid cloud.
Elastic : As the creators of the Elastic Stack, Elastic builds self-managed and SaaS offerings that make data usable in real time and at scale for search use cases, like logging, security, and analytics.
InfluxData : Its time series platform can instrument, observe, learn and automate any system, application, and business process across a variety of use cases. InfluxDB is an open-source time series database optimized for fast, high-availability storage and retrieval of time series data in fields such as operations monitoring, application metrics, IoT sensor data, and real-time analytics.
MongoDB : A modern, general-purpose database platform that brings software and data to developers and the applications they build, with a flexible model and control over data location.
Neo4j : A native graph database platform specifically optimized to map, store, and traverse networks of highly connected data to reveal invisible contexts and hidden relationships. By analyzing data points and the connections between them, Neo4j powers real-time applications.
Redis Labs : The home of Redis, the world’s most popular in-memory database, and commercial provider of Redis Enterprise. It offers performance, reliability, and flexibility for personalization, machine learning, IoT, search, e-commerce, social, and metering solutions worldwide.
Google says its customers regularly ask to use open source technology in a cloud-native way. By offering a similar experience to its native GCP services, the company is delivering on that request at scale.
New cloud regions in 2020 Google is already planning to open an Osaka, Japan region in the coming weeks and last month announced a Jakarta, Indonesia region will launch in the first half of 2020. But this is Google Cloud Next, where the company always has more regions to announce.
Google will be adding cloud regions in Seoul, South Korea (its eighth in Asia Pacific) and Salt Lake City, Utah (its sixth in the U.S.) in 2020. The company didn’t give exact dates, merely saying that the Seoul region will be usable “in early 2020” followed by the Salt Lake City region “shortly thereafter.” Each new cloud region will offer three zones at launch and will include key GCP products.
Google Cloud Platform is currently in 19 regions ( cloud locations ). If you add the announced regions, the number grows to 23. Oh, and Google promised there will be “more region announcements” this year.
The big three cloud providers count regions differently, so they’re not easy to compare. But for the record, Amazon currently advertises 20 AWS regions and Microsoft markets 54 Azure regions.
VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact.
Discover our Briefings.
The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat.
All rights reserved.
"
|
15668 | 2021 | "Database trends: The rise of the time-series database | VentureBeat" | "https://venturebeat.com/2021/01/15/database-trends-the-rise-of-the-time-series-database" |
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Database trends: The rise of the time-series database Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
The problem: Your mobile app just went viral, and you’ve got a boatload of new users flooding your servers with a bazillion packets of data. How can you store this firehose of sensor data? Is there a way to deliver some value with statistical analysis? Can you do this all on a budget with a well-tuned database that won’t drive the price of supporting the widget through the roof? The time-series database (TSDB) is designed to handle these endless streams, and it’s one of the most notable current trends in database technology.
It gives developers a tool for tracking the bits flowing from highly interactive websites and devices connected to the internet. It adds strong algorithms for fast queries for statistical analysis, which makes it popular for tackling problems like online ad selection and smart device support.
The TSDB has grown in popularity in recent years, and last year it was the fastest-growing type of database in the enterprise , largely because of the growing number of use cases for it. After all, time-series data is a sequence of data points collected over time, giving you the ability to track changes over that period — and that’s what you need to do if you’re running sophisticated transactions like advertising, ecommerce, supply chain management, and more.
What are some other major use cases for a TSDB? Appliance makers are adding internet connections to add a bit of zip to their product lines, and now these devices are all phoning home to report data so that customers can manage them from their phones anywhere, anytime.
Mobility is becoming an extension of the cloud. The rent-by-minute scooters and ride-sharing platforms track users before, during, and after the ride. All of these data points can be studied to improve performance and plan deployments for future demands.
Many documents are slowly turning from a single block of data into a stream of changes. Word processors that used to store the current version of a document are now recording every keystroke and mouse click that produced them. This makes editing simpler, with infinite levels of “undo” available.
Houses are becoming more digital, and many items that were once little more than a switch (e.g., a thermostat, lamp, or television) are now recording events every second or even more often.
What makes a TSDB shine? First, datasets are large and getting larger. Log files are measured in petabytes now, and they’re growing. Devices from the so-called internet of things (IoT) are proliferating, and they’re often designed to rely on a central service for analysis and presentation. Sense.com, for instance, collects information on electrical consumption in houses millions of times per second. When these bits are reported, Sense.com’s central database must store enough data to be useful but not enough to overwhelm the storage.
Time-series datasets also tend to have fewer relationships between entries in different tables, so they rarely require transaction-based locking to avoid inconsistencies. Most data packets contain a timestamp, several sensor readings, and not much more.
This allows special indices to speed queries like the number of events in a day, week, or other time period. Good time-series indices can offer quick answers to statistical questions about ranges of data.
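The idea of a time index answering range queries quickly can be sketched in a few lines. This is an illustrative Python toy, not any particular database's implementation: a sorted list of timestamps lets a range query run in O(log n) via binary search instead of scanning every row.

```python
# Sketch of a minimal time index, assuming points arrive as (timestamp,
# value) pairs in time order. bisect gives O(log n) range boundaries.
import bisect

class TimeIndex:
    def __init__(self):
        self.timestamps = []  # kept sorted; append-only streams stay sorted
        self.values = []

    def insert(self, ts, value):
        self.timestamps.append(ts)
        self.values.append(value)

    def count_between(self, start, end):
        """Number of events with start <= ts < end."""
        lo = bisect.bisect_left(self.timestamps, start)
        hi = bisect.bisect_left(self.timestamps, end)
        return hi - lo

    def mean_between(self, start, end):
        """Mean value over the half-open window [start, end)."""
        lo = bisect.bisect_left(self.timestamps, start)
        hi = bisect.bisect_left(self.timestamps, end)
        window = self.values[lo:hi]
        return sum(window) / len(window) if window else None

idx = TimeIndex()
for ts, v in [(100, 2.0), (160, 4.0), (220, 6.0), (300, 8.0)]:
    idx.insert(ts, v)
```

Real engines layer per-bucket pre-aggregates on top of this, so "events per day" can be answered without touching the raw points at all.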
The databases can also automate much of their own upkeep, because the maintenance chores are regular and easy to script. They can dispose of old data automatically while continuing to deliver fresh statistics. Where standard databases are designed to store data forever, time-series databases can give each data element a specific time to live. Others use a round-robin algorithm to store a fixed-size set.
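The time-to-live retention described above can be sketched as follows. All names here are illustrative, not any specific database's API; the point is only that eviction is a cheap, mechanical chore when data is ordered by time.

```python
# Sketch of a time-to-live retention policy: points whose age reaches the
# TTL are dropped automatically as new points arrive.
from collections import deque

class TTLSeries:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.points = deque()  # (timestamp, value), oldest first

    def append(self, ts, value):
        self.points.append((ts, value))
        self._evict(now=ts)

    def _evict(self, now):
        # Drop points whose age has reached the TTL.
        while self.points and self.points[0][0] <= now - self.ttl:
            self.points.popleft()

series = TTLSeries(ttl_seconds=60)
series.append(0, 1.0)
series.append(30, 2.0)
series.append(90, 3.0)  # evicts the points at t=0 and t=30 (age >= 60s)
```

A round-robin store is even simpler: `deque(maxlen=n)` keeps a fixed-size window and discards the oldest element on every append.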
The databases also deploy specialized compression functions that store time-series data in less space. If sensor readings don't change from millisecond to millisecond, there's no reason to store another copy of the same value. Timescale.com, for instance, boasts of 94%-97% savings in storage thanks to compression algorithms tuned to these regular data patterns.
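The compression idea above, that unchanging readings need not be stored again, can be sketched with simple run-length encoding. Production engines use more elaborate schemes (delta-of-delta timestamps, XOR-compressed floats), but the savings come from the same regularity in the data.

```python
# Sketch: run-length encoding for a mostly-flat sensor stream. Each run of
# identical values is stored once as a [value, count] pair.
def rle_encode(values):
    runs = []
    for v in values:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1  # extend the current run
        else:
            runs.append([v, 1])  # start a new run
    return runs

def rle_decode(runs):
    out = []
    for v, n in runs:
        out.extend([v] * n)
    return out

readings = [21.5] * 1000 + [21.6] * 500  # a thermostat that barely moves
encoded = rle_encode(readings)  # 1,500 samples collapse to 2 pairs
```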
Who benefits the most? Tracking how people, machines, and organizations behave over time is the key to customization. Time-series databases that optimize the collection and analysis of time-series data open up the opportunity to provide business models that adjust and avoid one-size-fits-all standardization. Algorithms that place advertising, for instance, can look at recent behavior. Intelligent devices like thermostats can search through events and understand what people want at different times of the day.
How are legacy players approaching it? All major databases have long had fields that store dates and times. All of the traditional queries for searching or tabulating the data still work with these entries. Oracle databases, for example, have been popular on Wall Street for storing regular price quotes. They aren’t optimized like the new databases, but that doesn’t mean that they can’t answer the questions with a bit more computational power. Sometimes it’s cheaper to buy bigger machines than switch to a new database.
Some applications may collect a variety of data values, and some may be best suited to the stability of a traditional database. Banking applications, for instance, are filled with ledger transactions that are just time-series tables of the total deposits. Still, bank developers can be some of the most conservative, and they may prefer a legacy database with a long history over a new tool with better efficiencies.
Sometimes the traditional companies are rolling out newer models that compete. Oracle, for instance, is also tuning its NoSQL database to search and analyze the time-series data streams from sensors and other real-time sources. The API will maintain a running collection of fresh data points and enforce time-to-live control over the data to avoid overloading the storage.
The newer data analysis engines often include tools specifically built for time-series data. For example, Microsoft’s Data Mining tool for its SQL Server has a collection of functions that can look at historical data and predict future trends.
The cloud companies are also adding data storage services for this market. AWS, for example, launched its Timestream service, a tool optimized for IoT data. It will also integrate with the rest of the AWS stack through standard pathways like the Lambda functions, as well as customized ones for machine learning options like SageMaker.
Which new startups are emerging? New companies see an opportunity in adding the right amount of indexing and post-processing to make queries fast and effective.
InfluxDB began as an open source project and is now available as either a standalone installation or an elastic serverless option from the InfluxDB Cloud. The company’s Flux query language simplifies tasks like computing the moving averages of the data stream. The language is functional and designed to be easily composable so queries can be built up from other queries.
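Flux syntax itself is beyond this article, but the moving-average computation it simplifies can be illustrated in Python, in the same spirit of small, composable transforms. The generator below is a stand-in sketch, not Flux.

```python
# Sketch of a simple moving average over a stream, as a composable step:
# it consumes any iterable of numbers and yields one smoothed value per
# input value, so stages can be chained like query functions.
from collections import deque

def moving_average(stream, n):
    """Yield the mean of the last n values seen so far."""
    window = deque(maxlen=n)  # old values fall out automatically
    for value in stream:
        window.append(value)
        yield sum(window) / len(window)

prices = [10.0, 12.0, 14.0, 16.0]
smoothed = list(moving_average(prices, n=2))
```

Because each stage takes and returns a plain iterable, a pipeline like `moving_average(threshold_filter(raw), n=5)` composes naturally, which is the property the article attributes to Flux queries.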
Timescale DB is a separate engine that is fully integrated with PostgreSQL for tasks that might need traditional relational tables and time-series data. The company’s benchmarks boast of speeding up ingesting data by a factor of 20. The queries for searching the data or identifying significant values like maxima can be thousands of times faster.
Prometheus stores all data with a timestamp automatically and provides a set of standard queries for analyzing changes in the data. Its PromQL bears some resemblance to the emerging data format for queries, GraphQL. This makes it simple for developers to set up alerts that could be triggered by data anomalies.
Redis created a special module for ingesting the rapid data flows into the database. The indexing routines build a set of average statistics about the data’s evolution. To save memory, it can also downsample or aggregate the elements.
Kdb+ , a database that’s the foundation of the Kx platform, maintains a connection with relational databases that makes it simpler to work with some of the relational schema that dominate some applications. The streaming analytics built by the database offer both traditional statistics and also some machine learning algorithms.
What’s next? Open source projects and startups have many of the same goals as other tech projects. They all want to find ways to handle bigger data streams with more complicated analytics that are run in more efficient silos — bigger, faster, smarter, and cheaper.
Beyond that, groups are starting to think about the long-term custodial responsibilities that the endless streams might require. The Whisper open source database , for instance, is designed to gracefully turn high-resolution data that might be compiled from a rapid stream into a lower-resolution, historical summary that can be stored and studied more efficiently over time. The goal is to save space while still providing useful summaries. The database is, in essence, deliberately saving summaries and disposing of the information that was originally entrusted to it.
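The graceful-degradation idea behind Whisper, rolling high-resolution points up into coarser summaries, can be sketched as bucketed aggregation. Bucket boundaries and the mean() rollup below are illustrative choices, not Whisper's actual format.

```python
# Sketch: downsample (timestamp, value) points into per-bucket means, so
# old high-resolution data can be replaced by a compact summary.
def downsample(points, bucket_seconds):
    """Aggregate raw points into one mean value per time bucket."""
    buckets = {}
    for ts, value in points:
        key = ts - (ts % bucket_seconds)  # start of this point's bucket
        buckets.setdefault(key, []).append(value)
    return {k: sum(vs) / len(vs) for k, vs in sorted(buckets.items())}

raw = [(0, 1.0), (10, 3.0), (60, 5.0), (70, 7.0)]  # second-level samples
per_minute = downsample(raw, bucket_seconds=60)    # minute-level summary
```

A custodial policy then becomes a schedule: keep raw data for a week, minute summaries for a year, and so on, trading resolution for storage as data ages.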
The companies are debating the language used by developers to write queries.
QuestDB is revisiting and extending SQL by adding features for grouping and analyzing data by time. It believes that SQL is a language that will live on, in part because so many DBAs know it.
Other companies are building specialized languages that are closer to functional programming languages. For example, InfluxDB’s Flux language encourages developers to compose their solutions out of multiple smaller, reusable functions.
The companies will also be pushing to extend the presentation layer. Many of the databases are already loosely coupled with graphical dashboards like Grafana.
These connections will grow deeper, and in many cases the tools will effectively merge with the time-series database.
Matomo , for instance, is presented as a product for tracking visitors to websites.
Is there anything a TSDB can’t do? In a sense, all databases are time-series databases because they maintain a log of the transactions that build up the table. The real question is which applications need to track how data changes over time. Many traditional databases were concerned only with the current state. They tracked, for instance, how many empty seats were left on the airplane. That’s the most important detail for selling tickets.
But sometimes there are hidden opportunities in even these applications. For instance, tracking when the tickets are sold can help pricing strategies in the future because airlines can know whether demand is running ahead or behind historical norms. In this sense, even traditional applications that don’t seem to need to track changes over time might be improved. The time-series databases might just be an opportunity.
This article is part of a series on enterprise database technology trends.
"
|
15669 | 2013 | "Google + Facebook = 70 percent of all mobile ad revenues worldwide | VentureBeat" | "https://venturebeat.com/2020/06/03/google-face" |
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Google + Facebook = 70 percent of all mobile ad revenues worldwide Share on Facebook Share on X Share on LinkedIn The requisite Google couch.
Are you looking to showcase your brand in front of the brightest minds of the gaming industry? Consider getting a custom GamesBeat sponsorship.
Learn more.
By itself, Google accounts for 56 percent of all global mobile ad revenues. Social giant Facebook takes a much smaller chunk with 13 percent. But together, they own the lion’s share of mobile ad dollars.
Advertisers spent $8.8 billion hawking their wares via mobile ads last year, according to new numbers from eMarketer.
Of that, Google raked in $4.61 billion — triple what it made in 2011. And eMarketer estimates that Google will almost double mobile revenues in 2013 to a staggering $8.85 billion. While Facebook’s numbers are much smaller, the company is also much younger and much newer to the mobile ad marketplace. As of just two years ago, Facebook had no mobile ad revenue at all.
Zilch. Nada.
Now Facebook is projected to bring in just over $2 billion in mobile ads, up a massive 333 percent from the $470 million it earned in 2012. In comparison, fellow monetization newcomer Twitter brought in $140 million in 2012 and is projected to double that to $310 million in 2013.
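The growth figures quoted above follow from simple percentage arithmetic; here is a quick check using the article's numbers (the $2,035M figure for Facebook's 2013 mobile revenue is inferred from "just over $2 billion" and the stated 333 percent growth, so treat it as approximate).

```python
# Quick arithmetic check of the quoted growth rates:
# growth percent = (new - old) / old * 100
def growth_percent(old, new):
    return (new - old) / old * 100

facebook_2012, facebook_2013 = 470, 2_035  # $M; 2013 inferred from +333%
twitter_2012, twitter_2013 = 140, 310      # $M, from the article

fb_growth = growth_percent(facebook_2012, facebook_2013)  # ~333%
tw_growth = growth_percent(twitter_2012, twitter_2013)    # ~121%
```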
Whichever way you slice it, the mobile ad pie is growing fast, and relatively new companies — even Google is just 15 years old — are taking bigger and bigger swaths.
In terms of total digital ad spend, Google is also the unquestioned leader.
One out of every three dollars spent on digital ads is spent with Google. The search-plus-mobile-plus-designer-eyewear-plus-advertising giant (plus a little bit of everything else) took home $32.73 billion in digital ad revenues in 2012. That’s in a market with a total size of $104 billion.
In 2013, eMarketer estimates that Google will grow dollars and share to $39 billion and 33 percent.
Facebook will also grow by about 20 percent from about $4.3 billion in total digital ad revenue in 2012 to almost $6 billion, taking in 5 percent of the overall digital pie. Yahoo, Microsoft, Barry Diller's InterActiveCorp, AOL, Amazon, Pandora, Twitter, and LinkedIn will fill out the top 10.
Interestingly, however, when you look at the fastest growers, the leader board reverses.
Twitter, which was the fastest-growing in 2012, will also grow the most in 2013, eMarketer says, by just over 100 percent each year. Internet radio leader Pandora and professional networking giant LinkedIn will both grow by about 50 percent, while Google will grow by only 18.6 percent.
Naturally, that’s due to the law of large numbers: It’s hard to keep doubling when you get to revenues that start with a capital B.
But it’s also a sign that the young companies are strong and growing.
Image credit: John Koetsier
"
|
15670 | 2021 | "5 ways to finally fix data privacy in America | VentureBeat" | "https://venturebeat.com/2021/01/29/5-ways-to-finally-fix-data-privacy-in-america" |
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Guest 5 ways to finally fix data privacy in America Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
As a new administration enters the White House, we have the chance to finally fix privacy in America. Short of passing a national privacy law (which the majority of Americans want), we need action on data privacy. We need changes enacted swiftly and without delay. Both consumers and businesses deserve consistency and clarity.
As business leaders, we must put our customers first. We should be customer-centric in our thinking and unwavering in our support for greater controls that deflate the “us vs. them” mentality pervasive in privacy. Managing privacy shouldn’t be complicated and confusing. It should be as simple and straightforward as reasonably possible.
In that spirit, here are five ideas to make privacy work for consumers and businesses. A couple of them could likely even be mandated by Executive Order. One thing is sure: American companies are creative and innovative. We can protect an individual’s right to privacy while still delivering services and products that consumers want — at a profitable price. What are we waiting for? 1. Make privacy opt-in It’s a sad state of affairs when companies have to trick you into agreeing to share your personal information to use their services or gain access to a website. Nearly everywhere you look, you have to opt out of data sharing. Flip that on its head: privacy itself is the opt-in experience; you have to “opt in” to protect your privacy because there is no default assumption of data privacy.
And you have to declare your privacy preferences with every single company. You have to tell each company that you want data privacy by telling them how to use your data. In essence, you’re opting into privacy by opting out of their data sharing. This needs to change. Data privacy should be the default. The onus is then on companies to show the benefits of data sharing so that consumers can actively choose to opt in to share their personal information.
2. Require plain English privacy policies Privacy policies are dense documents thick with legalese. They’re so hard to understand that few people actually read them and so long that it would take 76 working days to read the policies you encounter in a single year. Incredible.
How in the world are we putting data privacy on the shoulders of consumers when it’s the companies that are getting the most benefit from invading our data privacy? They should be honest and transparent; we shouldn’t need lawyers to understand what we’re consenting to.
Companies should have privacy policies that are written plainly in easy-to-understand language. Businesses can put these policies into centralized privacy hubs. These hubs show users how their data is collected, stored, and used, as well as one-click privacy controls to manage their consent. Plain and simple language, with easy navigation in one location — that’s the answer.
3. Mandate privacy labels Apple is absolutely on the right track here. The company’s requirement for app developers to clearly define how apps use data is a watershed moment for privacy in America. However, when one company acts alone, it doesn’t create a shared environment of trust. Even if other companies follow suit, we will only have a patchwork of privacy labels that is equally dense as our current system. Instead, we should mandate privacy labels just like we do nutrition labels on foods.
Every company should explain its data usage with privacy labels that are consistent in content and conspicuous in placement — as in, the labels have the same layout and are easily located. When you flip over a product in the supermarket, you know what you’re going to find and where. Privacy should be the same: You should know what you’re signing up for in a consistent way across services.
4. Give data an expiration date What if we simply required companies to allow each of us to set personal limits on data storage and usage? We could refine our data privacy settings in a more granular way, controlling our data destiny by deciding what data specific companies can use and for how long. Google has already started this in some products; all companies should follow suit.
If all data had an expiration date, it would prevent algorithms from using that data after the consumer has requested its deletion. Think about it: Even if you ask a company to stop using your data, it likely lives on in black-box algorithms. If data had an expiration date, it would rebalance the power away from the algorithms and towards humans.
5. Make protecting data cheaper than abusing it Data needs to be protected, plain and simple. When the FTC settled with Flo, the fertility app accused of misleading consumers about data usage, it highlighted what we’d known all along: Many companies, especially health and fitness trackers , know more about us than we know about ourselves.
And yet we have no idea how well businesses are protecting our personal data. Europe’s privacy law, the GDPR, requires data protection as a default — and the law makes non-compliance costly. Fines range from 10 million euros or 2% of worldwide annual revenue up to 20 million euros or 4% of revenue, whichever is higher. Those fines have also increased 40% year-over-year. While the U.S. fined Facebook $5 billion for abusing customer data in 2019 (a record fine), we need a consistent penalty framework that makes privacy protection less costly than privacy violation.
Companies should rightly be penalized when they violate our trust. We must align the disincentives with the externalities caused by privacy abuses. When it’s less expensive to pay fines than implement sound privacy practices, we have a serious problem.
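The GDPR penalty structure mentioned above reduces to simple arithmetic: each tier's cap is the greater of a flat amount or a share of worldwide annual revenue. A sketch, with the tier figures taken from the text:

```python
# Sketch of the two GDPR fine tiers: each cap is the greater of a flat
# amount or a percentage of worldwide annual revenue (figures in EUR).
def gdpr_max_fine(annual_revenue_eur, severe=False):
    if severe:  # upper tier: 20M EUR or 4% of revenue, whichever is higher
        return max(20_000_000, 0.04 * annual_revenue_eur)
    # lower tier: 10M EUR or 2% of revenue, whichever is higher
    return max(10_000_000, 0.02 * annual_revenue_eur)

# For a company with 2 billion EUR revenue, the percentage term dominates.
cap = gdpr_max_fine(2_000_000_000, severe=True)  # 80 million EUR
```

This is why the revenue-linked percentage matters: for large firms the flat floor is negligible, so the deterrent scales with the size of the business.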
To fix privacy in America, we have to shift the burden of privacy management from consumers to companies. Privacy is a human right and should be a de facto facet of the internet — not something that we have to fight for at every turn of our online journeys. Privacy protection should be a mandatory part of doing business in America — not an optional afterthought.
Harry Maugans is the CEO of Privacy Bee.
His vision for the future of privacy is a world in which consumers have total transparency and control over their data footprints. He’s contributed to HackerNoon, AllBusiness, IB Times and ReadWrite.
"
|
15,671 | 2,020 |
"Blue Prism raises over $120 million to bolster its robotic process automation suite | VentureBeat"
|
"https://venturebeat.com/2020/04/21/blue-prism-raises-over-120-million-to-bolster-its-robotic-process-automation-suite"
|
"Blue Prism raises over $120 million to bolster its robotic process automation suite
In a sign of the robotic process automation market’s continued strength in the face of an economic downturn, Blue Prism today announced that it has raised £100 million ($124 million) in equity financing at a valuation of around £1 billion ($1.24 billion). Chair and CEO Jason Kingdon says the fresh capital will be used to strengthen Blue Prism’s balance sheet while allowing investment in the company’s automation suite.
Robotic process automation — an industry that’s anticipated to be worth $10.7 billion by 2027, according to Grand View Research — is a form of workflow automation technology that taps AI to tackle digital tasks previously performed by humans. It’s recently come to the fore in light of the coronavirus pandemic — last month, Blue Prism launched a COVID-19 response task team, which worked with the U.K.’s National Health Service; the University of California, San Francisco; and the Leeds Building Society to automate HR, personnel, finance, vaccine development, and other health care support functions.
“In this environment, our [RPA solution is] arguably more important than ever in driving organizational adaptation and resilience, and our role as a strategic technology partner to our customers in many ways becomes more vital,” Kingdon told VentureBeat via email. “The duration and impact of this pandemic are at this stage unknown, and as a result we are taking action to invest and reinforce our product differentiation in preparation for the opportunities [that] will occur both in the short- and longer-term.” Blue Prism was founded in 2001 by a group of automation experts to develop tech that could be used to improve organizational efficiency. In 2003, their first commercial product — Automate — launched in general availability, and in 2016 Blue Prism became a publicly traded company with a listing on the London Stock Exchange.
Blue Prism’s eponymous platform, which is built on Microsoft .NET, automates virtually any app on any platform — including mainframes, Windows machines, and the web — in environments ranging from terminal emulators to web browsers and services. It’s designed for multi-environment deployment models with both physical and logical access controls, with a centralized release management interface and process change distribution framework that provide a level of visibility.
The Blue Prism platform records system logins, changes in management, decisions, and actions taken by its software robots to identify statistics and operational analytics. It supports regulatory, security, and governance contexts such as PCI-DSS, HIPAA, and SOX, and its process coding is automated on the backend to allow users to program processes using a drag-and-drop interface.
Blue Prism customers gain access to a reusable library of processes and objects for developing automations. It’s also scalable — the company says its top 50 clients use 500 software robots, on average. Blue Prism’s Process Discovery module takes snapshots of work queues at defined periods to gather activity and metrics and collates everything in a shareable dashboard. And its Decipher module, which recently exited beta, ingests, extracts, and transforms data from documents like vendor contracts, claims forms, emails, spreadsheets, purchase orders, and field reports.
Enterprise customers gain access to Blue Prism Cloud, a fully integrated, software-as-a-service offering with pre-integrated skills. From the Blue Prism Cloud Hub, they’re able to access a window with live feed overviews and environment-specific analytics for utilization, process completion times, and performance against service-level agreements. Enterprise customers also get access to Wireframer, an automation builder that prepopulates automations and helps reduce design time by a claimed 70%, as well as Blue Prism Cloud IADA, which leverages AI to adjust on-premises and cloud resource utilization by taking into account network infrastructure and app performance.
Last year, Blue Prism launched a new AI engine, an updated marketplace for extensions (Blue Prism Digital Exchange), and a lab for AI research and development (Blue Prism Labs). Connectors to AI tools from Amazon, Google, and IBM joined marketplace tools that gave partners and customers the ability to create, share, and deploy plugins for the Blue Prism platform.
Blue Prism competes against heavyweights like UiPath, which in April nabbed $568 million at a $7 billion valuation for its suite of AI-imbued process automation tools, and Automation Anywhere, which raised $290 million at a $6.8 billion valuation. Elsewhere, Kryon secured $40 million, Softmotive pulled together a $25 million tranche from a host of investors, and Automation Hero secured $14.5 million.
But Blue Prism claims it has a leg up with respect to success and renewal rate. The company reports that 96% of customers opt to re-up service and that 90% of its certified partners report that they’re satisfied with the platform.
The funding — which brings Blue Prism’s total raised to nearly $200 million — comes after it achieved the fastest revenue growth of all large U.K. public software companies for the fourth consecutive year in 2019. This included an 83% increase in revenue to £101 million ($125.74 million) in the first half of 2019 and £137 million ($170 million) today. During that same time frame, Blue Prism’s customer base grew to 1,677 enterprise accounts as it added 700 new clients. (As of today, Blue Prism has 1,819 customers, including Microsoft, Accenture, Google, IBM, Heineken, and Jaguar Land Rover.) In addition to its headquarters in Warrington, Blue Prism has offices in London, Austin, Sydney, Paris, Munich, and Washington. It employs just over 1,000 people.
"
|
15,672 | 2,021 |
"IBM bets homomorphic encryption is ready to deliver stronger data security for early adopters | VentureBeat"
|
"https://venturebeat.com/2021/04/03/ibm-bets-homomorphic-encryption-is-ready-to-deliver-stronger-data-security-for-early-adopters"
|
"IBM bets homomorphic encryption is ready to deliver stronger data security for early adopters
The topics of security and data have become almost inseparable as enterprises move more workloads to the cloud. But unlocking new uses for that data, particularly driving richer AI and machine learning, will require next-generation security.
To that end, companies have been developing confidential computing to allow data to remain encrypted while it is being processed. But as a complement to that, a security process known as fully homomorphic encryption is now on the verge of making its way out of the labs and into the hands of early adopters after a long gestation period.
Researchers like homomorphic encryption because it provides a certain type of security that can follow the data throughout its journey across systems. In contrast, confidential computing tends to be more reliant upon special hardware that can be powerful but is also limiting in some respects.
Companies such as Microsoft and Intel have been big proponents of homomorphic encryption.
Last December, IBM made a splash when it released its first homomorphic encryption services.
That package included educational material, support, and prototyping environments for companies that want to experiment.
In a recent media presentation on the future of cryptography, IBM director of strategy and emerging technology Eric Maass explained why the company is so bullish on “fully homomorphic encryption” (FHE).
“FHE is a unique form of encryption, and it’s going to allow us to compute upon data that’s still in an encrypted state,” Maass said.
Evolving encryption
First, some context. There are three general categories of encryption. The two classic ones are encryption for data at rest — that is, stored data — and encryption for data in transit, which protects the confidentiality of data as it’s being transmitted over a network.
The third one is the piece that has been missing: the ability to compute on that data while it’s still encrypted.
That last one is key to unlocking all sorts of new use cases. That’s because until now, for someone to process that data, it would have to be unencrypted, which creates a window of vulnerability. That makes companies reluctant to share highly sensitive data involving finance or health.
“With FHE, the ability to actually keep the data encrypted and never exposing it during the computation process, this has been somewhat akin to a missing leg in a three-legged crypto stool,” Maass said. “We’ve had the ability to encrypt the data at rest and in transit, but we have not historically had the ability to keep the data encrypted while it’s being utilized.” With FHE, the data can remain encrypted while being used by an application. Imagine, for instance, a navigation app on a phone that can give directions without actually being able to see any personal information or location.
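As a sketch of what "computing on encrypted data" means, the toy Paillier scheme below is additively homomorphic: multiplying two ciphertexts produces a ciphertext of the sum of the plaintexts, so an untrusted party can add values it never sees in the clear. This is an illustration only; Paillier supports a single operation, whereas the fully homomorphic schemes IBM describes support arbitrary computation, and the tiny primes here are purely for demonstration:

```python
import math
import random

# Toy Paillier cryptosystem (additively homomorphic only).
p, q = 47, 59                  # demo primes; real deployments use >= 1024-bit primes
n = p * q
n2 = n * n
g = n + 1                      # standard generator choice for Paillier
lam = math.lcm(p - 1, q - 1)   # Carmichael's function of n
mu = pow(lam, -1, n)           # modular inverse used in decryption

def encrypt(m: int) -> int:
    """Encrypt m (0 <= m < n) with fresh randomness r coprime to n."""
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    L = (pow(c, lam, n2) - 1) // n   # L(x) = (x - 1) / n
    return (L * mu) % n

m1, m2 = 12, 30
c1, c2 = encrypt(m1), encrypt(m2)
# Multiplying ciphertexts adds the underlying plaintexts -- the sum is
# computed without ever decrypting the inputs.
assert decrypt((c1 * c2) % n2) == (m1 + m2) % n
print(decrypt((c1 * c2) % n2))  # 42
```

A cloud service holding only `c1` and `c2` could return the encrypted sum to the data owner, who alone holds the decryption key; that is the property the navigation-app example relies on, generalized in FHE to arbitrary functions.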
Companies are potentially interested in FHE because it would allow them to apply AI to data, such as from finance and health, while being able to promise users that the company has no way to actually view or access the underlying data.
While the concept of homomorphic encryption has been of interest for decades, the problem is that FHE has taken a huge amount of compute power, so much so that it has been too expensive to be practicable.
But researchers have made big advances in recent years.
For instance, Maass noted that in 2011, it took 30 minutes to process a single bit using FHE. By 2015, researchers could compare two entire human genomes using FHE in less than an hour.
“IBM has been working on FHE for more than a decade, and we’re finally reaching an apex where we believe this is ready for clients to begin adopting in a more widespread manner,” Maass said. “And that becomes the next challenge: widespread adoption. There are currently very few organizations here that have the skills and expertise to use FHE.”
FHE is ready for its close-up
During the presentation, AI security group manager Omri Soceanu ran an FHE simulation involving health data being transferred to a hospital. In this scenario, an AI algorithm was used to analyze DNA for genetic issues that may reveal risks for prior medical conditions.
That patient data would typically have to be decrypted first, which could raise both regulatory and privacy issues. But with FHE, it remains encrypted, thus avoiding those issues. In this case, the data is sent encrypted and remains so while being analyzed, and the results are also returned in an encrypted state.
It’s important to note that this system was put in place using just a dozen lines of code, a big reduction from the hundreds of lines of code that have been required until recently. By reducing that complexity, IBM wants to make FHE more accessible to teams that don’t necessarily have cryptography expertise.
Finally, Soceanu explained that the simulation was completed in 0.069 seconds. Just five years ago, the same simulation took a few hours, he said.
“Working on FHE, we wanted to allow our customers to take advantage of all the benefits of working in the cloud while adhering to different privacy regulations and concerns,” he said. “What only a few years ago was only theoretically possible is becoming a reality. Our goal is to make this transition as seamless as possible, improving performance and allowing data scientists and developers, without any crypto skills, a frictionless move to analytics over encrypted data.”
Next steps
To accelerate that development, IBM Research has released open source toolkits, while IBM Security launched its first commercial FHE service in December.
“This is aimed at helping our clients start to begin to prototype and experiment with fully homomorphic encryption with two primary goals,” Maass said. “First, getting our clients educated on how to build FHE-enabled applications and then giving them the tools and hosting environments in order to run those types of applications.” Maass said in the near term, IBM envisions FHE being attractive to highly regulated industries, such as financial services and health care.
“They have both the need to unlock the value of that data, but also face extreme pressures to secure and preserve the privacy of the data that they’re computing upon,” he said.
But he expects that over time a wider range of businesses will benefit from FHE. Many sectors want to improve their use of data, which is becoming a competitive differentiator. That includes using FHE to help drive new forms of collaboration and monetization. As this happens, IBM hopes these new security models will drive wider enterprise adoption of hybrid cloud platforms.
The company sees a day, for instance, when due diligence for mergers and acquisitions is done online without violating the privacy of shareholders and when airlines, hotels, and restaurants use FHE to offer packages and promotions without giving their partners access to details of closely held customer datasets.
“FHE will allow us to secure that type of collaboration, extracting the value of the data while still preserving the privacy of it,” Maass concluded.
"
|
15,673 | 2,019 |
"Microsoft combines AI and humans to boost cloud security with Azure Sentinel and Threat Experts | VentureBeat"
|
"https://venturebeat.com/2019/02/28/microsoft-combines-ai-and-humans-to-boost-cloud-security-with-azure-sentinel-and-threat-experts"
|
"Microsoft combines AI and humans to boost cloud security with Azure Sentinel and Threat Experts
With a growing number of high-profile data breaches emerging across all industries, companies are scrambling to shore up their defenses. However, some reports anticipate a cybersecurity workforce shortfall of more than 3 million people by 2021.
Against that backdrop, artificial intelligence (AI) could prove pivotal in helping firms of all sizes protect themselves from outside threats.
Microsoft is today rolling out a couple of new cloud-based cybersecurity tools to help security teams by “reducing the noise” and “time-consuming tasks and complexity” involved in constantly monitoring for cyberattacks, Ann Johnson, Microsoft’s corporate vice president for cybersecurity, wrote in a blog post.
The first of these products is Microsoft Azure Sentinel , which is touted as the first native Security Information and Event Management (SIEM) tool built by a major cloud provider.
Monitoring
For the uninitiated, SIEM gives companies real-time insights into all activities across their internal systems, providing monitoring and alerts for potential threats. But with the growth of cloud computing and the increasing sophistication of cyberattacks, Microsoft argues that traditional SIEM tools are simply not up to the task. With Azure Sentinel, Microsoft wants its customers to know that it has their backs.
“Too many enterprises still rely on traditional Security Information and Event Management tools that are unable to keep pace with the needs of defenders, volume of data, or the agility of adversaries,” Johnson added. “The cloud enables a new class of intelligent security technologies that reduce complexity and integrate with the platforms and productivity tools you depend on.” Azure Sentinel is about offering companies automated protection and reducing “alert fatigue” by cutting down on false alarms. It enables users to connect data from all of their various sources — across devices, servers, applications, and users — and works in any on-premises or cloud environment.
“Because it’s built on Azure, you can take advantage of nearly limitless cloud speed and scale and invest your time in security and not servers,” Johnson continued.
Above: Azure Sentinel
According to Johnson, Microsoft worked closely with a number of its Azure customers to build Sentinel “from the ground up.” At its core, it’s about helping security operations teams focus on more complex security issues, rather than getting bogged down chasing every alert, many of which are false flags generated by legitimate events.
“Early adopters are finding that Azure Sentinel reduces threat-hunting from hours to seconds,” Johnson noted.
The human touch
While Azure Sentinel opens in preview today through the Azure portal, Microsoft is also announcing a second new security offering it calls Threat Experts. For this service, Microsoft is offering its own in-house security experts as part of Windows Defender Advanced Threat Protection (ATP) — its unified enterprise security service for preventative, post-breach, and automated investigations.
In a nutshell, Threat Experts will serve as an extension to companies’ own in-house security personnel, providing additional manpower to “proactively hunt” through security data to identify intrusions and other advanced attacks.
“Our approach to security is not only about applying the cloud and AI to your scale challenges, but also making the security operations experts who defend our cloud available to you,” added Johnson.
As part of this offering, users will see an “Ask a Threat Expert” button that lets security teams submit questions directly through the Windows Defender ATP console. This service is available now as a public preview through the settings in Windows Defender ATP.
At its last earnings, Microsoft reported Azure revenue growth of 76 percent, and some analysts predict that Azure will grow 72 percent in 2019. It’s estimated that this will represent roughly 10 percent of Microsoft’s total business. But as Microsoft goes all-in on the cloud, it is faced with the task of convincing new — and existing — customers to use Azure over competitors such as Amazon’s AWS, which is currently the market leader. Central to that mission is security.
If Microsoft can convince companies that their data is protected, it stands a far greater chance of winning in the long term.
"
|
15,674 | 2,019 |
"Microsoft announces security, identity, management, and compliance updates across Azure and Office | VentureBeat"
|
"https://venturebeat.com/2019/11/04/microsoft-announces-security-identity-management-and-compliance-updates-across-azure-and-office"
|
"Microsoft announces security, identity, management, and compliance updates across Azure and Office
Today marked the start of Microsoft’s Ignite conference in Orlando, and the tech giant wasted no time announcing new security, compliance, governance, management, and identity solutions across its sprawling Azure and Microsoft 365 ecosystems. Without further ado, here’s what you need to know.
Security
Application Guard in Office 365
Starting in limited preview ahead of a rollout in 2020, Application Guard — the security tool built into Microsoft Edge — will be integrated with the ProPlus edition of Office 365. It will enable users to open, print, edit, and save untrusted Word, Excel, and PowerPoint files within a virtualized container protected with “hardware-level security” and to check documents against a cloud-hosted security service (Microsoft’s Defender Advanced Threat Protection) before migrating those files from the container. New containers are created at login, so as to provide a clean start.
Azure Sentinel
Azure Sentinel, Microsoft’s cloud-based security information and event management (SIEM) service, has new built-in hunting queries for Linux and network events. Plus, users can now launch programming notebooks directly from it and tap revamped analytics and investigation tools for insights into suspicious URLs, or leverage new built-in connectors from security partners that collect endpoint, network, and identity data across different sources.
There’s also new Graph Security API integrations that sync alerts from Azure Sentinel, as well as additional third-party ticketing and security management solutions from Zscaler, Barracuda, and Citrix.
Endpoint detection and response for Mac and Safe Documents
Following a limited preview earlier this year, Microsoft Defender’s endpoint detection and response capabilities will be available for Mac users in private preview, starting this December. Microsoft says it’s planning to add support for Linux servers.
In related news, Safe Documents, which scans files for malicious attachments and links, will become generally available in Office 365 mid-December. It engages when users exit Protected View, the feature in Word, Excel, and PowerPoint that opens files in a read-only mode and disables editing functions.
Office 365 Advanced Threat Protection
Tangential to this is Automated Incident Response, which launched in general availability earlier this year. It facilitates the detection and investigation of and response to security alerts, complementing the enhanced compromise protection feature (in public preview) that uses email patterns and other activities to detect suspicious users and alert security teams. Playbooks automatically investigate the alerts, look for possible sources of compromise, assess impact, and recommend remediation actions.
Microsoft Secure Score
Microsoft Secure Score, a self-assessment tool that ingests signals across Office 365 (such as where users are defined and settings are stored) to generate a metric of preparedness for security breaches or hacks, will soon gain a “significantly” updated scoring system intended to make it easier to understand, benchmark, and track progress in improving security postures. Other updates include new planning capabilities that will let users set goals and predict score improvements, as well as a new report type for showing progress and integrations with Microsoft Teams, Microsoft Planner, ServiceNow, and Azure Security Center.
The collaboration capabilities are available now, with other features to roll out by early 2020.
Azure Security Center
Azure Security Center, which provides unified security management and threat protection across hybrid cloud environments, now supports custom security policies in preview and alert exporting to third-party tools (or Azure Data Explorer). On a related note, a Quick Fix feature that automatically fixes misconfigurations on multiple containers is now generally available.
Beginning today, Azure Security Center users can create policies with Azure Logic Apps — Microsoft’s cloud service that helps automate and orchestrate tasks and business processes — that trigger automatically based on specific findings, such as suggestions or alerts. The playbooks can be configured to perform virtually any custom action, or simply the actions defined in templates provided by Security Center.
Security improvements for SQL databases running on virtual machines are coming down the pipeline, Microsoft says, starting with vulnerability assessment. Much like Advanced Threat Protection , which detects anomalous activities indicating potentially harmful attempts to access an SQL server, the assessment discovers, tracks, and helps remediate database vulnerabilities.
On the subject of virtual machines, the standard tier of Security Center now includes a built-in vulnerability assessment for virtual machines powered by Qualys for no additional fee. It continuously scans installed apps to uncover potential exploits and flaws, which it spotlights in the Security Center portal.
With respect to the Azure Kubernetes Service , Microsoft’s managed Kubernetes offering, three new components are available in preview starting today: continuous discovery of managed AKS instances, security recommendations, and host and cluster-based threat detection. Separately, it was revealed that support for vulnerability assessment is expanding to Azure Container Registry, the product that allows users to build, store, and manage images for container deployments.
Azure Firewall
Azure Firewall, Microsoft’s firewall-as-a-service offering that enables customers to govern and log traffic flows, has a new capability in Azure Firewall Manager. It’s a unified dashboard from which managers can configure multiple Azure Firewall instances and automate deployment or enforce policies.
Identity
Azure Active Directory
Azure Active Directory, the enterprise identity service that provides single sign-on and multi-factor authentication, today received a fresh coat of paint. Specifically, the MyApps portal now offers a “mobile-first” launching experience (in preview) for enterprise apps and a unified app experience across the Office.com portal, Office 365 search, and Office navigation, plus workspaces for administrator-curated apps.
As of today, more customers with any Azure Active Directory plan can use the Microsoft Authenticator app to securely access their apps without a password. (Previously, only customers with a paid plan could use the app for passwordless authentication.) It’s not yet generally available — though anticipated to be in 2020 — but it has expanded from the public preview that kicked off several months ago.
New identity features in Microsoft 365 are on the way and already in private preview for some customers. One of those is SMS sign-in, which allows users to sign in with their phone number and an SMS code for authentication. On the other hand, global sign-out — which rolls out later this year for Android devices — will enable workers to sign out of all apps with a single click. Delegated user management will let admins manage users and credentials, and a new off-shift access feature in Teams will enable companies to grant app access to workers while complying with designated work hours.
In other news, Azure Active Directory Connect cloud provisioning launched in preview this morning. It's intended to help customers consolidate on-premises Active Directory forests and multiple deployments, with a lightweight agent that moves sync and data transformation logic to the Azure cloud.
Compliance

A new Microsoft service offering dubbed Insider Risk Management targets employees who violate company policies around intellectual property or breach confidentiality. Machine learning algorithms take into account variables like file activity, communications sentiment, and abnormal user behaviors to identify patterns and risks in a privacy-preserving fashion (names are anonymized). The algorithms also launch playbooks and workflows for scenarios like digital IP theft, confidentiality breaches, and potential security violations that rope in the appropriate security, HR, legal, and compliance teams to investigate and take action.
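Microsoft hasn't published the mechanics, but the core idea the article describes (pseudonymized identities plus a score computed over behavioral signals, with a threshold that triggers a workflow) can be sketched in a few lines. Every signal name, weight, and threshold below is invented for illustration and is not Microsoft's actual model:

```python
import hashlib

# Hypothetical signal weights -- purely illustrative, not Microsoft's model.
WEIGHTS = {"file_exfil_events": 0.5, "negative_sentiment": 0.2, "anomalous_logins": 0.3}

def pseudonymize(user_id: str) -> str:
    """Replace a real identity with a stable pseudonym, mirroring the
    anonymized names the article mentions."""
    return hashlib.sha256(user_id.encode()).hexdigest()[:12]

def risk_score(signals: dict) -> float:
    """Weighted sum of normalized (0-1) behavioral signals."""
    return sum(WEIGHTS[k] * v for k, v in signals.items() if k in WEIGHTS)

# Sample observations for two made-up users.
observations = {
    "alice@contoso.com": {"file_exfil_events": 0.9, "negative_sentiment": 0.4, "anomalous_logins": 0.7},
    "bob@contoso.com": {"file_exfil_events": 0.0, "negative_sentiment": 0.1, "anomalous_logins": 0.0},
}

alerts = []
for user, signals in observations.items():
    score = risk_score(signals)
    if score > 0.5:  # invented threshold that would kick off a playbook
        alerts.append((pseudonymize(user), round(score, 2)))

print(alerts)  # one flagged pseudonym with score 0.74
```

The point of the pseudonymization step is that reviewers triage a case on behavior alone; only when a case escalates would the appropriate teams unmask the identity.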
A new metric for compliance — Compliance Score — is now available in public preview for Office 365 customers. It provides a measure meant to communicate an organization’s overall compliance or noncompliance with applicable rules, laws, and regulations. Included among assessments is one for the California Consumer Privacy Act.
Also launching today is Communication Compliance, a solution that helps organizations address code-of-conduct policy violations in communications and assists those companies in meeting supervisory requirements in regulated industries. It leverages machine learning to intelligently detect violations across different communication channels, such as Microsoft Teams, Exchange Online, or Bloomberg instant messages, and it offers features like historical user context on past violations, conversation threading, and keyword highlighting that allows investigators to triage violations and take appropriate remediation actions.
Where the Regulatory Compliance dashboard is concerned — that is, the cloud-hosted dashboard that provides insights into compliance based on Security Center assessments — Microsoft says it has added additional standards, including NIST SP 800-53 R4, SWIFT CSP CSCF v2020, Canada Federal PBMM, and UK Official, together with UK NHS. Additionally, users can now select which standards to onboard and track through Azure Policy.
Management

Azure Monitor

Azure Monitor, the Azure service that collates virtual network alerts, metrics, logs, and more in a single view, is gaining two enhancements aimed at providing greater visibility. Network Insights delivers health information and other data across cloud resources in a single console view, while Traffic Analytics, an existing solution that now processes data faster than before (at 10-minute intervals), delivers auditing support for network activity.
Azure Monitor for containers, which tracks the performance of container workloads deployed to either Azure Container instances or managed Kubernetes clusters hosted on Azure Kubernetes Service, now offers monitoring for customers who run a hybrid Kubernetes deployment with on-premises and Azure infrastructure (in preview) and metric- and log-scraping for the event-monitoring and alerting tool Prometheus (in general availability). And new no-code capabilities have made their way into Azure Monitor, including one that supports monitoring of .NET apps running on virtual machines and Application Insights, an agent that monitors IIS and .NET processes and collects telemetry for debugging.
VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact.
Discover our Briefings.
The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat.
All rights reserved.
"
|
15,675 | 2,020 |
"Microsoft 365 bundles Office 365 with AI and cloud-powered features | VentureBeat"
|
"https://venturebeat.com/2020/03/30/microsoft-365-bundles-office-365-with-ai-and-cloud-powered-features"
|
"Microsoft 365 bundles Office 365 with AI and cloud-powered features
Microsoft today announced that on April 21, Office 365 will become Microsoft 365, or “the subscription for your life.” The new suite builds on Office 365 with “new AI, rich content and templates, and cloud-powered experiences.” On April 21, Microsoft 365 Personal and Microsoft 365 Family subscriptions (up to six people) will replace Office 365 Personal and Office 365 Home. The pricing will remain the same: $7 per month and $10 per month, respectively. The Microsoft 365 plans include everything already in Office 365, including desktop Office apps, 1TB of OneDrive cloud storage per person, 60 Skype minutes, security features, technical support, plus the new Office features also announced today. Those features are spread across Word, Excel, PowerPoint, Outlook, and Microsoft Teams.
(Even Edge is getting new consumer features, though the browser doesn't require a Microsoft 365 subscription, of course.) If Microsoft 365 sounds familiar, that's because Microsoft 365 debuted for businesses back in July 2017.
The more expensive Microsoft 365 Enterprise and Microsoft 365 Business subscriptions include business-specific security and management functionality. Half a billion people use the free Office applications and services (Word, Excel, PowerPoint, Skype, Outlook, OneNote, and OneDrive) for Windows, macOS, iOS, Android, and the web. Microsoft shared today that more than 38 million consumers subscribe to Office 365 (up from 37.2 million in January ). By comparison, there are over 200 million monthly active Office 365 business users.
“365” branding nonsense aside, Microsoft started rolling out new Office features today (reaching its 38 million subscribers “over the next few months”). They are supposed to help you “become a better writer, presenter, designer, manager of your finances, and deepen your connection to the people in your life.” Microsoft also unveiled a Microsoft Family Safety app and Microsoft Teams for consumers arriving “in the coming months.” But first, let’s talk about the Office features.
Word and Excel

Microsoft 365 subscribers are getting over 8,000 images and 175 looping videos from Getty Images, plus 300 new fonts and 2,800 new icons to use in Word and Excel. They will also get over 200 new premium templates for Word, Excel, and PowerPoint. The templates include resumes, wedding invitations, newsletters, and birth announcements, as well as coloring books and reward charts for kids.
While Word itself isn't getting new features, the AI-powered Microsoft Editor is now accessible across Word and Outlook.com, plus as a Google Chrome and Microsoft Edge extension. Anyone can use Editor's basic capabilities, but Microsoft 365 Personal and Family subscribers will have access to advanced grammar and style refinements. Read Khari Johnson's take on the Microsoft Editor news here.
Excel is getting a bunch of new features available to Office Insiders this spring. Microsoft 365 subscribers will get them “in the coming months,” starting in the U.S.
Money in Excel will help you manage, track, and analyze your spending. The feature connects to your bank and credit card accounts, imports transactions and account balances, and provides personalized insights. For example, Money in Excel can tell you how much you’re spending on categories like groceries each month. It can also alert you about price changes for recurring payments, bank fees, overdraft limits, and so on.
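As a rough illustration of the kind of insight described, the sketch below aggregates imported transactions by category and flags a recurring payment whose price changed. The transaction data and category labels are made up; Money in Excel's actual import and categorization pipeline is not public:

```python
from collections import defaultdict

# Imported bank transactions: (description, category, amount) -- sample data.
transactions = [
    ("WHOLE FOODS #112", "Groceries", 54.20),
    ("TRADER JOES", "Groceries", 31.75),
    ("NETFLIX.COM", "Subscriptions", 15.49),
    ("SHELL OIL", "Transport", 40.00),
    ("NETFLIX.COM", "Subscriptions", 17.99),  # recurring charge, price went up
]

# Monthly spend per category, like the personalized insights described above.
by_category = defaultdict(float)
for _desc, category, amount in transactions:
    by_category[category] += amount

# Flag recurring payments whose price changed -- one of the alerts described.
recurring = defaultdict(list)
for desc, _cat, amount in transactions:
    recurring[desc].append(amount)
price_changes = {d: a for d, a in recurring.items() if len(set(a)) > 1}

print(dict(by_category))
print(price_changes)
```

Running it shows roughly $85.95 spent on groceries and surfaces NETFLIX.COM as a recurring payment whose amount changed between billing cycles.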
Excel is also getting new data types for over 100 topics powered by Wolfram Alpha. These include food, movies, places, chemistry, and even Pokémon.
If you convert plain text and numbers into a data type, Excel will surface visual and interactive data cards. For example, if you convert “avocado” to a Food data type, Excel will show its nutritional information. Or, if you’re considering adopting a dog, you can compare different breeds using the Animal data type, which provides images, facts, and their temperaments. Other examples Microsoft gave include moving to a new city or helping your kids learn chemistry.
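The idea of converting a plain cell value into a linked data card can be mimicked with a lookup table. The miniature below is purely illustrative: the nutrition numbers are approximate sample values, not Wolfram Alpha's data, and the function name is invented:

```python
# Hypothetical miniature of Excel's "convert text to a data type" feature.
# Sample nutrition values are approximate and for illustration only.
FOOD_DATA = {
    "avocado": {"calories_per_100g": 160, "fat_g": 14.7, "protein_g": 2.0},
}

def to_data_type(cell_text, table):
    """Return a structured 'data card' for recognized text, else the raw text."""
    return table.get(cell_text.strip().lower(), cell_text)

card = to_data_type("Avocado", FOOD_DATA)
print(card["calories_per_100g"])  # the card exposes fields a plain cell cannot
```

Unrecognized text simply stays a string, which mirrors how Excel leaves unconvertible cells alone.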
PowerPoint, Outlook, and Microsoft Family Safety

PowerPoint is getting two new AI-powered features. Presenter Coach will be able to detect whether you have a monotone pitch and to offer speech refinement suggestions. For the former, Presenter Coach will listen to your tone of voice and give suggestions in real time on where to add some variation. For the latter, it will give grammar suggestions, including how to better phrase your speech. These will be available in preview for free at first, but "eventually" they will only be for Microsoft 365 subscribers.
Outlook is getting new functionality to manage your work and life commitments. Outlook on the web will let you link your personal calendar to your work calendar. That way you can share your real availability with work colleagues while maintaining privacy around the details of personal appointments and business meetings. Next up, Play My Emails, where Cortana provides an intelligent read-out of your emails on iOS, will soon use Microsoft Search.
You'll thus be able to use natural language to speak or type. The new search functionality and Play My Email on Android will start rolling out "in the coming months."

Lastly, Microsoft unveiled Microsoft Family Safety, a new Android and iOS app for Microsoft 365 subscribers. The app manages screen time across Windows PCs, Android, and Xbox. It also offers location sharing and notifications when a family member arrives at or departs from a location like home, school, or work. It even offers driving reports to help build better habits behind the wheel. The main purpose, however, is to protect your kids as they explore and play games. It shows you how they are spending their time, lets you set limits, and steers them away from content that you feel is not age-appropriate. Microsoft said kids will be able to opt out of the app and dispute their parents' tracking of their movements. A limited preview of the app will be available "in the coming months."
"
|
15,676 | 2,021 |
"How AI is helping Nvidia improve U.S. Postal Service delivery | VentureBeat"
|
"https://venturebeat.com/2021/05/06/how-ai-is-helping-nvidia-improve-u-s-postal-service-delivery"
|
"How AI is helping Nvidia improve U.S. Postal Service delivery
Postal Service employees perform spot checks to ensure packages are properly handled and sorted.
Nvidia this week detailed a partnership with the U.S. Postal Service to transform the latter’s mail operations with AI. According to Nvidia, its machine learning resources and tools enabled the Postal Service to process over 20 terabytes of images a day from more than 1,000 mail processing machines using 195 edge servers.
In 2019, the Postal Service had a requirement to identify and track items in its over 100 million pieces of daily mail. Data scientists at the organization thought they could expand an image analysis system developed internally into something broader, with edge AI servers strategically located at the Post Office’s processing centers. The hope was that the system would enable the Postal Service to analyze billions of images of mail and share the insights quickly over the network.
Recruiting half a dozen architects at Nvidia and other companies, the Postal Service arrived at the deep learning models it needed after a three-week sprint. The work was the genesis of the Edge Compute Infrastructure Program (ECIP), a distributed edge AI system that's running on the NVIDIA EGX platform at the Postal Service today.
Open source software from Nvidia, the Triton Inference Server, acts as a sort of digital mailperson between the edge servers, delivering the necessary AI models on demand. According to the Postal Service's analysis, a computer vision task that would have required two weeks on a network of servers with 800 processors can now be accomplished in 20 minutes on the four NVIDIA V100 Tensor Core GPUs in one of the edge servers, an HPE Apollo 6500.
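The figures quoted in this article are easy to sanity-check with back-of-envelope arithmetic (decimal units assumed for the terabyte-to-gigabyte conversion):

```python
# 20 terabytes of images per day spread across 195 edge servers:
tb_per_day = 20
servers = 195
gb_per_server = tb_per_day * 1000 / servers  # decimal TB -> GB
print(f"{gb_per_server:.1f} GB/day per edge server")  # ~102.6

# Two weeks on an 800-processor cluster vs. 20 minutes on one 4-GPU server:
old_minutes = 14 * 24 * 60  # two weeks of wall-clock time
new_minutes = 20
speedup = old_minutes / new_minutes
print(f"{speedup:.0f}x wall-clock speedup")  # 1008x
```

So each edge server handles on the order of 100 GB of imagery per day, and the quoted two-weeks-to-20-minutes claim works out to roughly a thousandfold wall-clock speedup.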
Model serving

Triton automates the delivery of AI models to different Postal Service systems that may have unique configurations of GPUs and CPUs supporting deep learning frameworks. An app that checks for mail items alone requires coordinating the work of more than half a dozen deep learning models, each checking for specific features.
Above: Cameras mounted on the sorting machines capture addresses, barcodes, and other data, such as hazardous materials symbols.
Departments across the Postal Service, from enterprise analytics to finance and marketing, have spawned ideas for as many as 30 apps for ECIP, Nvidia says. One would determine if a package carries the right postage for its size, weight, and destination. Another would decipher a barcode, even in the presence of damage.
The plan is to get several new AI-powered apps up and running this year. Nvidia and the Postal Service say the barcode model could be on ECIP as soon as this summer.
Next-gen OCR

Seeking further improvements to its mail processing pipeline, the Postal Service put out a request for what could be the next app for ECIP: one that uses optical character recognition (OCR). In the past, the agency would have bought expensive new hardware and software or used a public cloud service, which takes a lot of bandwidth and has significant costs. This time, leaning on Nvidia's expertise, the agency deployed an AI-based OCR system in a container on ECIP, managed by Kubernetes and served by Triton.
In the early weeks of the pandemic, operators rolled out containers to get the first systems running as others were being delivered, updating them as the full network was installed. Nvidia was awarded the contract in September 2019, started deploying systems last February, and finished most of the hardware by August.
Above: AI algorithms were developed on NVIDIA DGX servers at a U.S. Postal Service Engineering facility.
The new solutions could help the Postal Service improve delivery standards, which have fallen over the past year. In mid-December, during the last holiday season, the agency delivered as little as 62% of first-class mail on time — the lowest level in years. The rate rebounded to 84% by the week of March 6 but remained below the agency’s target of about 96%.
The Postal Service has blamed the pandemic and record peak periods for much of the poor service performance.
"The models we have deployed so far help manage the mail and the Postal Service — it helps us maintain our mission," Todd Schimmel — the manager who oversees Postal Service systems, including ECIP — said in a press release. "It used to take eight or 10 people several days to track down items, now it takes one or two people a couple of hours. This has a benefit for us and our customers, letting us know where a specific parcel is at — it's not a silver bullet, but it will fill a gap and boost our performance … We're at the very beginning of our journey with edge AI."
"
|
15,677 | 2,021 |
"Nvidia, NERSC claim Perlmutter is world's fastest AI supercomputer | VentureBeat"
|
"https://venturebeat.com/2021/05/27/nvidia-nersc-claim-perlmutter-is-worlds-fastest-ai-supercomputer"
|
"Nvidia, NERSC claim Perlmutter is world's fastest AI supercomputer
Perlmutter, the largest NVIDIA A100-powered system in the world.
Nvidia and the National Energy Research Scientific Computing Center (NERSC) on Thursday flipped the “on” switch for Perlmutter, billed as the world’s fastest supercomputer for AI workloads.
Named for astrophysicist Saul Perlmutter, the new supercomputer boasts 6,144 NVIDIA A100 Tensor Core GPUs and will be tasked with stitching together the largest ever 3D map of the visible universe, among other projects.
Perlmutter is “the fastest system on the planet” at processing workloads with the 16-bit and 32-bit mixed-precision math used in artificial intelligence (AI) applications, said Nvidia global HPC/AI product marketing lead Dion Harris during a press briefing earlier this week. Later this year, a second phase will add even more AI supercomputing power to Perlmutter, which is housed at NERSC at the Lawrence Berkeley National Laboratory.
“In one project, the supercomputer will help assemble the largest 3D map of the visible universe to date. It will process data from the Dark Energy Spectroscopic Instrument (DESI), a kind of cosmic camera that can capture as many as 5,000 galaxies in a single exposure,” Harris wrote in a blog post announcing the news.
“Researchers need the speed of Perlmutter’s GPUs to capture dozens of exposures from one night to know where to point DESI the next night. Preparing a year’s worth of the data for publication would take weeks or months on prior systems, but Perlmutter should help them accomplish the task in as little as a few days,” he wrote.
Supercharging HPC with AI and machine learning

Firing up an AI-optimized supercomputer "represents a very real milestone," said Wahid Bhimji, acting lead for NERSC's data and analytics services group.
“AI for science is a growth area at the U.S. Department of Energy, where proof of concepts are moving into production use cases in areas like particle physics, materials science, and bioenergy,” he said.
"People are exploring larger and larger neural-network models and there's a demand for access to more powerful resources, so Perlmutter with its A100 GPUs, all-flash file system, and streaming data capabilities is well timed to meet this need for AI," Bhimji added.
Perlmutter will give NERSC’s approximately 7,000 supported researchers access to four exaflops of mixed-precision computing performance for AI-assisted scientific projects.
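Dividing the headline figure across the GPU count gives a sense of scale. This is simple arithmetic on the numbers quoted above, not an official per-GPU specification:

```python
# Back-of-envelope check: how much of the four mixed-precision exaflops
# does each of the 6,144 A100 GPUs contribute on average?
total_flops = 4e18  # 4 exaflops, mixed precision
gpus = 6144
per_gpu_tflops = total_flops / gpus / 1e12
print(f"~{per_gpu_tflops:.0f} TFLOPS per GPU")  # ~651
```

That average of roughly 650 mixed-precision TFLOPS per GPU is in the same ballpark as the A100's published peak tensor-core throughput, which is what you would expect for a headline figure derived from peak rates.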
In addition to the DESI mapping project, researchers are teeing up time with the supercomputer for work in fields like climate science, where Perlmutter will assist in probing subatomic interactions to discover green energy sources.
That project, which will generate simulations of atoms interacting, requires the special blend of AI and high-performance computing (HPC) that Perlmutter delivers, Harris said.
“Traditional supercomputers can barely handle the math required to generate simulations of a few atoms over a few nanoseconds with programs such as Quantum Espresso. But by combining their highly accurate simulations with machine learning, scientists can study more atoms over longer stretches of time,” he said.
The ability to leverage AI in supercomputing also has researchers optimistic about the DESI project. In addition to mapping the known universe, the project “aims to shed light on dark energy, the mysterious physics behind the accelerating expansion of the universe,” NERSC data architect Rollin Thomas said. System namesake Saul Perlmutter, who remains a working astrophysicist at Berkeley Lab, was awarded the 2011 Nobel Prize for Physics for his contributions to the discovery of dark energy.
“To me, Saul is an example of what people can do with the right combination of insatiable curiosity and a commitment to optimism,” Thomas said.
He added that in preparatory work with researchers to get code ready for Perlmutter supercomputer workloads, NERSC was already seeing 20x faster GPU processing performance than in previously available systems.
"
|
15,678 | 2,017 |
"Nvidia uses AI to create 3D graphics better than human artists can | VentureBeat"
|
"https://venturebeat.com/2017/07/31/nvidia-uses-ai-to-create-3d-graphics-better-than-human-artists-can"
|
"Nvidia uses AI to create 3D graphics better than human artists can
Nvidia Zcam demo at Siggraph.
Nvidia spans both gaming graphics and artificial intelligence, and it is showing that with its announcements this week at the Siggraph computer graphics event in Los Angeles.
Those announcements range from providing external graphics processing for content creators to testing AI robotics technology inside a virtual environment known as the Holodeck, named after the virtual reality simulator in the Star Trek series. In fact, Nvidia’s researchers have created a way for AI to create realistic human facial animations in a fraction of the time it takes human artists to do the same thing.
"We are bringing artificial intelligence to computer graphics," said Greg Estes, vice president of developer marketing at Nvidia, in an interview with GamesBeat. "It's bringing things full circle. If you look at our history in graphics, we took that into high-performance computing and took that into a dominant position in deep learning and AI. Now we are closing that loop and bringing AI into graphics." "Our strategy is to lead with research and break new ground," he said. "Then we take that lead in research and take it into software development kits for developers."

Above: Nvidia's Optix 5.0 can "de-noise" images by removing graininess.
Nvidia has 10 research papers this year at the Siggraph event, Estes said. And some of that will be relevant to Nvidia's developers, which number about 550,000 now. About half of those developers are in games, while the rest are in high-performance computing, robotics, and AI. Among the announcements, one is particularly cool. Estes said that Nvidia will show off its Isaac robots in a new environment. These robots, which are being used to vet AI algorithms, will be brought inside the virtual environment that Nvidia calls Project Holodeck.
Project Holodeck is a virtual space for collaboration, where full simulations of things like cars and robots are possible. By putting the Isaac robots inside that world, they can learn how to behave, without causing havoc in the real world.
Above: The Project Holodeck demo

"A robot will be able to learn things in VR," Estes said. "We can train it in a simulated environment." Nvidia is providing external Titan X or Quadro graphics cards through an external graphics processing unit (eGPU) chassis. That will boost workflows for people who use their laptop computers for video editing, interactive rendering, VR content creation, AI development, and more, Estes said.
To ensure professionals can enjoy great performance with applications such as Autodesk Maya and Adobe Premiere Pro, Nvidia is releasing a new performance driver for Titan X hardware to make it faster. The Quadro eGPU solutions will be available in September through partners such as Bizon, Sonnet, and One Stop Systems/Magma.
Nvidia also said it was launching its Optix 5.0 SDK on the Nvidia DGX AI workstation. That will give designers, artists, and other content-creation professionals the rendering capability of 150 standard central processing unit (CPU) servers.
The tech could be used by millions of people, Estes said. And that kind of system would cost $75,000 over three years, compared to $4 million for a CPU-based system, the company said.
OptiX 5.0’s new ray tracing capabilities will speed up the process required to visualize designs or characters, thereby increasing a creative professional’s ability to interact with their content. It features new AI “de-noising” capability to accelerate the removal of graininess from images, and brings GPU-accelerated motion blur for realistic animation effects. It will be available for free in November.
By running Nvidia Optix 5.0 on a DGX Station, content creators can significantly accelerate training, inference and rendering (meaning both AI and graphics tasks).
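OptiX 5.0's de-noiser is AI-based, but the effect it automates, suppressing grainy outlier pixels in a partially converged render, can be illustrated with the simplest classical stand-in: a 3x3 box filter. This is a deliberately naive analogue for intuition only, not how OptiX works:

```python
def box_filter(img):
    """Average each pixel with its 3x3 neighborhood (edges clamped)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = sum(vals) / 9
    return out

# A flat mid-gray image with one bright "firefly" speckle, the kind of
# graininess Monte Carlo renders exhibit before they converge:
noisy = [[0.5] * 5 for _ in range(5)]
noisy[2][2] = 1.0
smoothed = box_filter(noisy)
print(round(smoothed[2][2], 3))  # 0.556 -- the speckle is pulled back toward 0.5
```

Where the box filter blurs everything indiscriminately, the appeal of a learned de-noiser is that it can remove the graininess while preserving edges and detail.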
“AI is transforming industries everywhere,” said Steve May, vice president and chief technology officer of Pixar, in a statement. “We’re excited to see how Nvidia’s new AI technologies will improve the filmmaking process.” On the research side, Nvidia is showing how it can animate realistic human faces and simulate how light interacts with surfaces. It will tap AI technology to improve the realism of the facial animations. Right now, it takes human artists hundreds of hours to create digital faces that more closely match the faces of human actors.
Nvidia Research partnered with Remedy Entertainment, maker of games such as Quantum Break, Max Payne and Alan Wake, to help game makers produce more realistic faces with less effort and at lower cost.
Above: Nvidia is using AI to create human facial animations.
The parties combined Remedy’s animation data and Nvidia’s deep learning technology to train a neural network to produce facial animations directly from actor videos. The research was done by Samuli Laine, Tero Karras, Timo Aila, and Jaakko Lehtinen. Nvidia’s solution requires only five minutes of training data to generate all the facial animation needed for an entire game from a simple video stream.
Antti Herva, lead character technical artist at Remedy, said that over time, the new methods will let the studio build larger, richer game worlds with more characters than are now possible.
Already, the studio is creating high-quality facial animation in much less time than in the past.
“Based on the Nvidia research work we’ve seen in AI-driven facial animation, we’re convinced AI will revolutionize content creation,” said Herva, in a statement. “Complex facial animation for digital doubles like that in Quantum Break can take several man-years to create. After working with Nvidia to build video- and audio-driven deep neural networks for facial animation, we can reduce that time by 80 percent in large scale projects and free our artists to focus on other tasks.”
In another research project, Nvidia trained a system to generate realistic facial animation using only audio. With this tool, game studios will be able to add more supporting game characters, create live animated avatars, and more easily produce games in multiple languages.
Above: AI can smooth out the “jaggies,” or rough edges in 3D graphics.
AI also holds promise for rendering 3D graphics, the process that turns digital worlds into the life-like images you see on the screen. Film makers and designers use a technique called “ray tracing” to simulate light reflecting from surfaces in the virtual scene. Nvidia is using AI to improve both ray tracing and rasterization, a less costly rendering technique used in computer games.
In a related project, Nvidia researchers used AI to tackle a problem in computer game rendering known as anti-aliasing. Like de-noising, anti-aliasing removes artifacts from partially computed images; here the artifacts are the stair-stepped edges known as “jaggies.” Nvidia researchers Marco Salvi and Anjul Patney trained a neural network to recognize jaggy artifacts and replace those pixels with smooth anti-aliased pixels. The AI-based solution produces images that are sharper (less blurry) than existing algorithms.
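As a rough illustration of the framing (not Nvidia's actual network), anti-aliasing can be posed as supervised image-to-image regression: pair "jaggy" hard-edge signals with smooth targets and fit a tiny filter by gradient descent. The signals, filter size, and learning rate below are all made up for the sketch.

```python
import random

# Toy sketch: learn a 3-tap filter that maps a hard step edge ("jaggies")
# to a soft anti-aliased ramp, via least-squares gradient descent.

def make_pair(n=16, edge=8):
    jaggy = [0.0 if i < edge else 1.0 for i in range(n)]                  # hard edge
    smooth = [min(1.0, max(0.0, (i - edge + 1) / 2)) for i in range(n)]   # soft ramp target
    return jaggy, smooth

def apply_filter(w, x):
    # 3-tap convolution with edge clamping
    n = len(x)
    return [sum(w[k] * x[min(n - 1, max(0, i + k - 1))] for k in range(3)) for i in range(n)]

def train(steps=2000, lr=0.05):
    w = [random.uniform(-0.1, 0.1) for _ in range(3)]
    jaggy, smooth = make_pair()
    for _ in range(steps):
        pred = apply_filter(w, jaggy)
        n = len(jaggy)
        for k in range(3):
            # gradient of mean squared error with respect to tap k
            grad = sum(2 * (pred[i] - smooth[i]) * jaggy[min(n - 1, max(0, i + k - 1))]
                       for i in range(n)) / n
            w[k] -= lr * grad
    return w

random.seed(0)
w = train()
jaggy, smooth = make_pair()
out = apply_filter(w, jaggy)
err_before = sum((a - b) ** 2 for a, b in zip(jaggy, smooth))
err_after = sum((a - b) ** 2 for a, b in zip(out, smooth))
print(err_before, err_after)  # the learned filter brings the edge much closer to the target
```

The real system operates on 2-D images with a deep network, but the training signal is the same shape: artifact-laden input, clean target.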
Nvidia is also developing more efficient methods to trace virtual light rays. Computers sample the paths of many light rays to generate a photorealistic image. The problem is that not all of those light paths contribute to the final image.
Researchers Ken Daum and Alex Keller trained a neural network to guide the choice of light paths. They accomplished this by connecting the math of tracing light rays to the AI concept of reinforcement learning.
Their solution taught the neural network to distinguish the paths most likely to connect lights with virtual cameras, from the paths that don’t contribute to the image.
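A minimal way to see the reinforcement-learning connection (a toy sketch, not the researchers' method) is to treat each candidate ray direction as a bandit arm: directions that actually connect to a light earn reward, and sampling shifts toward them. The direction count, reward function, and epsilon-greedy schedule below are invented for illustration.

```python
import random

# Toy sketch: value-based guidance of light-path choices. Direction 2 is
# (hypothetically) the only one that reaches the light; the agent learns
# to prefer it while still exploring occasionally.

random.seed(42)

N_DIRECTIONS = 4
LIGHT_DIRECTION = 2
values = [0.0] * N_DIRECTIONS   # estimated usefulness of each direction
EPSILON = 0.1                   # small exploration rate

def reward(direction):
    # 1 if the traced ray reaches the light, 0 if the path contributes nothing
    return 1.0 if direction == LIGHT_DIRECTION else 0.0

for step in range(500):
    if random.random() < EPSILON:
        d = random.randrange(N_DIRECTIONS)                       # explore
    else:
        d = max(range(N_DIRECTIONS), key=lambda i: values[i])    # exploit
    values[d] += 0.1 * (reward(d) - values[d])                   # incremental update

best = max(range(N_DIRECTIONS), key=lambda i: values[i])
print(best, [round(v, 2) for v in values])
```

After training, the estimated value of the light-connecting direction dominates, so most samples go where they contribute to the image.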
Above: Nvidia uses AI to figure out light sources in 3D graphics.
Lastly, Nvidia said it is taking immersive VR to more people by releasing the VRWorks 360 Video SDK to enable production houses to livestream high-quality, 360-degree, stereo video to their audiences.
Normally, it takes a lot of computation time to stitch together images for 360-degree videos. By doing live 360-degree stereo stitching, Nvidia is making life a lot easier for the live-production and live-event industries, said Zvi Greenstein, vice president at Nvidia.
The VRWorks SDK enables production studios, camera makers and app developers to integrate 360 degree, stereo stitching SDK into their existing workflow for live and post production. The Z Cam V1 Pro (made by VR camera firm Z Cam) is the first professional 360 degree VR camera that will fully integrate the VRWorks SDK.
“We have clients across a wide range of industries, from travel through sports, who want high quality, 360 degree video,” said Chris Grainger, CEO of Grainger VR, in a statement. “This allows filmmakers to push the boundaries of live storytelling.” GamesBeat's creed when covering the game industry is "where passion meets business." What does this mean? We want to tell you how the news matters to you -- not just as a decision-maker at a game studio, but also as a fan of games. Whether you read our articles, listen to our podcasts, or watch our videos, GamesBeat will help you learn about the industry and enjoy engaging with it.
Discover our Briefings.
The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! Games Beat Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat.
All rights reserved.
"
|
15,679 | 2,021 |
"New deep learning model brings image segmentation to edge devices | VentureBeat"
|
"https://venturebeat.com/2021/05/14/new-deep-learning-model-brings-image-segmentation-to-edge-devices"
|
"New deep learning model brings image segmentation to edge devices
A new neural network architecture designed by artificial intelligence researchers at DarwinAI and the University of Waterloo will make it possible to perform image segmentation on computing devices with low-power and -compute capacity.
Segmentation is the process of determining the boundaries and areas of objects in images. We humans perform segmentation without conscious effort, but it remains a key challenge for machine learning systems. It is vital to the functionality of mobile robots, self-driving cars, and other artificial intelligence systems that must interact and navigate the real world.
Until recently, segmentation required large, compute-intensive neural networks. This made it difficult to run these deep learning models without a connection to cloud servers.
In their latest work, the scientists at DarwinAI and the University of Waterloo have managed to create a neural network that provides near-optimal segmentation and is small enough to fit on resource-constrained devices. Called AttendSeg, the neural network is detailed in a paper that has been accepted at this year’s Conference on Computer Vision and Pattern Recognition (CVPR).
Object classification, detection, and segmentation
One of the key reasons for the growing interest in machine learning systems is the problems they can solve in computer vision.
Some of the most common applications of machine learning in computer vision include image classification, object detection, and segmentation.
Image classification determines whether a certain type of object is present in an image or not. Object detection takes image classification one step further and provides the bounding box where detected objects are located.
Segmentation comes in two flavors: semantic segmentation and instance segmentation. Semantic segmentation specifies the object class of each pixel in an input image. Instance segmentation separates individual instances of each type of object. For practical purposes, the output of segmentation networks is usually presented by coloring pixels. Segmentation is by far the most complicated type of classification task.
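The distinction can be made concrete with a toy label map (the tiny "image" and class ids below are invented for illustration): semantic segmentation assigns every pixel a class, while instance segmentation additionally tells the two objects of the same class apart.

```python
# Toy 4x6 label maps. Class labels: 0 = background, 1 = cat.
# In the semantic map the two cats share label 1; in the instance
# map they get distinct ids (1 and 2).

semantic = [
    [0, 1, 1, 0, 1, 1],
    [0, 1, 1, 0, 1, 1],
    [0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0],
]

instance = [
    [0, 1, 1, 0, 2, 2],
    [0, 1, 1, 0, 2, 2],
    [0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0],
]

cat_pixels = sum(row.count(1) for row in semantic)          # per-pixel class labels
instances = {v for row in instance for v in row if v != 0}  # distinct object ids
print(cat_pixels, instances)   # 8 cat pixels, instance ids {1, 2}
```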
Above: Image classification vs. object detection vs. semantic segmentation (credit: codebasics ).
The complexity of convolutional neural networks (CNN), the deep learning architecture commonly used in computer vision tasks, is usually measured in the number of parameters they have. The more parameters a neural network has the larger memory and computational power it will require.
RefineNet, a popular semantic segmentation neural network, contains more than 85 million parameters. At 4 bytes per parameter, it means that an application using RefineNet requires at least 340 megabytes of memory just to run the neural network. And given that the performance of neural networks is largely dependent on hardware that can perform fast matrix multiplications, it means that the model must be loaded on the graphics card or some other parallel computing unit, where memory is more scarce than the computer’s RAM.
Machine learning for edge devices Due to their hardware requirements, most applications of image segmentation need an internet connection to send images to a cloud server that can run large deep learning models. The cloud connection can pose additional limits to where image segmentation can be used. For instance, if a drone or robot will be operating in environments where there’s no internet connection, then performing image segmentation will become a challenging task. In other domains, AI agents will be working in sensitive environments and sending images to the cloud will be subject to privacy and security constraints. The lag caused by the roundtrip to the cloud can be prohibitive in applications that require real-time response from the machine learning models. And it is worth noting that network hardware itself consumes a lot of power, and sending a constant stream of images to the cloud can be taxing for battery-powered devices.
For all these reasons (and a few more), edge AI and tiny machine learning (TinyML) have become hot areas of interest and research both in academia and in the applied AI sector.
The goal of TinyML is to create machine learning models that can run on memory- and power-constrained devices without the need for a connection to the cloud.
Above: The architecture of AttendSeg on-device semantic segmentation neural network.
With AttendSeg, the researchers at DarwinAI and the University of Waterloo tried to address the challenges of on-device semantic segmentation.
“The idea for AttendSeg was driven by both our desire to advance the field of TinyML and market needs that we have seen as DarwinAI,” Alexander Wong, co-founder at DarwinAI and Associate Professor at the University of Waterloo, told TechTalks.
“There are numerous industrial applications for highly efficient edge-ready segmentation approaches, and that’s the kind of feedback along with market needs that I see that drives such research.”
The paper describes AttendSeg as “a low-precision, highly compact deep semantic segmentation network tailored for TinyML applications.” The AttendSeg deep learning model performs semantic segmentation at an accuracy that is almost on par with RefineNet while cutting down the number of parameters to 1.19 million. Interestingly, the researchers also found that lowering the precision of the parameters from 32 bits (4 bytes) to 8 bits (1 byte) did not result in a significant performance penalty while enabling them to shrink the memory footprint of AttendSeg by a factor of four. The model requires just over one megabyte of memory, which is small enough to fit on most edge devices.
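The memory figures quoted in the article follow directly from parameters times bytes per parameter, which a quick back-of-the-envelope check confirms:

```python
# Memory footprint ~ number of parameters x bytes per parameter,
# using the figures reported in the article.

def model_memory_mb(parameters, bytes_per_param):
    return parameters * bytes_per_param / 1e6

refinenet = model_memory_mb(85_000_000, 4)     # 85M params at 32-bit precision
attendseg_32 = model_memory_mb(1_190_000, 4)   # 1.19M params at 32-bit
attendseg_8 = model_memory_mb(1_190_000, 1)    # quantized to 8-bit: 4x smaller

print(refinenet, attendseg_32, attendseg_8)    # 340.0, 4.76, 1.19 (MB)
```

That is why RefineNet needs at least 340 MB just to hold its weights, while 8-bit AttendSeg fits in a little over one megabyte.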
“[8-bit parameters] do not pose a limit in terms of generalizability of the network based on our experiments, and illustrate that low precision representation can be quite beneficial in such cases (you only have to use as much precision as needed),” Wong said.
Above: Experiments show AttendSeg provides optimal semantic segmentation while cutting down the number of parameters and memory footprint.
Attention condensers for computer vision
AttendSeg leverages “attention condensers” to reduce model size without compromising performance. Self-attention mechanisms are techniques that improve the efficiency of neural networks by focusing on the information that matters most. Self-attention techniques have been a boon to the field of natural language processing.
They have been a defining factor in the success of deep learning architectures such as Transformers. While previous architectures such as recurrent neural networks had a limited capacity on long sequences of data, Transformers used self-attention mechanisms to expand their range. Deep learning models such as GPT-3 leverage Transformers and self-attention to churn out long strings of text that ( at least superficially ) maintain coherence over long spans.
AI researchers have also leveraged attention mechanisms to improve the performance of convolutional neural networks. Last year, Wong and his colleagues introduced attention condensers as a very resource-efficient attention mechanism and applied them to image classifier machine learning models.
“[Attention condensers] allow for very compact deep neural network architectures that can still achieve high performance, making them very well suited for edge/TinyML applications,” Wong said.
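To see the mechanism that attention condensers make cheaper, here is a minimal scaled dot-product self-attention in plain Python. This is illustrative only: it is the full pairwise formulation, whereas attention condensers use a more compact, self-contained formulation; the input vectors are made up.

```python
import math

def softmax(xs):
    # numerically stable softmax; the weights it produces sum to 1
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(x):
    # x: list of feature vectors; for simplicity queries = keys = values = x
    d = len(x[0])
    out = []
    for q in x:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in x]
        weights = softmax(scores)    # how much each position "matters" to q
        out.append([sum(wt * v[j] for wt, v in zip(weights, x)) for j in range(d)])
    return out

x = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
y = self_attention(x)
w = softmax([sum(a * b for a, b in zip(x[0], k)) / math.sqrt(2) for k in x])
print([round(v, 3) for v in w])   # attention weights for the first position
```

Each output vector is a weighted mix of all inputs, with more weight on the positions most similar to the query; that selective mixing is what "focusing on information that matters" means in practice.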
Above: Attention condensers improve the performance of convolutional neural networks in a memory-efficient way.
Machine-driven design of neural networks
One of the key challenges of designing TinyML neural networks is finding the best performing architecture while also adhering to the computational budget of the target device.
To address this challenge, the researchers used “ generative synthesis ,” a machine learning technique that creates neural network architectures based on specified goals and constraints. Basically, instead of manually fiddling with all kinds of configurations and architectures, the researchers provide a problem space to the machine learning model and let it discover the best combination.
“The machine-driven design process leveraged here (Generative Synthesis) requires the human to provide an initial design prototype and human-specified desired operational requirements (e.g., size, accuracy, etc.) and the MD design process takes over in learning from it and generating the optimal architecture design tailored around the operational requirements and task and data at hand,” Wong said.
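The outer loop of such a search can be sketched in a few lines. This is a heavily simplified stand-in, not Generative Synthesis itself (which learns from the prototype rather than sampling blindly); the design knobs, cost model, and accuracy proxy are all invented for illustration.

```python
import random

# Sketch: search candidate configurations against human-specified
# operational requirements (here, a parameter budget), keeping the best
# candidate that satisfies the constraint.

random.seed(7)

def candidate():
    # hypothetical design knobs for a small segmentation network
    return {
        "depth": random.choice([4, 8, 12]),
        "width": random.choice([16, 32, 64]),
        "bits": random.choice([8, 32]),
    }

def param_count(c):
    return c["depth"] * c["width"] ** 2   # stand-in cost model

def score(c):
    # stand-in for validation accuracy: deeper/wider helps, plus noise
    return 0.5 + 0.02 * c["depth"] + 0.001 * c["width"] + random.uniform(-0.01, 0.01)

BUDGET = 10_000   # operational requirement: maximum parameters

best, best_score = None, -1.0
for _ in range(200):
    c = candidate()
    if param_count(c) > BUDGET:   # violates the requirement: discard
        continue
    s = score(c)
    if s > best_score:
        best, best_score = c, s

print(best, round(best_score, 3))
```

The point is the division of labor: the human states the requirements (size, accuracy), and the machine explores the design space within them.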
For their experiments, the researchers used machine-driven design to tune AttendSeg for Nvidia Jetson, hardware kits for robotics and edge AI applications. But AttendSeg is not limited to Jetson.
“Essentially, the AttendSeg neural network will run fast on most edge hardware compared to previously proposed networks in literature,” Wong said. “However, if you want to generate an AttendSeg that is even more tailored for a particular piece of hardware, the machine-driven design exploration approach can be used to create a new highly customized network for it.”
AttendSeg has obvious applications for autonomous drones, robots, and vehicles, where semantic segmentation is a key requirement for navigation. But on-device segmentation can have many more applications.
“This type of highly compact, highly efficient segmentation neural network can be used for a wide variety of things, ranging from manufacturing applications (e.g., parts inspection / quality assessment, robotic control) medical applications (e.g., cell analysis, tumor segmentation), satellite remote sensing applications (e.g., land cover segmentation), and mobile application (e.g., human segmentation for augmented reality),” Wong said.
Ben Dickson is a software engineer and the founder of TechTalks. He writes about technology, business, and politics.
This story originally appeared on Bdtechtalks.com.
Copyright 2021 VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact.
"
|
15,680 | 2,018 |
"Chatbots are overhauling how brands and customers interact (VB Live) | VentureBeat"
|
"https://venturebeat.com/2018/06/20/chatbots-are-overhauling-how-brands-and-customers-interact-vb-live"
|
"Chatbots are overhauling how brands and customers interact (VB Live)
Chatbots are changing the way companies and customers interact, helping to create powerful, engaging human-to-AI agent experiences. Join execs from Lyft, Western Union and Kaleido Insights when you catch up on this VB Live event and learn how leading brands are using chatbots — you’ll leave with actionable ideas on implementing AI-powered customer support.
Access on demand for free right here.
Chatbots are the most accessible way of leveraging the artificial intelligence explosion. Leading brands like Lyft and Western Union are leading the way in chatbot innovation, using them to resolve issues and make customer experiences more rapid and effective.
And shipments of virtual digital assistant systems are beginning to touch every industry, from the banking and financial services space, to travel, retail, education, and more. The market is estimated to grow to $7.7 billion by 2025, says Jessica Groopman, industry analyst and founding partner at Kaleido Insights.
But as companies launch headfirst into building out these systems, they can run up against a number of considerations, particularly around balancing anticipation of customer needs versus the complexity of the bot’s ability to converse.
“We’ve found that it’s really important to understand where the customer is coming from,” says Jaime Gilliam-Swartz, Senior Director, Voice of the Customer and Shared Services at Lyft.
That means feeding all the data garnered in their passenger interactions, including where they are in their ride experience, any reservation issues, any driver issues, and more into the chatbot’s algorithms to continue to refine and optimize it, improving the chatbot’s ability to understand and resolve queries. That data informs the bot’s questions for the passenger and predicts their needs in order to resolve any queries quickly.
At Western Union, the bot is integrated into the company’s service functions, along with the ability to go into their digital send money flow.
“It’s all about learning from the customer interactions around those service functions and making them better as time goes on, and expanding into new service questions,” says Stanley Yung, Western Union’s chief customer experience officer.
It almost sounds like small potatoes — but Groopman sees these kinds of chatbot functionalities as foundational.
“We often see companies who have great ambitions to apply AI across every possible customer touch point and end up reeling things back to a very narrow, specific demographic or line of questioning once the rubber meets the road,” Groopman explains.
But they’re also seeing AI use cases and applications starting to proliferate among three main axes: vision-based, or spatial awareness; language-based, including voice and text; and analytics, or cognitive-based. In customer experience or customer support contexts, that means a whole host of things: image recognition, sentiment analysis, understanding emotions or responses of individual customers as they move through a retail environment, resolution analysis, the kinds of actions or triage or content most efficiently able to resolve a customer issue, behavioral analysis, image analysis, object recognition, and trend prediction.
Visual commerce has been an especially interesting inroad with chatbots, Groopman says, where a customer’s clicks or interactions with specific images help to train or inform the next steps in that customer support or customer experience or even sales interaction. And the possibilities are growing.
“At Western Union, the chatbot we have today is pretty limited in scope,” Yung says. “But as we think about opportunities going forward, scalability and cost are big motivators, and the customer experience benefits are an additional layer that is very intriguing to us at this point.”
Consistency across channels is a big deal for the company, from POS locations and digital interfaces to call centers and chat. AI is unlocking the ability to give all customers uniform and reliable service, in the most actionable and useful way possible.
They’re also looking at customization potential, particularly around language and different dialects and the speed of the interaction, as well as super-targeted cross-selling and upselling, based upon what they know from their customer database and what they can learn from the conversation.
“The ability to learn from the interaction and tailor it is going to be a big deal for us,” Yung says.
Gilliam-Swartz agrees.
“We are finding that [Lyft] passengers really value personalization and attentiveness,” she says. And it’s unlocked a whole new way to engage with their customers.
The presumption was, if the app allowed a customer to tell them something about the ride, they could potentially anticipate an issue and resolve it within an average of 65 seconds. But they found that even when the issue was resolved, many passengers would say something along the lines of thank you for helping me, I appreciate the refund, but I actually really want to tell you what happened, in a longer discourse. The company actually had to add the capability within the interactive help to collect all that rich feedback.
“It shows why decision trees just won’t work by themselves,” explains Gilliam-Swartz. “It is important to be able to interact and look at some level of natural language processing so that we can interpret what the customer is telling us and then take the right action and confirm to them that we will take it based on what we’ve helped them with.”
To learn more about how chatbots can offer rich data mines, bigger and better customer experiences, cost savings from economies of scale and more, don’t miss watching this VB Live event.
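As a toy illustration of why free text needs more than a fixed decision tree, a crude keyword-based intent scorer shows the kind of interpretation step involved. The intent names and vocabularies below are entirely hypothetical; production systems use trained language models, not keyword lists.

```python
# Toy intent routing: score a free-text message against keyword sets
# and pick the best-matching intent (illustrative only).

INTENT_KEYWORDS = {
    "refund_request": {"refund", "charge", "money", "overcharged"},
    "driver_feedback": {"driver", "rude", "route", "detour", "happened"},
    "lost_item": {"left", "lost", "phone", "wallet"},
}

def classify(message):
    words = set(message.lower().replace(",", " ").replace(".", " ").split())
    scores = {intent: len(words & kws) for intent, kws in INTENT_KEYWORDS.items()}
    return max(scores, key=scores.get)

msg = "Thanks for the refund, but I want to tell you what happened with the driver"
print(classify(msg))   # routed to driver feedback, not the refund flow
```

A rigid decision tree would have stopped at "refund"; even this crude scorer sees that the customer's real topic is the driver.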
Don’t miss out! Access on demand for free here.
Attend this webinar to learn:
- The real truth about AI and customer service
- How big brands are using AI and chatbots to resolve customer issues more quickly and effectively
- How bots and humans can work together to optimize customer support
- How to assess whether your organization is bot-ready
Speakers:
- Stanley Yung, Chief Customer Experience Officer, Western Union
- Jaime Gilliam-Swartz, Senior Director, Voice of the Customer and Shared Services, Lyft
- Jessica Groopman, Industry Analyst & Founding Partner, Kaleido Insights
- Rachael Brownell, Moderator, VentureBeat
Sponsored by Chatkit
"
|
15,681 | 2,021 |
"AI call center automation company Asapp raises $120M | VentureBeat"
|
"https://venturebeat.com/2021/05/19/ai-call-center-automation-company-asapp-raises-120m"
|
"AI call center automation company Asapp raises $120M
Asapp , an AI research-driven customer experience company, today announced it has raised $120 million at a $1.6 billion valuation, double its previous post-money valuation. Asapp says it plans to use the capital to support expansion after a year in which it grew roughly twofold.
With customer representatives increasingly required to work from home in Manila , the U.S., and elsewhere, companies are turning to AI to bridge resulting gaps in service. The solutions aren’t perfect — humans are needed even when chatbots are deployed — but COVID-19 has accelerated the need for AI-powered contact center messaging. A Deloitte study found that 56% of businesses in the multimedia and tech sectors have plans to invest in contact center AI technology in the near future. And according to McKinsey, 29% of customer service agent duties have the potential to be automated with technology.
Founded in 2014, Asapp, which claims to have the largest collection of Ph.D. students studying machine learning-powered customer experiences, applies what it calls “self-learning” techniques to solve challenges without rules programming, working to identify opportunities to add to micro-automations that might help call center agents. The company focuses on improving enterprise performance with advances in AI that augment human activity and automate workflows.
One of the ways Asapp aims to increase agent efficiency is with AI-driven suggestions of what to say and do to resolve issues quickly, even as agents juggle multiple conversations. Machine learning identifies opportunities for automation before, during, and after agent engagement, as well as flexibly setting agent capacity based on intent, complexity of conversations, customer responsiveness, agent experience, and more.
In sales scenarios, Asapp prioritizes prime prospects — those who are most likely to buy particular products or services. Machine learning models predict who these prospects are and place them highest in the queue, providing agents with AI-generated suggestions for what to say and do. Asapp also gives managers real-time performance visibility across queues, helping identify agents that need help. And it identifies opportunities to streamline agent efforts with journey mapping, which tracks what agents are doing.
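The queue-ordering idea reduces to a simple rule: score each waiting prospect with a predicted purchase probability and serve the highest first. The sketch below is illustrative (the customer ids and probabilities are made up, and Asapp's actual models are far richer):

```python
# Toy "prime prospect" queue: order waiting customers by a model's
# predicted purchase probability, highest first.

def rank_queue(prospects):
    # prospects: list of (customer_id, predicted_purchase_probability)
    return sorted(prospects, key=lambda p: p[1], reverse=True)

queue = [("cust_a", 0.12), ("cust_b", 0.81), ("cust_c", 0.45)]
ordered = rank_queue(queue)
print([cid for cid, _ in ordered])   # ['cust_b', 'cust_c', 'cust_a']
```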
Asapp provides AI-predicted summary notes that highlight details like addresses. In addition, it applies AI that understands customer intent, with real-time sentiment analysis and machine learning prediction of satisfaction scores.
Benefits of call center automation
By some estimates, businesses can reduce their customer service spending 30% by introducing AI-powered chatbots and virtual agents. An analyst report from Juniper Research found that by 2022, businesses are expected to save a whopping $8 billion in customer support costs — a jump from the $20 million saved through automation in 2017.
There’s no shortage of competition in the AI-driven call center analytics space.
Gong offers an intelligence platform for enterprise sales teams and recently nabbed $200 million in funding at a $2.2 billion valuation.
Observe.ai snagged $26 million in December for AI that monitors and coaches call center agents. AI call center startups Cogito and CallMiner have also staked claims alongside more established players like Amazon, Microsoft, and Google.
But Asapp is one of the few companies advancing research and development in AI and its application for customer experience, founder and CEO Gustavo Sapoznik asserts. “In an environment where customer expectations are rising, Asapp is helping large enterprises advance digital engagement, real-time voice transcription, speech analytics, live agent coaching, and analytics,” he added. “The customer experience industry is at a crossroads. After years of interactive voice response systems and bot investments, customer satisfaction is down and costs have increased. We apply our AI research to make people in contact centers wildly more productive because existing rules-based technology and architectures limit companies to small improvements that can’t bridge the digital transformation opportunity that AI is enabling and delivering.”
Fidelity and Dragoneer participated in Asapp’s round, alongside existing investors John Doerr, March Capital, Emergence Capital, Euclidean Capital, HOF Capital, Telstra Ventures, and 40 North Ventures. This series C brings the New York-based company’s total raised to $400 million.
"
|
15,682 | 2,020 |
"MIT presents AI frameworks that compress models and encourage agents to explore | VentureBeat"
|
"https://venturebeat.com/2020/04/28/mit-presents-ai-frameworks-that-compress-models-and-encourage-agents-to-explore"
|
"MIT presents AI frameworks that compress models and encourage agents to explore
Above: Pepper the robot from Softbank Robotics
In a pair of papers accepted to the International Conference on Learning Representations (ICLR) 2020, MIT researchers investigated new ways to motivate software agents to explore their environment and pruning algorithms to make AI apps run faster. Taken together, the twin approaches could foster the development of autonomous industrial, commercial, and home machines that require less computation but are simultaneously more capable than products currently in the wild. (Think an inventory-checking robot built atop a Raspberry Pi that swiftly learns to navigate grocery store aisles, for instance.)

‘Curiosity’ algorithms

One team created a meta-learning algorithm that generated 52,000 exploration algorithms, or algorithms that drive agents to widely explore their surroundings. Two of those it identified were entirely new and resulted in exploration that improved learning in a range of simulated tasks — from landing a moon rover and raising a robotic arm to moving an ant-like robot.
The team’s meta-learning system began by choosing a set of high-level operations (e.g., basic programs, machine learning models, etc.) to guide an agent to perform various tasks, like remembering previous inputs, comparing and contrasting current and past inputs, and using learning methods to change its own modules. Sourcing from nearly three dozen operations in total, the meta-learning system combined up to seven at a time to create computation graphs describing the aforementioned 52,000 algorithms.
Testing all of the algorithms would have required decades, so the coauthors limited their search for the best by eliminating algorithms predicted to perform poorly based on their code structure. Then the team tested the most promising candidates on a basic grid-level navigation task that required substantial exploration but minimal computation. The performance of candidates that did well became the new benchmark, eliminating even more candidates as time went on.
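The article does not include the team's code, but the shape of the search — generate combinations of operations, discard candidates that a cheap structural heuristic predicts will do poorly, then evaluate the survivors on an inexpensive task — can be sketched in a few lines. Everything below (the operation names and both scoring functions) is an illustrative stand-in, not the paper's implementation:

```python
import itertools

# Toy stand-ins for the paper's high-level operations (basic programs,
# ML modules, memory of past inputs, etc.).
OPERATIONS = [f"op{i}" for i in range(12)]

def predicted_quality(graph):
    # Cheap pre-filter based only on structure (the paper eliminates
    # algorithms predicted to perform poorly from their code alone).
    # Here: simply prefer shorter compositions.
    return 1.0 / len(graph)

def evaluate_on_task(graph):
    # Expensive stand-in for running the candidate on a simple
    # grid-navigation task; a deterministic toy score in [0, 1).
    return sum(sum(ord(c) for c in op) % 97 for op in graph) / (97.0 * len(graph))

def meta_search(max_ops=3, keep=50):
    # 1) Generate candidate "computation graphs": combinations of operations.
    candidates = [c for n in range(1, max_ops + 1)
                  for c in itertools.combinations(OPERATIONS, n)]
    # 2) Discard candidates the cheap heuristic predicts will do poorly.
    candidates.sort(key=predicted_quality, reverse=True)
    survivors = candidates[:keep]
    # 3) Evaluate survivors, keeping the best so far as the bar to beat.
    best, best_score = None, -1.0
    for graph in survivors:
        score = evaluate_on_task(graph)
        if score > best_score:
            best, best_score = graph, score
    return best, best_score
```

The real system searched 52,000 graphs on four machines for over 10 hours; the point here is only the two-stage structure (cheap elimination before expensive evaluation).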
According to the researchers, four machines searched for over 10 hours to find the best algorithms. Over 100 were high-performing, and the top 16 were both useful and novel, performing as well as (or better than) human-designed algorithms.
The team attributes the top 16 models’ performance to the two exploration functions they share. In the first, an agent is rewarded for visiting new places where it has a greater chance of making a move. In the second, an AI model learns to predict the future state of an agent while a second model recalls its past, and they work in tandem to predict the present such that if the prediction is erroneous, both reward themselves as a sign that they have discovered something new.
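The first of those two reward functions — paying the agent for visiting novel states — resembles a classic count-based exploration bonus. A minimal sketch (the state encoding and decay schedule are illustrative assumptions, not the discovered algorithm itself):

```python
import math
from collections import defaultdict

class CountBasedCuriosity:
    """Intrinsic reward for visiting rarely seen states.

    The bonus decays as 1 / sqrt(visit count), so revisiting the same
    state quickly stops paying off and the agent is pushed elsewhere.
    """

    def __init__(self, scale=1.0):
        self.visits = defaultdict(int)
        self.scale = scale

    def intrinsic_reward(self, state):
        self.visits[state] += 1
        return self.scale / math.sqrt(self.visits[state])

# First visit to a grid cell earns the full bonus; the fourth earns half.
curiosity = CountBasedCuriosity()
bonuses = [curiosity.intrinsic_reward((0, 0)) for _ in range(4)]
```

The second reward function (two models predicting each other's view of the present and rewarding disagreement) would need learned predictors and is omitted here.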
The researchers note that because the meta-learning process generates high-level computer code as output, both algorithms can be dissected to peer inside their decision-making processes. “The algorithms we generated could be read and interpreted by humans, but to actually understand the code we had to reason through each variable and operation and how they evolve with time,” said MIT graduate student Martin Schneider in a statement. He coauthored the study with fellow graduate student Ferran Alet and MIT professors of computer science and electrical engineering Leslie Kaelbling and Tomás Lozano-Pérez. “It’s an interesting open challenge to design algorithms and workflows that leverage the computer’s ability to evaluate lots of algorithms and our human ability to explain and improve on those ideas.”

Shrinking AI models

In the second of the two studies, an MIT team describes a framework that reliably compresses models so that they’re able to run on resource-constrained devices. While the researchers admit that they don’t understand why it works as well as it does, they claim it’s easier and faster to implement than other compression methods, including those that are considered state of the art.
The framework is an outgrowth of the “Lottery Ticket Hypothesis,” a paper showing that a model can perform well with 90% fewer elements if the right submodel is identified during training. The coauthors of this study — who not-so-coincidentally authored “Lottery Ticket Hypothesis” — propose “rewinding” a pruned model’s remaining parameters (i.e., configuration variables internal to the model whose values can be estimated from the given data) to their values from an earlier point in training before retraining it. Such pruning methods typically cause models to become less accurate over time, but this one manages to restore them to nearly their original accuracy.
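The recipe — train, prune by magnitude, reset the surviving weights to their early-training values, then retrain with the pruned weights held at zero — can be illustrated with a toy one-layer "model." The training procedure below is a caricature (weights relax toward a fixed optimum rather than following real SGD), and all numbers are arbitrary:

```python
TARGETS = [1.0, -2.0, 0.05, 3.0, -0.01, 0.5]  # toy optimum for 6 weights

def train(weights, steps, lr=0.3):
    # Caricature of SGD: each step moves every weight toward its target.
    for _ in range(steps):
        weights = [w + lr * (t - w) for w, t in zip(weights, TARGETS)]
    return weights

def prune_mask(weights, sparsity=0.5):
    # Magnitude pruning: mask out the smallest-magnitude weights.
    k = int(len(weights) * sparsity)
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    dropped = set(order[:k])
    return [0.0 if i in dropped else 1.0 for i in range(len(weights))]

# 1) Train the dense model, snapshotting an *early* training state.
init = [0.0] * len(TARGETS)
early = train(list(init), steps=1)    # weights shortly after initialization
final = train(list(init), steps=50)   # fully trained weights

# 2) Choose what to prune based on the magnitudes of the final weights.
mask = prune_mask(final, sparsity=0.5)

# 3) Rewind surviving weights to their early values and retrain with the
#    mask applied, so pruned weights stay at zero throughout.
weights = [m * w for m, w in zip(mask, early)]
for _ in range(50):
    weights = [m * w for m, w in zip(mask, train(weights, steps=1))]
```

Despite losing half its weights, the retrained sparse model lands back at the surviving targets, which is the qualitative behavior the paper reports at much larger scale.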
That’s good news for the broader AI research field, whose accessibility and sustainability issues remain for the most part unresolved. Last June, researchers at the University of Massachusetts at Amherst released a study estimating that the amount of power required for training and searching a certain model involves the emission of roughly 626,000 pounds of carbon dioxide — equivalent to nearly 5 times the lifetime emissions of the average U.S. car. And according to a recent Synced report , the University of Washington’s Grover machine learning model, which is designed to both generate and detect fake news, cost $25,000 to train over the course of two weeks.
“I’m happy to see new pruning and retraining techniques evolve,” said MIT assistant professor Song Han, who built the industry-standard pruning algorithm AMC but wasn’t involved with this particular study. He recently coauthored a paper describing an AI training technique that improves efficiency with a large model comprising many pretrained submodels that can be tailored to a range of platforms. “[It will give] more people access to high-performing AI applications.” MIT Ph.D. student Alexa Renda coauthored the work with MIT assistant professor and fellow Ph.D. student Jonathan Frankle. Both are members of MIT’s Computer Science and Artificial Science Laboratory (CSAIL).
"
|
15,683 | 2,020 |
"Allen Institute researchers find pervasive toxicity in popular language models | VentureBeat"
|
"https://venturebeat.com/2020/09/25/allen-institute-researchers-find-pervasive-toxicity-in-popular-language-models"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Allen Institute researchers find pervasive toxicity in popular language models Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Researchers at the Allen Institute for AI have created a data set — RealToxicityPrompts — that attempts to elicit racist, sexist, or otherwise toxic responses from AI language models, as a way of measuring the models’ preferences for these responses. In experiments, they claim to have found that no current machine learning technique sufficiently protects against toxic outputs, underlining the need for better training sets and model architectures.
It’s well-established that models amplify the biases in data on which they were trained. That’s problematic in the language domain, because a portion of the data is often sourced from communities with pervasive gender, race, and religious prejudices. AI research firm OpenAI notes that this can lead to placing words like “naughty” or “sucked” near female pronouns and “Islam” near words like “terrorism.” Other studies, like one published by Intel, MIT, and Canadian AI initiative CIFAR researchers in April, have found high levels of stereotypical bias from some of the most popular models, including Google’s BERT and XLNet , OpenAI’s GPT-2 , and Facebook’s RoBERTa.
The Allen Institute researchers designed RealToxicityPrompts to measure the risk of “toxic degeneration” by pretrained language models, or models fed data sets containing thousands to billions of documents. They compiled a list of 100,000 naturally occurring prompts extracted from a large corpus of English Reddit text (the open source Open-WebText Corpus) and paired it with toxicity scores from Google’s Perspective API, which uses machine learning models to detect the potential toxicity of a comment.
The coauthors evaluated five language models using RealToxicityPrompts, specifically three models from OpenAI (GPT-1, GPT-2, and GPT-3) and two models from Salesforce (CTRL and CTRL-Wiki). They found that while toxic prompts — prompts offensive or stereotypically biased on their face — were 70% or more likely to yield toxic content from the language models, even non-toxic prompts resulted in offensive responses. The results show that all models were 49% or more likely to answer non-toxic content with toxic responses, even models like CTRL-Wiki that were only trained on Wikipedia data.
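The headline metric here is essentially "how often does a prompt lead to at least one toxic continuation." A minimal sketch of that bookkeeping, assuming toxicity scores (in the study, from Google's Perspective API) are already attached to each sampled continuation:

```python
def degeneration_rate(results, threshold=0.5):
    """Fraction of prompts with at least one continuation scored as toxic.

    `results` maps each prompt to the toxicity scores (in [0, 1]) of its
    sampled continuations; in the study those scores come from Google's
    Perspective API, but here they are just numbers supplied by the caller.
    """
    toxic = sum(1 for scores in results.values()
                if any(s >= threshold for s in scores))
    return toxic / len(results)

# Two prompts with three sampled continuations each.
sample = {
    "prompt A": [0.10, 0.72, 0.05],  # one continuation crosses the bar
    "prompt B": [0.08, 0.12, 0.30],  # none do
}
rate = degeneration_rate(sample)  # 0.5
```

The paper's figures (70% for toxic prompts, 49% or more for non-toxic ones) are rates of exactly this kind, computed over the 100,000 prompts in the dataset.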
To uncover the potential reasons for this, the researchers investigated the corpora used to pretrain several of the language models: OpenAI-WT (GPT-2’s training data) and OWTC (an open source fork of OpenAI-WT). OWTC contains text from Reddit posts with a karma of 3 or higher and 38GB of English documents, including news articles. OpenAI-WT — which has a 29% overlap with OWTC, such that at least 2.3 million documents in OpenAI-WT also appear in OWTC — contains about 8 million documents filtered using a blocklist of sexually explicit and otherwise offensive subreddits.
The researchers found that OWTC and OpenAI-WT contain “non-negligible” amounts of toxicity as identified by the Perspective API. About 2.1% of documents in OWTC were offensive compared with 4.3% in OpenAI-WT, or twice that of OWTC despite the blocklist. Unreliable news sites were another major source of toxicity in the data sets, as were posts from banned or quarantined subreddits. In fact, 63,000 documents in OpenAI-WT and OWTC came from links shared on problematic Reddit communities; GPT-2 was pretrained on at least 40,000 documents from the quarantined /r/The_Donald and 4,000 documents from the banned /r/WhiteRights.
“Overall, our investigations demonstrate that toxicity is a prevalent issue in both neural language generation and web text corpora,” the coauthors wrote in a paper describing their work. “Although they show some reduction in toxicity, steering methods do not fully protect neural models from toxic degeneration. Additionally, the corpora that language models are pretrained on contain non-negligible amounts of toxic, abusive, and untrustworthy content.”
"
|
15,684 | 2,021 |
"AI Weekly: Continual learning offers a path toward more humanlike AI | VentureBeat"
|
"https://venturebeat.com/2021/04/09/ai-weekly-continual-learning-offers-a-path-toward-more-humanlike-ai"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages AI Weekly: Continual learning offers a path toward more humanlike AI Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
State-of-the-art AI systems are remarkably capable, but they suffer from a key limitation: they are static. Algorithms are trained once on a dataset and rarely again, making them incapable of learning new information without retraining. This is in contrast to the human brain, which learns constantly, using knowledge gained over time and building on it as it encounters new information. While there’s been progress toward bridging the gap, solving the problem of “continual learning” remains a grand challenge in AI.
This challenge motivated a team of AI and neuroscience researchers to found ContinualAI, a nonprofit organization and open community of continual and lifelong learning enthusiasts. ContinualAI recently announced Avalanche, a library of tools compiled over the course of a year from over 40 contributors to make continual learning research easier and more reproducible. The group also hosts conference-style presentations, sponsors workshops and AI competitions, and maintains a repository of tutorials, code, and guides.
As Vincenzo Lomonaco, cofounding president and assistant professor at the University of Pisa, explains, ContinualAI is one of the largest organizations focused on a topic its members consider fundamental to the future of AI. “Even before the COVID-19 pandemic began, ContinualAI was founded with the idea of pushing the boundaries of science through distributed, open collaboration,” he told VentureBeat via email. “We provide a comprehensive platform to produce, discuss and share original research in AI. And we do this completely for free, for anyone.”

Even highly sophisticated deep learning algorithms can experience catastrophic forgetting, or catastrophic interference, a phenomenon where deep networks fail to recall what they’ve learned from a training dataset. The result is that the networks have to be constantly reminded of the knowledge they’ve gained or risk becoming “stuck” with their most recent “memories.”

OpenAI research scientist Jeff Clune, who helped to cofound Uber AI Labs in 2017, has called catastrophic forgetting the “Achilles’ heel” of machine learning and believes that solving it is the fastest path to artificial general intelligence (AGI). Last February, Clune coauthored a paper detailing ANML, an algorithm that managed to learn 600 sequential tasks with minimal catastrophic forgetting by “meta-learning” solutions to problems instead of manually engineering solutions. Separately, Alphabet’s DeepMind has published research suggesting that catastrophic forgetting isn’t an insurmountable challenge for neural networks. And Facebook is advancing a number of techniques and benchmarks for continual learning, including a model that it claims is effective in preventing the forgetting of task-specific skills.
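Catastrophic forgetting is easy to reproduce even with a one-parameter "model": fit it to task A, then keep training it only on task B, and its error on task A balloons because nothing anchors the old knowledge. The setup below is purely illustrative (a scalar predictor and made-up data), not any of the systems mentioned above:

```python
def mse(w, data):
    # Mean squared error of a one-parameter "model" that always predicts w.
    return sum((w - x) ** 2 for x in data) / len(data)

def sgd(w, data, steps=200, lr=0.05):
    # Plain gradient descent on the mean squared error.
    for _ in range(steps):
        grad = sum(2 * (w - x) for x in data) / len(data)
        w -= lr * grad
    return w

task_a = [1.0, 1.2, 0.8]   # "task A": values near 1
task_b = [5.0, 4.8, 5.2]   # "task B": values near 5

w = sgd(0.0, task_a)             # learn task A
loss_a_before = mse(w, task_a)   # small (just the data's variance)

w = sgd(w, task_b)               # keep training, but only on task B
loss_a_after = mse(w, task_a)    # large: task A has been "forgotten"
```

Continual learning methods are, in effect, ways of updating on task B without letting the error on task A explode like this.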
But while the past several years have seen a resurgence of research into the issue, catastrophic forgetting largely remains unsolved, according to Keiland Cooper, a cofounding member of ContinualAI and a neuroscience research associate at the University of California, Irvine. “The potential of continual learning exceeds catastrophic forgetting and begins to touch on more interesting questions of implementing other cognitive learning properties in AI,” Cooper told VentureBeat. “Transfer learning is one example, where when humans or animals learn something previously, sometimes this learning can be applied to a new context or aid learning in other domains … Even more alluring is that continual learning is an attempt to push AI from narrow, savant-like systems to broader, more general ones.”

Even if continual learning doesn’t yield the sort of AGI depicted in science fiction, Cooper notes that there are immediate advantages to it across a range of domains. Cutting-edge models are being trained on increasingly larger datasets in search of better performance, but this training comes at a cost — whether waiting weeks for training to finish or the impact of the electricity usage on the environment.
“Say you run a certain AI organization that built a natural language model that was trained over weeks on 45 terabytes of data for a few million dollars,” Cooper explained. “If you want to teach that model something new, well, you’d very likely have to start from scratch or risk overwriting what it had already learned, unless you added continual learning additions to the model. Moreover, at some point, the cost to store that data will be exceedingly high for an organization, or even impossible. Beyond this, there are many cases where you can only see the data once and so retraining isn’t even an option.” While the blueprint for a continual learning AI system remains elusive, ContinualAI aims to connect researchers and stakeholders interested in the area and support and provide a platform for projects and research. It’s grown to over 1,000 members in the three years since its founding.
“For me personally, while there has been a renewed interest in continual learning in AI research, the neuroscience of how humans and animals can accomplish these feats is still largely unknown,” Cooper said. “I’d love to see more of an interaction with AI researchers, cognitive scientists, and neuroscientists to communicate and build upon each of their fields’ ideas toward a common goal of understanding one of the most vital aspects of learning and intelligence. I think an organization like ContinualAI is best positioned to do just that, which allows for the sharing of ideas without the boundaries of the academic or industry walls, siloed fields, or distant geolocation.”

Beyond the mission of disseminating information about continual learning, Lomonaco believes that ContinualAI has the potential to become a reference point for a more inclusive and collaborative way of doing research in AI. “Elite university and private company labs still work mostly behind closed doors, [but] we truly believe in inclusion and diversity rather than selective elitism. We favor transparency and open source rather than protective IP licenses. We make sure anyone has access to the learning resources she needs to achieve her potential.”

For AI coverage, send news tips to Kyle Wiggers — and be sure to subscribe to the AI Weekly newsletter and bookmark our AI channel, The Machine.
Thanks for reading,

Kyle Wiggers
AI Staff Writer
"
|
15,685 | 2,017 |
"How AI, AR, and VR are making travel more convenient | VentureBeat"
|
"https://venturebeat.com/2017/08/03/how-tech-is-making-travels-inconveniences-much-more-convenient"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Guest How AI, AR, and VR are making travel more convenient Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
From 50 ways to leave your lover, as the song goes, to 750 types of shampoos, we live in an endless sea of choices. And although I haven’t been in the market for hair products in a while, I understand the appeal of picking a product that’s just right for you, even if the decision-making is often agonizing. This quandary (the “Goldilocks Syndrome” of finding the option that is “just right”) has now made its way to the travel industry, as the race is on to deliver highly personalized and contextual offers for your next flight, hotel room or car rental.
Technology, of course, is both a key driver and enabler of this brave new world of merchandising in the travel business. But this is not your garden variety relational-databases-and-object-oriented-systems tech. What is allowing airlines, hotels and other travel companies to behave more like modern-day retailers is the clever use of self-learning systems, heuristics trained by massive data sets and haptic-enabled video hardware. Machine learning (ML), artificial intelligence (AI), augmented reality (AR) and virtual reality (VR) are starting to dramatically shape the way we will seek and select our travel experiences.
Let every recommendation be right

AI is already starting to change how we search for and book travel. Recent innovation and investment have poured into front-end technologies that leverage machine learning to fine-tune search results based on your explicit and implicit preferences. These range from algorithms that are constantly refining how options are ranked on your favorite travel website, to apps on your mobile phone that consider past trips, expressed sentiment (think thumbs up, likes/dislikes, reviews) and volunteered information like frequent traveler numbers.
Business travel, as well, is positioned for the application of AI techniques , even if not all advances are visible to the naked eye. You can take photos of a stack of receipts on your smartphone; optical character recognition software codifies expense amounts and currencies, while machine learning algorithms pick out nuances like categories and spending patterns.
AI is also improving efficiencies in many operational systems that form the backbone of travel. Machine learning is already starting to replace a lot of rule-based probabilistic models in airport systems to optimize flight landing paths to meet noise abatement guidelines, or change gate/ramp sequencing patterns to maximize fuel efficiency.
Making decisions based on reality

VR and AR are still changing and evolving rapidly. With many consumer technology giants publicly announcing products this year, we can expect to see rapid early adoption and mainstreaming of these technologies. Just as music, photos, videos and messaging became ubiquitous thanks to embedded capabilities in our phones, future AR and VR applications are likely to become commonplace.
VR offers a rich, immersive experience for travel inspiration, and it is easy to imagine destination content being developed for a VR environment. But VR can also be applied to travel search and shopping. My company, Amadeus, recently demonstrated a seamless flight booking experience that includes seat selection and payment. Virtually “walking” onto an airplane and looking at a specific seat you are about to purchase makes it easier for consumers to make informed decisions, while allowing airlines to clearly differentiate their premium offerings.
AR will probably have a more immediate impact than VR, however, in part due to the presence of advanced camera, location and sensor technology already available today on higher-end smartphones.
Airports are experimenting with beacon technology where an AR overlay would be able to easily and quickly guide you to your tight connection for an onward flight, or a tailored shopping or dining experience if you have a longer layover.
“Any sufficiently advanced technology is indistinguishable from magic,” goes Arthur C. Clarke’s famously quoted third law. But as we expect more authentic experiences: precise search results, an informed booking or an immersive travel adventure, we can count on increasingly magical technology from systems that learn to deliver us our “perfect bowl of porridge.” Rashesh Jethi heads up the Amadeus Research & Development (R&D) team for the Americas.
"
|
15,686 | 2,020 |
"Retail’s adapt-or-die moment: How low-code apps can speed transformation | VentureBeat"
|
"https://venturebeat.com/2020/12/14/retails-adapt-or-die-moment-how-low-code-apps-can-speed-transformation"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Sponsored Retail’s adapt-or-die moment: How low-code apps can speed transformation Share on Facebook Share on X Share on LinkedIn Presented by FollowAnalytics In 2013 billionaire tech entrepreneur and leading Silicon Valley investor Marc Andreessen predicted a high-drama outcome for retail.
As he put it: “Retail guys are going to go out of business, and ecommerce will become the place everyone buys. You are not going to have a choice.” Fast forward, and his prediction is more direct than dire. The global pandemic has turned the “pre-death” stage for retail into a potential extinction moment for many major brands and chains.
Today, we know retail’s New Normal is mobile

The meteoric growth of mobile commerce has prompted analysts to readjust forecasts upwards, proclaiming 2020 the biggest year ever for mobile and apps. Research firm eMarketer expects mobile shopping sales to reach $314 billion in 2020, which is $200 billion more than four years ago and represents over 44% of all ecommerce sales. App store intelligence provider App Annie reports U.S. consumers will spend 1 billion hours shopping on Android devices alone, a 50% increase from Q4 2019.
The holiday shopping season has further accelerated the shift to smartphones and apps. Preliminary data from Adobe Analytics shows U.S. consumers spent $6.3 million per minute online on Black Friday alone. Spending on smartphones surged 25.3% year over year to reach $3.6 billion, accounting for 40% of total online spend.
Meanwhile, mobile video advertising company AdColony highlights the pivotal place of mobile apps in the path-to-purchase. It reports that almost half (46%) of holiday shoppers in North America chose in-app over a mobile browser. Ease of use and simple navigation top the list of features that keep consumers browsing and buying on their mobile apps (50%), the research says. An easy payment process is important for 25% of consumers, followed by exclusive offers and coupons (14%).
The findings spotlight the key features (or lack of them) that will make or break retailers this holiday season — and beyond.
But there’s always a bright future for the best-in-class

The high-performing companies leading the way deliver mobile app experiences that support shoppers at every step of the journey, from consideration to conversion. The wisdom of this approach is clear in light of Deloitte’s findings that show mobile apps are the starting point for nearly half (46%) of shoppers to do research before making a purchase decision.
Market-leading retailers also embrace what I call an “always-on innovation” mindset. They add features such as BOPIS (buy online, pick up in store) and deep integration with leading digital commerce platforms such as Shopify, Salesforce Commerce Cloud, and Magento that anticipate, as well as answer, shopper requirements.
How can retailers adapt their apps to encourage interaction, drive transactions, and consistently provide value that will keep customers coming back? The lessons I’ve learned from working with our FollowAnalytics customers are a helpful guide.
As inspiration for companies during year-end strategy planning, I outline the 3Fs, steps consumer-facing brands and retailers can follow to architect an app experience that drives customer connection and conversions.
1. Friction-free: Reduce hassle to accelerate revenues
Apps empower customers to browse and buy on their terms. Free of high-friction physical boundaries such as checkout lanes or parking lots, consumers no longer need to go shopping.
They are always shopping.
The good news: Shopping in the digital realm is growing nearly 5x faster than physical store sales. The not-so-good news: Consumers are continually raising the bar on customer experience. One way to meet expectations is with a simple shortcut that allows shoppers to order via the app and then pick up their purchase at or just outside the store.
Retail reimagined: The new era for customer experience, a new report from Periscope by McKinsey, tells us the appeal of this option among shoppers in the U.S. “increased by 13 percentage points.” Internal data at FollowAnalytics supports the massive positive impact, highlighting record demand for features that help shoppers manage time, not waste it.
Jessica Alba’s The Honest Company, which creates and sells eco-friendly and convenient products for babies and homes, leveraged the FollowAnalytics platform to quickly turn its ecommerce environment into a mobile app experience low on friction and high on engagement. From a custom user interface to the integration with Salesforce Commerce Cloud, the retailer provides a prime example of how apps adapt to customer requirements because they must.
Taking the friction out of the journey isn’t a nice-to-have. It’s what it takes to meet ever-evolving customer experience requirements and remain relevant.
2. Feature-rich: Adapt to customer preferences to drive returns
From payment systems that let shoppers check out with Apple Pay and Google Pay to augmented reality (AR) features that let consumers try on clothing or test products, baking the right features into your app experience (in the right combination) unlocks revenues.
Market-leading brands and retailers need no convincing. A great example is Nike. The largest seller of athletic footwear and athletic apparel in the world activated its digital community by offering virtual workouts and saw an 80% increase in weekly active users of its mobile app. But market giants aren’t the only ones winning big by adding key features at the speed of change.
In some cases, features are a source of competitive advantage that let smaller companies play in the major leagues. A prime example is FITNESS SF.
The locally owned and operated fitness center is punching far above its weight thanks to a hybrid app that makes virtual training truly effective and engaging. Livestream classes, one-on-one virtual training, and personalized meal recommendations top a detailed list of crowd-pleasing features that have enabled the company to keep customers coming back.
FITNESS SF Vice President Don Dickerson told Low Code Ninjas (a FollowAnalytics podcast) that the app also allows the company to expand its offer and ecosystem. “Nutrition is a huge part of fitness and now we’re able to deliver recipes and nutritional advice, a meal plan, and the ability to order online all through our mobile app.” Moving forward, Dickerson sees an opportunity to “become a lifestyle brand” and take a more active role in customers’ lives. It’s an ambitious goal — and well within reach, because the pandemic has taught FITNESS SF and many of our customers to expect and embrace the unexpected.
You don’t know what’s coming tomorrow, so it’s critical to operate in a low-code environment where you can iterate at a very rapid clip.
3. Fast: Innovate at the speed of change
Digital transformation has been accelerated by a factor of 10, obliterating the linear path-to-purchase. Recent data from McKinsey and Company reveals consumers are switching brands at an unprecedented rate. The outcome is a “shattering of brand loyalties.” But that’s just one of the behavior shifts putting the squeeze on retail. Brands and retailers are also under pressure to develop a dynamic way of matching the experience their app offers to what their customers want and need.
That’s where low-code puts companies on the fast track to create what Seth Winters, VP of Digital Innovation at doTERRA, a manufacturer and retailer of essential oils and nutritional supplements, calls a “blended experience.” In this scenario, retailers don’t reinvent shopping. They reorient the experience by harnessing hybrid technology to deploy tailored experiences that consumers appreciate at the moment of inspiration.
It enables retailers to showcase offers and features consumers crave. More importantly, it equips companies to achieve positive results in record time. In a recent interview with Low Code Ninjas, Winters recalls that low-code shaved nearly four years off the time needed to deliver shoppers a genuinely useful app.
That would have been a long wait in a market where timing is everything. To complicate matters further, retailers are competing with all companies everywhere on the planet for the talent and tools to accelerate digital transformation.
That, according to IDC, is going to require a massive number of new apps in an incredibly short time. The global provider of market intelligence estimates 500 million digital apps and services will be created by 2023. That’s the same number of apps developed in the last 40 years. Given the impending app gap that will see some businesses left behind in the race to innovate, low-code offers a faster pathway to profits.
Microsoft Chief Executive Officer Satya Nadella takes it a step further. He advises enabling a new category of developers equipped with “tools that are low-code or no-code to create solutions that solve their unique business needs.”
We have vaulted ten years ahead in consumer and business digital penetration in less than three months. The question is no longer: Is a mobile app a must? The question now is: How quickly can companies deliver in order to extract the maximum value from their mobile app? We present a helpful blueprint and actionable advice in our Low Code Explosion report. It’s an essential read and a fast one as well ;-)
Samir Addamine is Founder of FollowAnalytics.
Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. Content produced by our editorial team is never influenced by advertisers or sponsors in any way. For more information, contact [email protected].
The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat.
All rights reserved.
"
|
15,687 | 2,020 |
"How RPA could help distribute and track COVID-19 vaccines | VentureBeat"
|
"https://venturebeat.com/2020/12/27/how-rpa-could-help-distribute-and-track-covid-19-vaccines"
|
"How RPA could help distribute and track COVID-19 vaccines
A computer image of the type of virus linked to COVID-19.
The first shipments of Pfizer’s and Moderna’s COVID-19 vaccines arrived at health systems across the U.S. this month, a significant milestone in the fight against a pandemic that has infected and killed millions of people. But vaccine distribution and administration is a major logistical challenge, not least because both vaccines require cold storage. Nevertheless, Operation Warp Speed, the U.S. public-private COVID-19 treatment partnership, optimistically aims to vaccinate tens of millions of people by year’s end.
Some stakeholders are turning to AI for help, particularly robotic process automation, or RPA — software that emulates the actions of humans interacting with systems. RPA could bolster efficiency in hospitals and supply chains overwhelmed by the challenges of COVID-19 vaccine management.
For example, San Jose-based Automation Anywhere, which last year raised $290 million in venture capital at a $6.8 billion valuation, says it is working with a pharmaceutical company in Europe to implement 25 bots and automate 65 processes. According to Dr. Yan Chow, global health care industry leader at Automation Anywhere, the project lead realized RPA could help researchers and labs run more efficiently to accelerate research and approval of vaccines by augmenting reporting processes.
“Intelligent automation will play a key role in streamlining the complicated logistics required for the first doses of vaccines to reach our most vulnerable populations,” Chow told VentureBeat via email. “As vaccine enrollment opens to the public, bots can automate the enrollment process — auto-populating registration forms to help reduce attrition.”
On the transportation and allotment side of the vaccine distribution challenge, firms including Itelligence Benelux, a SAP partner based in the Netherlands, say RPA could have a significant role to play. At a recent SAP hackathon, Itelligence proposed combining RPA with internet-connected sensor tags that would continuously measure the temperature of vaccine batches and send the data via Bluetooth to a smartphone. A sensor app would receive the data and forward it to a cloud platform, which would add batch and product information, as well as temperature thresholds and relevant delivery details. Another app would then check the incoming sensor values against predefined thresholds, like the temperature limit, and kickstart one or more RPA bots.
In Itelligence’s blueprint, which won the hackathon, a warning limit might trigger text messages and emails to inform truck drivers and transport companies about batch risks. A spoil limit, which would come into play when the vaccine could no longer be saved, might trigger multiple bots for a broader set of actions, like taking the batch out of the cold chain, creating a replacement order, and initiating a secondary order to return the spoiled vaccines to the supplier. RPA would also help create the necessary insurance documents and send them to the relevant carriers.
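The threshold logic in such a blueprint can be sketched in a few lines. This is an illustrative sketch only; the limits, action names, and batch IDs below are invented for illustration, not taken from Itelligence’s or SAP’s actual implementation.

```python
# Illustrative sketch of the blueprint's threshold check. The limits,
# action names, and batch IDs are invented for illustration; they are not
# taken from Itelligence's or SAP's actual implementation.

WARNING_LIMIT_C = -60.0  # breach: warn drivers and the transport company
SPOIL_LIMIT_C = -15.0    # breach: the vaccine can no longer be saved

def check_batch(batch_id: str, temperature_c: float) -> list:
    """Return the bot actions to trigger for one sensor reading."""
    if temperature_c > SPOIL_LIMIT_C:
        return [
            "remove_batch_from_cold_chain",
            "create_replacement_order",
            "create_return_order_to_supplier",
            "file_insurance_documents",
        ]
    if temperature_c > WARNING_LIMIT_C:
        return ["notify_driver_and_carrier"]
    return []  # reading within limits: no bot kicked off

# A reading above the spoil limit triggers the full recovery flow.
print(check_batch("BATCH-042", -10.0))
```

In this framing, each returned action name stands in for one RPA bot the cloud platform would launch.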
“Conventional data loggers — small devices that normally travel with the shipped batches of goods — should ensure safety. However, they have a significant disadvantage: They report problems very late. Namely, when the recipient reads the data after the batch has arrived, i.e. when it is already spoiled,” Danny Groothuis, one of the creators of the blueprint, wrote in a blog post. “As this scenario shows, the smart combination of IoT devices and SAP Intelligent Robotic Process Automation bots can reduce a lot of repetitive and tedious work and provide considerably more security and efficiency to cold chain management.” Once the vaccines arrive at the point of care, stakeholders must contend with another set of challenges: tracking which patients have — and have not — received them. Both the Pfizer and Moderna vaccines must be taken in two doses three to four weeks apart, and health systems are responsible for maintaining their own tracking databases or opting for prebuilt state and federal solutions.
Olive, a Columbus-based health care automation startup, has a framework that’s being adapted for this purpose. Olive told TechRepublic that its tools, which combine computer vision and RPA, have supported COVID-19 testing operations in health systems by automating manual data entry. In the next phase of its work, the company says it will track which frontline workers receive the vaccine and when, something it expects will reduce administration time and help monitor vaccine recipients for side effects while providing data to government regulators.
Automation Anywhere has already deployed an RPA-based vaccine-tracking solution, albeit for a different vaccine. Earlier this month, the company announced a collaboration with Newcastle Upon Tyne Hospitals NHS Foundation Trust, a teaching hospital in the U.K., to launch a flu-reporting bot that tracks vaccinations among its more than 14,000 employees. Automation Anywhere says the bot has captured updates for more than 10,000 staff vaccinations and saved nearly three months of cumulative admin time. If the rollout continues to go smoothly, Newcastle Upon Tyne Hospitals says it might consider extending the technology to support local “test and trace” processes and report on COVID-19 vaccinations.
“Bots will play an important role in ensuring pharmacovigilance. If a person who received the COVID-19 vaccine has an adverse effect, bots can automatically handle complaints associated with the new vaccine and automate the adverse event process,” Automation Anywhere life sciences global lead Catherine Calarco told VentureBeat. “For example, within a month of the vaccine being distributed — someone has issues related to the vaccine and contacts their physician and the pharma company. It’s important to rapidly manage these field events and quickly see any emerging trends. Automation increases accuracy and reduces cycle time for case management, resulting in better patient care.”
"
|
15,688 | 2,021 |
"Top 10 cybersecurity lessons learned one year into the pandemic | VentureBeat"
|
"https://venturebeat.com/2021/03/11/top-10-cybersecurity-lessons-learned-one-year-into-the-pandemic"
|
"Top 10 cybersecurity lessons learned one year into the pandemic
In 2020, chief information security officers (CISOs), chief information officers (CIOs), and their cybersecurity teams faced a digital pandemic of breaches, widespread supply chain attacks, and ingenious uses of human engineering to compromise enterprise systems. Bad actors were quick to capitalize on the chaos the COVID-19 pandemic created in order to compromise as many valuable enterprise systems as possible. The number of breaches soared as attackers targeted the millions of remote workers who didn’t have adequate security protection or sufficient training to be able to spot hacking and phishing attempts.
The findings from PwC’s 2021 Global Digital Trust Insights: Cybersecurity Comes of Age study and the conversations VentureBeat has had with CISOs in the last year tell the same story: Enterprises are most concerned with protecting their cloud infrastructure from endpoint-based attacks.
Enterprises fast-track cybersecurity as a top goal
According to PwC’s 2021 Global Digital Trust Insights report, 96% of business and technology executives prioritized their cybersecurity investments due to COVID-19 and its impact on their organizations this year. The report is based on interviews with 3,249 business and technology executives worldwide, and half of the surveyed executives said cybersecurity and privacy were being included in every business decision and plan. In 2019, that figure was closer to 25%.
While 64% of enterprise executives expect revenues to decline, 55% said their cybersecurity budgets will increase this year. To further accentuate how vital cybersecurity is to enterprises, 51% said they plan to add full-time cybersecurity staff this year.
Above: More executives are increasing their cybersecurity budgets than decreasing them in 2021. (Source: PwC 2021 Global Digital Trust Insights Survey)
Gartner’s 2021 Board of Directors Survey and VentureBeat’s conversations with CISOs, CIOs, and their teams over the past three months also corroborate PwC’s claim that cybersecurity spending is going up and being fast-tracked even in enterprises that expect revenues to decline. Gartner’s survey also had the following to say:
Boards of directors and senior management teams see cyber-risks as the hardest to protect against and the most potentially lethal and damaging to current and future revenue streams.
Boards’ interest in and support of security and risk management strategies is at an all-time high today, with a strong focus on how to reduce the incidence of human-engineered attacks succeeding against their enterprises.
By 2025, 40% of boards of directors will have a dedicated cybersecurity committee overseen by a qualified board member, up from less than 10% today.
By 2024, 60% of CISOs will need to establish critical partnerships with key executives in sales, finance, and marketing, up from less than 20% today as the business case for cybersecurity becomes more integral to the success of an enterprise.
Top cybersecurity lessons learned in 2020
Enterprises had to reinvent themselves in record time to keep running and be digitally adept as offices closed, and stayed closed. As a result, enterprises are now seven years ahead of schedule on their digital transformation initiatives, according to McKinsey’s recent COVID-19 survey. Record ecommerce revenue results for 2020 reflect the success of that effort for many organizations. On the flip side, the fact there were many cybersecurity incidents — many still unsolved — reflects the failures of that effort.
Bad actors’ abilities to home in on the cybersecurity gaps, in both systems and people, proved unerringly accurate in 2020. Of the many lessons learned in 2020, perhaps the most valuable is that the human element must come first. The following are the top 10 lessons learned one year into the pandemic, according to CISOs, CIOs, and their teams:
Real-world supply chains are vulnerable to cyberattacks.
Cybercriminals and advanced persistent threat (APT) groups are masquerading as trusted entities (pharmaceutical companies and health care providers, for example) to obtain privileged access credentials in attacks against the COVID-19 vaccine supply chain, according to the COVID-19 Exploited by Malicious Cyber Actors threat analysis from U.S. Department of Homeland Security’s Cybersecurity & Infrastructure Security Agency (CISA). The attackers rely on techniques such as phishing, malware distribution, impersonating legitimate domain names by using terms related to COVID-19, and attacking remote access and teleworking infrastructure. A global phishing campaign targeted the COVID-19 vaccine cold chain in 2020, according to IBM Security X-Force’s threat intelligence task force tracking COVID-19 vaccine cyber threats. Privileged access management (PAM) is an area that survived IT budget cuts last year, CISOs told VentureBeat. Leaders in this area include BeyondTrust, Centrify, CyberArk, and Thycotic.
Virtual workforces make self-diagnosing and self-remediating endpoints a necessity.
With so much of the workforce operating virtually, endpoint protection is more important than ever. Endpoint protection platforms must be capable of securely configuring, patching, and managing operating systems and applications. That must include updating the security protocols, as well. Leaders in this area include Microsoft, CrowdStrike, Symantec, Trend Micro, and Sophos. In Absolute Software’s approach, the protection is embedded in the BIOS of devices from Dell, HP, Lenovo, and 23 other manufacturers to provide useful asset management data and continuous protection.
Touchless commerce means QR codes are now the fastest growing threat vector.
In 2020, businesses switched to QR codes for touchless transactions, and fraudsters capitalized on that trend.
This shift makes unified endpoint management (UEM), passwordless multifactor authentication (Zero Sign-On), and mobile threat defense (MTD) essential for mobile devices. Fraudsters combined social engineering with easily created QR codes to access and drain victims’ bank accounts, install malware on devices, and penetrate entire corporate networks. Malicious QR codes can be used to open webpages, make a payment, or send messages without the user’s authorization, according to Ivanti’s QR Codes: Consumer Sentiment Survey.
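One simple mitigation an app can apply before acting on a scanned QR code is to validate the decoded URL. A minimal sketch, assuming the code has already been decoded to a URL string; the trusted hosts here are made up, and this is not a feature of any vendor named above.

```python
# Defensive check an app might run on a URL decoded from a QR code before
# opening it: require HTTPS and a known host. The trusted hosts are made up
# for this sketch.
from urllib.parse import urlparse

TRUSTED_HOSTS = {"pay.example-bank.com", "menu.example-restaurant.com"}

def is_safe_qr_url(url: str) -> bool:
    parsed = urlparse(url)
    return parsed.scheme == "https" and parsed.hostname in TRUSTED_HOSTS

print(is_safe_qr_url("https://pay.example-bank.com/invoice/123"))  # True
print(is_safe_qr_url("http://pay.example-bank.com/invoice/123"))   # False: no TLS
print(is_safe_qr_url("https://evil.example.net/phish"))            # False: unknown host
```

An allowlist like this is deliberately strict: it rejects lookalike domains of the kind fraudsters paste over legitimate QR codes.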
Cyberattacks against managed service providers (MSPs) are growing.
MSPs are attractive because once a cybercriminal gains access to the MSP’s internal systems, all the customers are exposed.
In 2020 cybercriminal gangs and state-sponsored hacking groups targeted MSPs with greater intensity than in previous years to gain access to the larger organizations that are their clients. “Threat actors are using hacked MSPs to launch cyberattacks against service provider customers’ point-of-sale (POS) systems and perform business email compromise (BEC) and ransomware attacks,” the United States Secret Service said in the Compromise Managed Service Providers information alert on June 12.
The National Cybersecurity Center for Excellence and the National Institute of Standards and Technology has published recommendations for MSPs on how to defend against and recover from a breach. Recommendations include encrypting all data at-rest or in-transit to prevent data disclosure, both accidental and malicious. Vendors who provide cloud-based key management systems that support multi-cloud configurations include Fortanix, Micro Focus, Sepior, Thales, Townsend Security, and Utimaco.
Attackers can compromise the software supply chain and modify executables.
The SolarWinds breach showed that state-sponsored actors can penetrate the software supply chain and modify the executable files, all the while mimicking protocol traffic to avoid detection. Enterprise software companies, especially those involved in cybersecurity, need to design preventive privileged access controls into their DevOps process and strengthen them with detection-based controls (often included in privileged identity management platforms). SolarWinds taught everyone that having multiple preventive controls as part of a PIM strategy is essential. Key elements include having strong passwords, rotating passwords, adopting federated credentials and multi-factor authentication (MFA), and requiring privileged users to log in as themselves for better auditing and accountability. Leaders in this field, according to The Forrester Wave: Privileged Identity Management (PIM), Q4 2020, include CyberArk, BeyondTrust, Thycotic, and Centrify.
Above: The 10 providers that matter most and how they stack up. Source: The Forrester Wave: Privileged Identity Management (PIM), Q4 2020. Image Credit: Centrify
Social engineering can compromise social media platforms.
Cyberattackers sold 267 million Facebook user profiles in criminal forums for $540.
High-profile Twitter accounts for celebrities and political figures were hijacked to promote a cryptocurrency scam. In the Twitter breach, the bad actors used several techniques to access accounts, including bribing Twitter employees to access privileged account credentials and administrative tools. These incidents highlighted a stark lesson on the value of MFA and PAM, and suggest it’s time for social media platforms to require MFA to create an account. Leading providers of MFA solutions include Microsoft, Duo Security, Okta, Ping Identity, and Symantec.
Use zero trust to manage machine identities.
IT teams rolling out IoT sensors and devices into the production environment need to micro-segment the devices in a manner consistent with the organization’s zero trust framework. Securing these devices by taking a least-privileged-access approach is a must-do to prevent malware-based botnet attacks.
The Mirai botnet was able to grow so large and powerful because so many machines and IoT devices did not follow the zero trust model and were deployed online with default security credentials. Leading zero trust security providers for machine identities, including bots, robots, and IoT, are BeyondTrust, Centrify, CyberArk, and Thycotic. Another to note is HashiCorp, which provides a purpose-built vault that scales to protect machine identities throughout DevOps cycles.
Bad actors turned health care records into best sellers.
From stealing laptops from medical centers to bribing medical staff for administrative logins and passwords, bad actors placed a high priority on stealing and selling protected health information (PHI).
One of the largest laptop-based breaches recently compromised 654,000 patient records after someone stole a laptop from a transportation vendor who works for the Health Share of Oregon. The records contained patient names, contact details, dates of birth, and Medicaid ID numbers. A quick scan of the U.S. Department of Health and Human Services (HHS) Breach Portal shows that the average stolen laptop in the health care industry contained over 69,000 available PHI records.
Cloud security misconfigurations are the leading cause of cloud data breaches.
Misconfigured cloud systems open up opportunities for bad actors to access password storage and password management systems. According to a survey of 300 CISOs, 8 in 10 U.S.-based companies have experienced a data breach due to misconfigured cloud servers and accounts. The top three cloud security threats are configuration errors in production environments, lack of visibility into who has access in production environments, and improperly configured identity access management (IAM) and permissions. What’s needed is continuous assessment and improvement of cloud security configurations throughout the life cycle of applications and platforms. Cloud security posture management (CSPM) platform providers include Alert Logic, CrowdStrike, Palo Alto Networks, Saviynt, Sonrai, and VMWare.
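Conceptually, a CSPM rule engine runs checks like these against every resource configuration, continuously. This toy sketch assumes a made-up resource schema and three rules mirroring the threats above; it is not how any named vendor’s product works.

```python
# Toy version of the kind of rule a CSPM platform evaluates continuously:
# flag public exposure, missing at-rest encryption, and wildcard IAM grants.
# The resource schema and rule names are invented for this sketch.

RULES = {
    "public_access": lambda r: r.get("public_access", False),
    "unencrypted_at_rest": lambda r: not r.get("encryption_at_rest", False),
    "wildcard_iam_principal": lambda r: "*" in r.get("allowed_principals", []),
}

def audit(resources):
    """Map each misconfigured resource name to its list of findings."""
    findings = {}
    for name, cfg in resources.items():
        hits = [rule for rule, check in RULES.items() if check(cfg)]
        if hits:
            findings[name] = hits
    return findings

inventory = {
    "backup-bucket": {"public_access": True, "encryption_at_rest": False},
    "audit-logs": {"encryption_at_rest": True, "allowed_principals": ["ops-team"]},
}
print(audit(inventory))  # only backup-bucket is flagged, with two findings
```

Running such checks on every deployment, rather than once at launch, is what "continuous assessment throughout the life cycle" means in practice.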
Infrastructure monitoring is essential for identifying anomalies.
Breaches happened because administrators either didn’t implement monitoring or did not configure it to find anomalous events. This is one aspect of how the human element was one of the major weak points in cybersecurity last year. Log monitoring systems are proving invaluable in identifying machine endpoint configuration and performance anomalies in real time. AIOps is proving effective in identifying anomalies and performance event correlations on the fly, contributing to greater business continuity. One of the leaders in this area is LogicMonitor, whose AIOps-enabled infrastructure monitoring and observability platform has proven successful in troubleshooting infrastructure problems and ensuring business continuity.
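The simplest form of the anomaly detection described here is a baseline-deviation check: flag a metric sample that strays too far from its recent history. A minimal sketch with invented metric values; real AIOps platforms use far richer models than this.

```python
# Simplest form of the anomaly check a log-monitoring pipeline might run:
# flag a sample that strays more than k standard deviations from its recent
# baseline. The metric values here are invented.
import statistics

def is_anomalous(baseline, sample, k=3.0):
    mean = statistics.fmean(baseline)
    stdev = statistics.pstdev(baseline)
    if stdev == 0:
        return sample != mean  # flat baseline: any change is anomalous
    return abs(sample - mean) > k * stdev

cpu_history = [41.0, 39.5, 40.2, 40.8, 39.9, 40.4]  # steady ~40% CPU
print(is_anomalous(cpu_history, 40.6))  # False: within normal variation
print(is_anomalous(cpu_history, 97.0))  # True: spike gets flagged
```

Even this crude check illustrates the point of the lesson: anomalies can only be caught if monitoring is actually configured to look for them.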
"
|
15,689 | 2,021 |
"Is Boston Dynamics becoming a boring robotics company? | VentureBeat"
|
"https://venturebeat.com/2021/04/18/is-boston-dynamics-becoming-boring-robotics-company"
|
"Is Boston Dynamics becoming a boring robotics company?
Boston Dynamics has made a name for itself through fascinating videos of biped and quadruped robots doing backflips, opening doors, and dancing to “Uptown Funk.” Now, it has revealed its latest gadget: a robot that looks like a huge overhead projector on wheels.
It’s called Stretch, and it doesn’t do backflips, it doesn’t dance, and it’s made to do one task: moving boxes. It sounds pretty boring.
But this could, in fact, become Boston Dynamics’ most successful commercial product and turn it into a profitable company.
What does Stretch do?
Stretch has a box-like base with a set of wheels that can move in all directions. On top of the base are a large robotic arm and a perception mast. The robotic arm has seven degrees of freedom and a suction pad array that can grab and lift boxes. The perception mast uses computer vision–powered cameras and sensors to analyze its surroundings.
While we have yet to see Stretch in action, according to information Boston Dynamics provided to the media, it can handle boxes weighing up to 23 kilograms and make 800 displacements per hour, and its battery can last eight hours. The video Boston Dynamics posted on its YouTube channel suggests the robot can reach the 800-cases-per-hour speed if everything remains static in its environment.
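Taking those vendor figures at face value (they are Boston Dynamics' claims under ideal conditions, not measured performance), a quick back-of-the-envelope calculation shows the implied per-shift workload:

```python
# Rough estimate from Boston Dynamics' published Stretch specs:
# 800 cases/hour, boxes up to 23 kg, 8-hour battery. These are the
# vendor's ideal-condition claims, not benchmarked numbers.
cases_per_hour = 800
max_box_kg = 23
battery_hours = 8

cases_per_shift = cases_per_hour * battery_hours   # boxes per battery charge
max_mass_moved_kg = cases_per_shift * max_box_kg   # upper bound on mass moved

print(cases_per_shift)            # 6400
print(max_mass_moved_kg / 1000)   # 147.2 (tonnes, upper bound)
```

Even at a fraction of that rate, a single robot would handle a workload that currently occupies several human workers per shift.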
Traditional industrial robots must be installed in a fixed location, which puts severe limits on the workflows and infrastructure of the warehouses where they are deployed. Stretch, on the other hand, is mobile and can be used in many different settings with few prerequisites beyond flat ground and a bit of training (we still don’t know how the training works). This could be a boon for the many warehouses that don’t have automation equipment and infrastructure.
As Boston Dynamics’ VP of business development Michael Perry told The Verge , “You can take this capability and you can move it into the back of the truck, you can move it into aisles, you can move it next to your conveyors. It all depends what the problem of the day is.” A boring but useful robot At first glance, Stretch seems like a step back from the previous robots Boston Dynamics has created. It can’t navigate uneven terrain, climb stairs, jump on surfaces, open doors, or handle objects in complicated ways.
It did manage some amusing feats in its intro video, but we can’t expect it to be as entertaining as Spot, Atlas, and Handle.
But that’s exactly what real-world applications of robotics and artificial intelligence are all about. We still haven’t figured out how to create artificial general intelligence , the kind of AI that can mimic all aspects of the cognitive and physical abilities of humans and animals.
Current AI systems are robust when performing narrow tasks in stable environments but start to break when they’re forced to tackle varied problems in unpredictable settings. The key to a successful AI system, therefore, is finding the right balance between versatility and robustness, especially in physical settings where safety and material damage are major concerns.
And Stretch fits that description exactly. It does a very specific task (picking up and displacing boxes) in a predictable environment (flat surfaces in warehouses).
Stretch might sound boring in comparison to the other things Boston Dynamics has done in the past. But if it lives up to its promise, it could directly reduce costs and improve productivity for many warehouses, which makes it a viable business model and product.
As Perry told The Verge last June , “[A] lot of the most interesting stuff from a business perspective are things that people would find boring, like enabling the robot to read analogue gauges in an industrial facility. That’s not something that will set the internet on fire, but it’s transformative for a lot of businesses.” The competitive edge of Boston Dynamics Boston Dynamics is not alone in working on autonomous mobile robots for warehouses and other industrial settings. There are dozens of companies competing in the field, ranging from longstanding companies such as Honeywell to startups such as Fetch Robotics.
And unloading boxes is just one of the several physical tasks that are ripe for automation. There’s also a growing market for sorting robots, order-picking robots, and autonomous forklifts.
What would make Boston Dynamics a successful contender in this competitive market? The way I see it, success in the industrial autonomous mobile robot market will be defined by the versatility/robustness threshold on the one hand and cost efficiency on the other. In this respect, Boston Dynamics has two factors working to its advantage.
First, Boston Dynamics can leverage its decades of experience to push the versatility of its robots without sacrificing their robustness and safety. Stretch has inherited technology and experience from Handle, Atlas, Spot, and other robots Boston Dynamics has developed. It also contains elements of Pick, a computer vision–based depalletizing solution mentioned in the press release that announced Hyundai’s acquisition of Boston Dynamics.
This can enable Stretch to work in a broader set of conditions than its competitors.
Second, the company’s new owner, Hyundai , is one of the leading companies in mobile robot research and development. Hyundai has already done extensive research on autonomous robots and vehicles that can navigate various environments and terrains. Hyundai also has vast manufacturing capacity, which will enable Boston Dynamics to reduce Stretch’s manufacturing costs and sell it at a competitive price. Hyundai’s manufacturing facilities will also enable Boston Dynamics to deliver new parts and attachments for Stretch cost-efficiently. This will further improve the robot’s versatility in the future and allow customers to repurpose it for new tasks without making large purchases.
The future of Boston Dynamics Stretch is the second commercial product of Boston Dynamics, the first one being the quadruped robot Spot.
But Spot’s sales were only covering a fraction of the company’s costs, which were at least $150 million per year when Hyundai acquired it. Stretch has a greater potential for making Boston Dynamics a profitable company.
How will the potential success of Stretch affect the future of Boston Dynamics? Here’s an observation I made last year after Hyundai acquired Boston Dynamics: “Boston Dynamics might claim to be a commercial company. But at heart, it is still an AI and robotics research lab. It has built its fame on its advanced research and a continuous stream of videos showing robots doing things that were previously thought impossible. The reality, however, is that real-world applications seldom use cutting-edge AI and robotics technology. Today’s businesses don’t have much use for dancing and backflipping robots.
What they need are stable solutions that can integrate with their current software and hardware ecosystem, boost their operations, and cut costs.” How will Stretch’s success affect Boston Dynamics’ plans for humanlike robots? It’s hard to remain committed to long-term scientific goals when you’re owned by a commercial enterprise that counts profits by the quarter.
But it’s not impossible. In the early 1900s, Albert Einstein worked as an assistant examiner at the Swiss patent office in Bern because physics research didn’t put food on his family’s table. But he remained a physicist at heart and continued his research in his idle time while his job as patent clerk paid the bills. His passion eventually paid off, earning him a Nobel prize and resulting in some of the greatest contributions to science in history.
Will Stretch and its successors become the norm for Boston Dynamics, or is this the patent-clerk job that keeps the lights on while Boston Dynamics continues to chase the dream of humanoid robots that push the limits of science? This story originally appeared on Bdtechtalks.com.
Copyright 2021 VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact.
Discover our Briefings.
The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat.
All rights reserved.
"
|
15,690 | 2,021 |
"OneStream: Data analysis, AI tools usage increased in 2021 | VentureBeat"
|
"https://venturebeat.com/2021/05/11/onestream-data-analysis-ai-tools-usage-increased-in-2021"
|
"OneStream: Data analysis, AI tools usage increased in 2021
CFOs and other finance executives are optimistic that economic recovery is on the horizon: nearly three-quarters (73 percent) expect that they will return to normal growth by the end of 2021, according to the latest Enterprise Financial Decision-Making Report from OneStream , a provider of corporate performance management solutions for mid-sized and large enterprises. Companies have significantly increased their data analysis tool investments and usage over the past year, the report found.
Above: Over half of companies are using data analysis tools more than before the pandemic.
OneStream’s study targeted finance leaders across North America and identified the factors driving their priorities, budgets and technology adoption plans for 2021. The survey found that the COVID-19 pandemic created a heightened need for agile forecasting, predictive planning and digital transformation.
The ability to quickly reforecast budgets and shift workflows has become essential.
The 2021 report found that finance executives have significantly increased their data analysis tool investments and usage. Companies commonly invested in artificial intelligence (59 percent) and increased their use of cloud-based planning and reporting solutions (65 percent). Most companies already use (69 percent) or plan to use (18 percent) low-code development platforms, which enable business users and citizen developers to take on new roles while circumventing complicated coding requirements. For return-to-office budgets, data privacy tools are the most common priority (18 percent), followed by hybrid cloud technologies.
Compare the results with OneStream’s 2020 Enterprise Financial Decision-Making Report where less than half (46 percent) of the finance executives reported using cloud-based solutions regularly, while less than a quarter used machine learning (21 percent) and artificial intelligence (20 percent) solutions.
Many finance executives are evaluating their workforce, technology, and supply chain needs for a post-pandemic reality. However, the political and social landscape has also heavily impacted investment decisions, leading executives to prioritize sustainability and diversity initiatives as well.
The commissioned study, conducted by Hanover Research in April of 2021, sourced insights from 340 finance decision makers in the United States, Canada, and Mexico. All individuals hold management positions (C-level executive (CFO), VP, director, controller) in finance. Respondents work at companies across numerous industries and varying revenues, with 24 percent employed by companies with over $1 billion in annual revenue.
Read the full OneStream report Enterprise Financial Decision-Making Report 2021 — North America.
"
|
15,691 | 2,021 |
"Roambee adds AI analytics to supply chain tracking with Arnekt deal | VentureBeat"
|
"https://venturebeat.com/2021/05/20/roambee-adds-ai-analytics-to-supply-chain-tracking-with-arnekt-deal"
|
"Roambee adds AI analytics to supply chain tracking with Arnekt deal
There’s no such thing as having too much data when it comes to logistics and supply chain tracking. That’s why supply chain visibility company Roambee has been adding ways to give enterprises greater visibility into how shipments move around the world.
Earlier this month, Roambee announced the acquisition of cloud analytics firm Arnekt. The deal would supplement Roambee’s location-aware, package-tracking Honeycomb platform with Arnekt’s natural language processing (NLP), data science, and AI capabilities.
“This addition will enable us to keep improving our customers’ bottom line when it comes to their deliveries,” Roambee CEO Sanjay Sharma told VentureBeat. “For example, a lot of retail companies run on tiny margins, and those can be negatively impacted with bad supply chain performance.” “They can only collect money if they invoice on time, but they need proof of delivery to do that. And proof of delivery is a broken process right now. The good news is that the friction in delivery systems can be eliminated via data collected by sensors, which can determine when boxes are delivered and in what condition, triggering an invoicing process automatically and improving cash cycles for companies,” Sharma said.
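The automated invoicing workflow Sharma describes can be sketched as a simple event handler: a sensor-reported delivery event, once its condition checks pass, triggers billing instead of waiting for a manual proof of delivery. All names, fields, and thresholds below are hypothetical; Roambee has not published its actual API.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical sketch of sensor-triggered invoicing. A delivery event
# carries condition readings from the tracked package; passing the checks
# starts the invoicing process automatically.

@dataclass
class DeliveryEvent:
    shipment_id: str
    delivered_at: datetime
    temperature_c: float   # condition reading at handoff
    intact: bool           # e.g. derived from shock/tamper sensors

def handle_delivery(event: DeliveryEvent, max_temp_c: float = 8.0) -> str:
    """Return the next workflow step for a shipment based on sensor data."""
    if not event.intact:
        return f"open damage claim for {event.shipment_id}"
    if event.temperature_c > max_temp_c:
        return f"flag {event.shipment_id} for condition review"
    # Condition checks passed: invoice immediately on delivery.
    return f"issue invoice for {event.shipment_id}"

event = DeliveryEvent("SHIP-42", datetime(2021, 5, 20, 14, 3), 5.5, True)
print(handle_delivery(event))  # issue invoice for SHIP-42
```

The point of the design is the trigger: invoicing becomes a side effect of verified delivery data rather than a separate, manually confirmed process, which is what shortens the cash cycle Sharma mentions.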
Roambee did not disclose financial details of the acquisition.
Arnekt adds machine learning to Honeycomb Roambee uses firsthand sensor data generated from a portfolio of purpose-built sensors, like its BeeSense and BeeTrac devices, combining that information with data from other available third-party data streams and devices to paint a comprehensive picture of where customer assets are, the condition they’re in, how secure they are, and more. Via Honeycomb, the company provides customers with visibility into their shipments, inventory, assets in the field , and returnable items.
With Arnekt, Roambee gains AI- and machine learning-infused frameworks that are used by such high-profile customers as Honda, Epson, and HPE.
“This acquisition enables us to scale and grow our offerings, such as the Swarm AI network analytics platform, to help customers optimize their supply chain operations,” Sharma said.
“So think of really big companies that ship huge amounts of goods around the world. Arnekt brings the capability of hyper-scaling process efficiencies based on the structured and unstructured data going into our platform.” Sharma added that people are hugely important to the process.
“Another segment we serve is people in the field — warehouse managers, manufacturing managers, and so on. Arnekt’s natural language processing is important to help us translate the data intelligence we gather into language the field force can understand and then use to issue marching orders to their teams.” Supply chain tracking — hot and cold Roambee’s focus is on helping reduce shipping losses and improving time to invoice. The company said its tracking capabilities extend far beyond the needs of individual enterprise customers, as it’s also playing a role in the global COVID-19 vaccine distribution effort.
The logistical intelligence Roambee gathers doesn’t just provide the location of assets across various transport modes — Honeycomb-based services include spoilage monitoring for food shipments, damage monitoring, trailer and container security monitoring, and pharmaceutical cold chain monitoring.
Roambee provides pharmaceutical cold chain monitoring to “one of the largest U.S.-headquartered, global COVID-19 vaccine makers,” the company said.
Vaccine shipments using dry ice need to be completed within two days. That puts pressure on the supply chain, Roambee marketing VP Scott Hurley told BioPharm International recently.
“Without effective real-time monitoring, that is all but impossible,” he said.
"
|
15,692 | 2,019 |
"All the privacy issues Apple didn’t talk about at its annual event, and why they matter | VentureBeat"
|
"https://venturebeat.com/2019/09/17/all-the-privacy-issues-apple-didnt-talk-about-at-its-annual-event-and-why-they-matter"
|
"Guest All the privacy issues Apple didn’t talk about at its annual event, and why they matter
A watch that actually tells time, all the time.
Portrait mode for pets.
Slofies.
Apple’s annual shiny-new-toys event last week was packed with glitzy news from Apple as usual. These features are fun, but something’s been troubling me for the last few days since those announcements: something that was almost completely ignored for the entire 90 minutes.
Apple, which regularly touts its user privacy and security credentials, was surprisingly silent about this topic during its event. The only time privacy came up was in relation to using data from Apple Watch to support health research. That seems incongruous for a company that ran a giant billboard ad proclaiming, “ What happens on your iPhone, stays on your iPhone ,” particularly given that consumers value personal data protection more than ever amidst the seemingly never-ending stream of leaks, hacks, and missteps.
Here are some important things I wish Tim Cook and his team had said: “The new U1 chip in the new iPhone offers granular location tracking, and here’s how we’ll prevent that from being misused.” Has Apple learned any lessons since it last introduced a micro-location technology, iBeacon ? It was a disaster for privacy, quickly adopted by companies wanting to track our every move, potentially down to the inch. Apple’s own website explains to developers how they can use beacons to target consumers in stores based on which section of the store they’re in.
So what are we to think of the new U1 chip , which Apple describes as “more precise” ? Or the new ability it enables for others to identify your device’s name (which is often set to your actual name) simply by pointing their U1-equipped device at you? Will Apple, retailers, and advertisers be able to tap into these new trackers to collect more data from you, more quickly, and more accurately? Greater location tracking precision means greater privacy and security risks, and it’s a shame Apple didn’t proactively address them.
“We are going to have a lot of data about how you consume content, and here’s exactly what we are going to do with it.” Apple used to make money primarily by selling hardware to consumers, but product sales have been dipping. Meanwhile, its services (apps, music, videos, photos, etc.) are an increasingly lucrative business and now account for one-third of the company’s gross profits. To keep consumers hooked on its newest services like Apple Arcade and Apple TV+ , the company will no doubt want to analyze our behavioral data. To demonstrate it truly cares about consumer privacy, it should have been proactive in answering key questions: What sort of data will it be collecting? How exactly will that happen? Who will have access to this data? Do consumers have the ability to opt-out? “We will treat security vulnerabilities with the gravity they deserve and be honest with our users about the risks they face.” Google security researchers recently revealed that vulnerabilities in iOS were exploited to hack thousands of iPhones in an attack understood to be targeted at the persecuted Uyghur minority. Security researchers responded with shock at the chillingly broad scale of the attack, which went undetected for years. Apple responded by downplaying the impact and criticizing how Google described it.
These were exploits that enabled deep system access to hackers, providing access to nearly all personal information on the compromised iPhone. And since that includes authentication tokens, it could also unlock access to the user’s accounts independent of the compromised device, indefinitely. Simply put, this was the most consequential iPhone hack ever. So for Apple’s executives to stand on stage just days after the disclosure and wax lyrical about new phones without a peep about how the company will improve its security posture felt like a dereliction of duty.
Apple’s announcement wasn’t just a bid for our wallets but also for our data: Use an iPhone to capture all your personal moments, make an iPad your primary computing device, watch all your entertainment through Apple TV and Apple TV+, and let Apple Watch track and analyze your health stats. So its silence on privacy and security during its most high-profile event was glaring.
If the company wants consumers to trust it with some of their most sensitive data, it needs to earn that trust through words and actions and live up to its stated belief that privacy is a fundamental human right.
Harold Li is vice president of ExpressVPN.
"
|
15,693 | 2,020 |
"Why Apple’s anti-tracking move hurts everyone … but Apple | VentureBeat"
|
"https://venturebeat.com/2020/09/12/why-apples-anti-tracking-move-hurts-everyone-but-apple"
|
"Guest Why Apple’s anti-tracking move hurts everyone … but Apple
Apple store seen in Hong Kong. (Photo by Budrul Chukrut/SOPA Images/LightRocket via Getty Images)
Apple recently announced a new privacy feature that will ask iPhone and iPad users to opt in or opt out of tracking for in-app advertising. While most applaud Apple for its pro-privacy stance, there’s much more to the story. As I’ll explain below, Apple’s move will hurt publishers and consumers for its own financial gain. The truth is that Apple’s virtue-signaling is masking anti-competitive behavior that needs to be called out.
The first domino Apple announced in June that iOS 14 (due later this month) would prompt users to opt in or out of tracking by advertisers in third-party apps on their iPhones and iPads. It’s not hard to see why most expect users to opt out en masse. How ominous is this warning? When a user selects “Ask App Not to Track,” it disables an anonymous identifier known as the ID for Advertisers (IDFA). Once the IDFA is disabled, app developers and publishers can no longer make that identifier available to advertisers seeking to deliver relevant ads to users. While it seems rather innocuous, it will set off a chain of events that will end badly for everyone but Apple and Google.
Harm to publishers and developers Articles have covered how this will hurt advertisers.
While few will take pity on advertisers, what about your favorite news, weather, music, fitness, gaming, or meditation app? Disabling the IDFA will devastate ad-supported apps because it’s the IDFA that makes their media valuable to advertisers. If you’re a luxury apparel brand for women, you’re targeting a very narrow set of users, and you’re willing to pay more to reach them. In this example, apps that serve ads to affluent females (anonymously identified by their IDFA) can charge a 2-3x premium for that ad. Without IDFAs to target ads to relevant audiences, prices will plummet by 50-70% , making ad-supported models untenable.
Of the 2.2 million apps in the Apple store, many will fail as ad revenue nosedives. Apps that are able to migrate to subscription models will pay a high price. Aside from the costly development work and the inevitable loss of users, publishers will have to pay Apple a 30% tax on new subscription revenue. This is where Apple crosses the line into monopolistic behavior – more on that below.
Harm to consumers When ad-supported content is no longer viable, consumers will have to pay for content. While very few say they like ads, most realize we need them. A recent NAI study found that 75% of consumers are aware that free content is enabled by advertising. Moreover, 64% of consumers believe online content should be free. So we expect free ad-supported content, but we don’t want to share the data that makes ad models work? Actually, the problem isn’t advertisers. The NAI study also found that the #1 privacy concern is data collection by hackers, not publishers. Guess who else knows this and stands to benefit from the death of free content? Well … you know the answer.
So to recap: Apple knows that disabling IDFAs will kill ad models and force publishers to migrate to subscriptions for which Apple will collect 30%. Apple also knows this will require us to pay for content (such as Apple News+ at $9.99/month) that we fundamentally expect for free. Are you getting the picture yet? When Steve Jobs introduced the iPhone in 2007, he proclaimed that this was “the Internet in your pocket,” and a transformational step for the growth of online publishing. I don’t recall his desiring to be a 21st century railroad baron. The decapitation of the IDFA threatens the viability of the open mobile web while imperiling the very ecosystem that made Apple’s devices so magical in the first place. Is this really the future Jobs envisioned in 2007? I’m a longtime fan, customer, and shareholder of Apple. I admire Tim Cook and the company’s ethos. But we have to call this out for what it is. If Apple simply wants to offer more privacy protection, there are less subversive ways to do it. Intentional or not, Apple is using privacy as an excuse to stifle competition and harm consumers for its own benefit.
What else can we expect? For starters, Google is likely to follow suit — quickly. Once Apple has transitioned apps to fee-generating subscription models, Google will be right behind it, with 2.8 million apps in the Google Play store.
We can also expect more lawsuits. Following Epic Games’ lawsuit against Apple, Google, and Samsung, more and more apps will file suit for anti-competitive practices.
Lastly, the government will have to intervene. While the FTC and DOJ have been very accommodating to date, the domino effect will require a federal response. For reference, the FTC prohibits conduct by a single firm that unreasonably restrains competition by creating or maintaining monopoly power. Specific examples might include: Exclusive supply or purchase agreements: iPhone/iPad app distribution and/or downloading can only occur via the App store. Violate terms and find yourself persona non grata across 800 million devices.
Tying arrangements: Prohibiting mobile commerce outside the App ecosystem in which publishers must share revenue with Apple.
Lack of alternatives: If you want to reach iPhones, you must pass through the App store. It’s the same for consumers seeking apps.
Predatory pricing: Is 30% reasonable? Go ask Fortnite.
Stay of execution Last week, Apple announced it will postpone implementation of the anti-tracking feature until early 2021. It cited the fact that ad-supported developers and publishers were not yet prepared (quite the understatement). While this stay is helpful, it only delays the inevitable — unless we act.
Call to action
Now more than ever, the advertising, publishing, and developer communities must start working together on two critical fronts:
1. Communicate the exchange of value when we consume free ad-supported content.
Each time I visit a page or open an app, I should be informed that anonymous tracking enables the publisher to generate ad revenue to provide free content. By allowing the site to track me, I’m supporting their business. If I choose not to opt in for tracking by (pick a date), I will have to subscribe to view the content. I should then be directed to a page that clearly and succinctly states what data is being tracked (e.g. anonymous or personal) and how it will and will not be used. Explain the tradeoffs and allow me to make an informed decision. It’s common sense.
2. Lobby Apple to change the language in its opt-in/opt-out prompt.
As currently written, users are going to opt out of tracking in droves. But what if it read something like this, instead? This app relies on an anonymous identifier to provide relevant advertising that supports free content. You may opt out now or in the future. Learn more: visit PalAbout/Privacy.
I believe Apple should prioritize helping users make informed decisions rather than scaring them down the path to paid content.
I’m hopeful that Apple will act in good faith and work with the industry to balance privacy with the interests of consumers, publishers, and advertisers. But it will not happen on its own. As they say, speak now or forever hold your peace. Because once this occurs, the damage will be hard to undo.
Steve Latham is Global Head of Analytics at Flashtalking , a global data and advertising firm.
VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact.
Discover our Briefings.
The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat.
All rights reserved.
"
|
15,694 | 2,020 |
"Apple’s 'privacy' changes undermine international advertising best practices | VentureBeat"
|
"https://venturebeat.com/2020/09/29/apples-privacy-changes-undermine-international-advertising-best-practices"
|
"Guest: Apple’s ‘privacy’ changes undermine international advertising best practices
[Image: Apple store seen in Hong Kong. (Photo by Budrul Chukrut/SOPA Images/LightRocket via Getty Images)]
Apple’s decision to require users to opt in to its IDFA tracking has understandably disrupted the ad tech ecosystem.
Its new measures, albeit now delayed until “early” 2021, ignore current initiatives from the IAB around transparency and privacy-first best practices.
The IAB-led initiatives I’m referring to include ads.txt, app-ads.txt, sellers.json, the GDPR Transparency and Consent Framework (TCF), and the Open Measurement SDK. Each of these solutions was created to help standardize marketing practices around the world. And in doing so, the IAB managed to help simplify digital advertising processes while making them more open and transparent to all parties.
Privacy first, transparency second
The new approach Apple plans to roll out as part of iOS 14 will fragment those worldwide practices and vastly reduce transparency. IAB Europe recently urged Apple to consider adhering to its TCF standards in order to promote interoperability as opposed to shutting vendors out. The TCF was designed to ensure compliance with European privacy laws when processing personal data or accessing and/or storing information on a user’s device. Unfortunately, Apple took a different approach on privacy with its decision to essentially deprecate the IDFA.
In its July statement, the IAB Tech Lab explained that Apple’s plans regarding iOS 14 conflict with the TCF standards. For example, on Apple devices, users can opt-in or opt-out of services such as geolocation data on the operating system (OS) level. But if the user chooses to do so, app publishers will not be notified. As a result, apps would still be showing an opt-in request pop-up and annoying the user, while being unable to signal the user’s choice to its vendors.
On the other hand, if a user of an app that meets TCF standards does not opt in to ad tracking, the publisher will not be able to synchronize the user’s choice with Apple’s OS. As a result, iOS cannot register the user’s choice at the system level.
Both scenarios regarding iOS 14 hinder user-centric transparency. The ideal solution would be for Apple to join global privacy standards, like the IAB’s, rather than develop its own proprietary methods. But let’s dig deeper.
Comparing App-ads.txt and SKAdNetwork/Info.plist
App-ads.txt is just one of several measures the IAB has taken to reduce ad fraud in the industry and promote more transparency — but it’s also the most applicable when comparing its goals to those of Apple’s IDFA update (i.e., SKAdNetwork). With app-ads.txt, publishers maintain a text file on their developer URL, which lists all authorized vendors of their inventory. This information is readily available to anyone who wants to access it. And in doing so, brands and agencies can ensure that their marketing dollars only go to authorized and reputable vendors.
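To make the file's structure concrete, here is a minimal sketch of parsing app-ads.txt records with the four fields the article lists (domain, seller account ID, relationship type, certification authority ID). The sample entries are hypothetical, not real sellers.

```python
# Minimal sketch: parse app-ads.txt records. Field layout follows the
# IAB spec: domain, seller account ID, DIRECT or RESELLER relationship,
# optional certification authority ID. Comment lines start with "#".

def parse_app_ads_txt(text):
    """Return a list of (domain, account_id, relationship, cert_id) tuples."""
    records = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and blanks
        if not line:
            continue
        fields = [f.strip() for f in line.split(",")]
        if len(fields) < 3 or fields[2].upper() not in ("DIRECT", "RESELLER"):
            continue  # skip malformed entries
        domain, account_id, relationship = fields[0], fields[1], fields[2].upper()
        cert_id = fields[3] if len(fields) > 3 else None
        records.append((domain, account_id, relationship, cert_id))
    return records

sample = """
# app-ads.txt hosted on the publisher's developer URL (hypothetical)
exampleexchange.com, 1234, DIRECT, d4c29acad76ce94f
examplessp.com, 5678, RESELLER
"""
print(parse_app_ads_txt(sample))
```

Because the file is plain text at a well-known URL, any buyer or verification vendor can crawl and parse it this way — which is exactly the open, crawlable property the article contrasts with Apple's approach.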
Apple’s SKAdNetwork, on the other hand, requires publishers to enter the registered IDs of each of their vendors (i.e., ad networks) into the Info.plist file within the app’s configuration data in the App Store. Now you might be thinking, so where is the lack of transparency from Apple? Well, the problem is that only Apple is able to view the ad network partners listed.
The two concepts, app-ads.txt and Info.plist, share similar features, but when it comes to real transparency they are far from the same. Here’s a more detailed breakdown:
App-ads.txt
- Initiated by: IAB Tech Lab
- Launched: March 13, 2019, for mobile and OTT, with wide adoption across the industry
- Purpose: To enable buyers to distribute programmatic spend only through channels that are explicitly trusted and authorized by the originating publisher, and to combat ad fraud through illegitimate inventory arbitrage
- Implementation for publishers: Publish authorized sellers and monetization platforms in plain text on the developer URL, including the domain name of the advertiser, the seller account ID, the type of relationship, and the certification authority ID
- Strengths: A unified/standardized open solution that can be crawled by all tech vendors. The IAB Tech Lab offers access to an aggregation of ads.txt files published around the internet (for IAB members only). According to Pixalate, apps that lack an app-ads.txt file have 13% more invalid traffic versus apps that have one in place.
- Weaknesses: Apple and Roku do not support the IAB standard for crawling for app-ads.txt yet. For Apple, authorized sellers need to be identified via a public search API (BidSwitch, 42matters, and Apptopia offer public mapping of developer URLs as well). Implementing app-ads.txt is not mandatory, so adoption was slow (at least at the beginning).
SKAdNetwork and Info.plist
- Initiated by: Apple
- Launched: March 29, 2018 (iOS 11.3 update), but only a handful of industry insiders even paid attention
- Purpose: Info.plist is part of SKAdNetwork. It is a property list file in a publisher’s app that contains the app’s configuration data in the App Store.
- Implementation for publishers: Update Info.plist and include a purpose string in the system prompt NSUserTrackingUsageDescription with a custom message describing why you’d like to track a user. Add the authorized ad network ID to the Info.plist by updating the SKAdNetworkItems key with the additional dictionary. You can find an open list of SKAdNetwork IDs here.
- Strengths: Fewer insights into the data stream from third-party vendors means less likelihood that users can be triangulated and individually identified. Apple confirms the authenticity of all apps, so the likelihood of ad fraud is reduced.
- Weaknesses: Only allows five parameters to be transmitted. Data will be exchanged between Apple and ad networks directly — neither the app nor any other third party will be able to collect, verify, or act on the data.
Even though the implementation of these two solutions is relatively simple, the implications are very different. It seems like Apple is implementing new measures in the name of privacy while simultaneously building new walls around its user data. This, in turn, is undermining a mobile advertising ecosystem that is trying to keep apps free for end users. Meanwhile, the IAB has been working with partners across the industry to champion solutions for greater transparency.
To be more straightforward, Apple’s dramatic changes in the name of privacy conflict with more sensible transparency moves already underway by the IAB. While Apple first introduced SKAdNetwork and Info.plist in March 2018, only a handful of industry insiders even batted an eye at the time. But now with the future of the IDFA in limbo, Apple’s SKAdNetwork and Info.plist may very well be the future.
While everyone agrees that privacy-first approaches represent the next phase of digital advertising, there are many paths to achieving this goal for users. It’s time for all parties to take the extra time Apple has granted us in order to come together with user experience in mind. Let’s resolve the conflicts and start building an open, transparent, and privacy-centric future within the digital advertising ecosystem.
Ionut Ciobotaru is Chief Product Officer at Verve Group.
He founded mobile monetization platform PubNative and has 15+ years of experience in the ad tech industry. He previously held leading roles at Applift, Weebo, and EA.
"
|
15,695 | 2,021 |
"3 tips for enterprises as Apple's iOS14 privacy features roll out | VentureBeat"
|
"https://venturebeat.com/2021/01/30/3-tips-for-enterprises-as-apples-ios14-privacy-features-roll-out"
|
"Guest: 3 tips for enterprises as Apple’s iOS14 privacy features roll out
[Image: The iOS 14 logo of the iOS mobile operating system displayed on a mobile phone with an Apple logo in the background. (Photo Illustration by Pavlo Gonchar/SOPA Images/LightRocket via Getty Images)]
Enterprises and app developers, brace yourselves — the iOS 14 upgrade will soon roll out a new data consent window that will appear in all apps that collect and share data with outside parties for advertising purposes. The rollout will have a widespread impact on businesses and will affect the number of iOS devices available for personalized advertising.
Many consumers will view the new consent features as a positive step forward to better privacy protection, which it is.
For developers and enterprises, each consumer’s decision to consent to or refuse “Tracking” will shape the business models of the App Store economy and the wider internet for years to come.
The new consent screens give consumers more control in shaping the Future of the Internet, which will ultimately be a net positive. Clarity, transparency, and consumer control are good for iPhone users — and the internet at large. But there are still steps that developers — and enterprises — can take to ensure that they not only comply with Apple’s new rules, but find success in the next era of the privacy-first internet.
Here are three strategic recommendations that can help developers and enterprises adapt to the new privacy normal:
1. Let your users know WHY you need their data — and what benefit they derive from opting into data sharing
While the language included in Apple’s mandatory AppTrackingTransparency (ATT) notification cannot be changed, developers can add a message that appears ahead of the ATT consent. This message can include any language the developer chooses (so long as it is accurate and not misleading) and should be utilized as a way to build trust with the user. After all, if the user trusts an app, they’ll be more likely to consent to data-sharing.
When possible, use plain, concise language that will clearly articulate what kind of data is being collected, what it is being used for, and (most importantly) the value exchange – why the user benefits from sharing that data. Perhaps certain app functions are improved by data sharing, or the app is funded through data-sharing, and users would need to pay for downloads if the app can no longer collect data. Regardless of the reason, this primer message is the best opportunity to make your case to your user.
To see if different language affected opt-in rates, Foursquare tested out several versions of our primer messages on our own app users. While it’s still early days, our results showed that a straightforward explanation of the value exchange (“Support City Guide. Your data allows us to provide this app for free to you.”) yielded the highest number of opt-ins. We shouldn’t be surprised that consumers respect when businesses are transparent with them.
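The kind of primer-message experiment described here boils down to comparing opt-in rates across variants. A hypothetical sketch of that readout follows; the variant names and counts are made up for illustration, not Foursquare's actual data.

```python
# Hypothetical A/B readout: opt-in rate per primer-message variant.
# All counts are illustrative, not real experiment data.
variants = {
    "value_exchange": {"shown": 1000, "opted_in": 412},
    "generic_privacy": {"shown": 1000, "opted_in": 287},
    "no_primer": {"shown": 1000, "opted_in": 190},
}

# Opt-in rate = opted_in / shown for each variant
rates = {name: v["opted_in"] / v["shown"] for name, v in variants.items()}
best = max(rates, key=rates.get)

for name, rate in sorted(rates.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {rate:.1%}")
print("winner:", best)
```

With real traffic you would also want a significance test before declaring a winner, but even this simple rate comparison is enough to see whether a value-exchange message outperforms a generic one.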
2. Shift to an ID-agnostic strategy
As mobile advertising IDs (MAIDs — also known as IDFAs) are phased out, enterprises and developers need to embrace a pluralistic future and an interim period of complexity around identity. The Future of the Internet will involve multiple types of identifiers, and it will take time for each company to find the solution that works best for both the business and its users. During this period, developers must be nimble and willing to keep an ID-agnostic approach until they’ve experimented with several different forms of ID, and until we see how the whole market shakes out.
For many, email addresses will emerge as the best form of identity because user consent is clearly established. When users willingly provide their emails while downloading an app or setting up a profile, they authenticate the relationship between themselves and the service. There are other industry solutions being rolled out to further protect consumer privacy that have emails as their foundation, so establishing a logged-in user base today may allow you to leverage those solutions as they gain prominence and adoption.
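In practice, email-based identity solutions typically turn a consented email into a pseudonymous identifier. A hedged sketch of that idea, assuming simple lowercase-and-trim normalization (illustrative, not any specific vendor's spec):

```python
# Sketch: derive a pseudonymous ID from a consented email address.
# Normalization rules (strip whitespace, lowercase) are illustrative
# assumptions; real ID frameworks define their own normalization.
import hashlib

def email_to_id(email: str) -> str:
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# The same user yields the same ID regardless of capitalization/spacing
print(email_to_id("Jane.Doe@example.com") == email_to_id(" jane.doe@example.com "))
```

Consistent normalization is the whole point: it lets the same logged-in user be recognized across services without passing the raw email around.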
3. Plan for the short and long term to avoid product interruptions
The future is likely going to look more contextual and probabilistic, and less deterministic. This may sound daunting to many enterprises that have been doing marketing the same way for a long time. Enterprises must plan for a future in which scale is in shorter supply and accessing device-level identity may be more challenging. Apple’s changes are not the final chapter in this story. As the next step, expect Android to follow with changes to the availability of Google advertising IDs (AAIDs) in late 2021 or early 2022.
To adapt for the long-term, double down today on investments on data science, or find partners who are already doing so. For example, some enterprises are experimenting with cohort-based ad delivery and measurement. Plan to keep adding scale and incorporating new types of data — such as transaction data — that will help fill in the gaps left by the loss of MAIDs. It’s also important to have a holistic strategy across first-, second-, and third-party data. When you leverage second- and third-party data, being strategic means vetting your partners to be sure they are adhering to the same privacy principles as your company because your reputations will be linked.
Exactly what the Future of the Internet will look like is still a mystery, but there’s no reason for developers or enterprises to move forward blindly. By taking the above steps and, perhaps most importantly, committing to being flexible, you won’t just be “riding out” the impending changes but will actually be adapting both your business and the ecosystem to a more sustainable — and privacy-sensitive — place.
Tyler Finn is the Director of Data Strategy at Foursquare , where he focuses on the future of privacy and identity. Prior to the merger with Foursquare, he led global privacy and policy initiatives for Factual. Earlier in his career, he worked on public policy in the unmanned aerial vehicles space.
If you’re an expert in data tech or strategy and have an important story to share, contact the VentureBeat guest post team.
"
|
15,696 | 2,021 |
"Apple, Facebook, and 2 different visions of the internet | VentureBeat"
|
"https://venturebeat.com/2021/04/15/apple-facebook-and-two-different-visions-of-the-internet"
|
"Sponsored: Apple, Facebook, and 2 different visions of the internet
Presented by AdColony
When 98.5% of your business is based on advertising and a genuine threat comes from another business that isn’t a direct competitor of yours, you’d probably consider that a crisis. In response to this crisis, Facebook took out a full-page ad in the New York Times.
It’s not just Facebook that made noise and tried to out-engineer Apple’s guidelines. A coalition of major Chinese developers attempted to fingerprint devices via the CAID (Chinese Advertising Identifier), but Apple responded with a fast and detailed slap down of the attempt.
Facebook’s full-page ad and larger PR campaign were more subtle than the CAID effort, but the move has been seen as desperate, particularly because of the primary argument that Facebook was standing up to Apple “for small businesses everywhere,” when, in fact, the biggest impact will be on its own bottom line. Billions of dollars are at stake.
And it’s all because of one company and their decision to give consumers a choice. Apple claims that users should know when their data is being collected and shared across other apps and websites, and they should have the choice to allow that or not.
“App Tracking Transparency in iOS 14 does not require Facebook to change its approach to tracking users and creating targeted advertising, it simply requires that they give users a choice,” Apple said in a statement.
What Apple is doing here is calling out the fact that, for lack of federal government regulation that protects consumers from what they believe to be a violation of privacy, Apple will go ahead and do it for them using the sheer power and ubiquity of its own platform.
But is Apple’s move really about protecting consumers? Sure, by touting “Privacy. That’s iPhone,” in a massive ad campaign and hoisting it up on a pedestal like they would a new piece of hardware, it can feel like privacy is their product.
Apple has been using privacy as a differentiating factor in its market positioning for the past decade.
Now, as part of that, they are hawking consumer choice. But ultimately what this comes down to is that Apple has a different vision of the future of the internet.
For Apple, the vision is of a clean, curated web where content — at least the content that they are responsible for distributing — comes from trusted sources, is high-quality, and is primarily paid for up-front or through subscriptions, not through advertising.
This isn’t new news. Anyone who follows Apple could see this coming. In 2015, Apple Music became a subscription, then we saw streaming video (Apple TV+) and gaming ( Arcade ) added to the mix. And now, of course, there’s the bundle option of Apple One.
So it’s not surprising that analysts believe this is the road they are heading down. The fast pace of technical innovation means consumers want to own the latest and greatest, and subscriptions offer flexibility to upgrade at a lower upfront cost. Additionally, Millennials and Gen Z tend to have a rent versus buy mentality, which applies not just to cars and homes but music and video streaming.
It’s safe to say that Apple stands alone when it comes to their vision of the internet
It’s not just about philosophy, of course. Apple can say that they believe in privacy, a clean internet where you pay for premium content via subscriptions. But what it comes down to is they sell hardware, not software.
Facebook and Google, on the other hand, are software companies. So of course they believe that the internet — and everything that lives on and around it, including content in mobile apps — should be free. For them, advertising is the “ ultimate tax ” you pay to access content.
And, while you previously paid tax solely with your attention, it’s now paid with data. Thanks to documentaries like The Social Dilemma, as well as the massive increase in malware/spyware on the internet and cybersecurity hacks, consumers are becoming more aware of how deep that cost really is.
In many ways, we are going back to the early days of the web where context was king and media was valued when it came from a trustworthy source, but in that world consumers need to pay up…with money.
So — when Apple asks you if you want to be tracked across apps and websites, what they are really asking you is “How do you want to pay for your content?” We’re already seeing verticalization from ad networks and MMPs in an effort to combine information under the umbrella of “first party data” so as to not qualify their behavior as tracking under Apple’s App Tracking Transparency (ATT) framework. The other question is whether the CAID will see widespread enough adoption.
Alasdair Pressney is Director of Product Strategy – Advertiser Products at AdColony.
Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. Content produced by our editorial team is never influenced by advertisers or sponsors in any way. For more information, contact [email protected].
"
|
15,697 | 2,017 |
"4 best practices to move the needle on digital advertising | VentureBeat"
|
"https://venturebeat.com/2017/02/22/4-best-practices-to-move-the-needle-on-digital-advertising"
|
"Sponsored: 4 best practices to move the needle on digital advertising
Presented by Undertone
Depending on whose statistics you believe, the average American sees between 3,000 and 5,000 branding messages every single day. And yet, if you asked most of us, we’d be hard-pressed to recall more than one or two, if we’re even able to remember the ads we’ve seen at all. This problem is especially bad in the digital space, where many consumers have learned to completely shut out the banners, pop-ups, and autoplay video ads that interrupt the content they set out to enjoy.
It’s for these reasons that I’ve spent the past several years as one of the advertising technology industry’s most vocal proponents of developing user-first ad experiences that people actually want to spend time with.
The only problem is that creating an exciting, engaging campaign is easier said than done. Doing so requires hard work, planning, and a willingness to really put yourself in your customer’s shoes. But if you’re able to be strategic and disciplined in executing your game plan, achieving this goal is far from impossible. Here are four keys to delivering a truly memorable ad experience:
1. Bring your entire team into the planning process
Last month, I had the pleasure of participating in an informative webinar with Carla Meyer, a veteran marketer who serves as the global digital advertising and social media manager for Garmin International. According to her, one of the biggest reasons the GPS brand has been so successful in the digital sphere has been its decision to break down the silos that have historically existed between a company’s data science, planning, buying, creative, and technology teams.
Access the free VB Live event with Undertone’s Eric Franchi and Garmin’s Carla Meyer right here.
Instead of this unwieldy division of labor, Garmin keeps everyone on the same page by developing a campaign’s media and creative strategies simultaneously. By bringing each department to the table at the very beginning, the brand can plan coordinated, cross-screen campaigns where every touchpoint is matched to an appropriate creative unit.
2. Don’t worry about clicks — focus on emotion
Because digital gives us so much data to look at, we often forget to ask ourselves a simple, important question: “How did my ad make the consumer feel?” After all, research has shown that the vast majority of purchase decisions are based on emotion and intuition rather than reason.
Generally speaking, the more people feel, the more they buy. More specifically, visceral reactions like surprise, sadness, and fear can spike short-term activation behaviors like engagement and conversions. Meanwhile, long-term branding is best achieved by making the consumer feel good by the end of the ad experience. What’s great about digital is that a single campaign can help you accomplish both of these goals at the same time.
3. Measure how your ads make people feel
Contrary to what you may believe, it is possible to measure the emotional responses people have to your ads. For a campaign we executed with Garmin, we worked with the advertising research firm BrainJuicer to find out what consumers were feeling as they watched a dynamic, interactive ad for the Vivosmart fitness tracker.
The measurement process surveyed 150 consumers who were interested in fitness trackers, asking them to comment on their emotions as they watched the ad. Inspired by sleek visuals and a captivating tagline (“Beat yesterday.”), the viewers registered happiness and surprise as the ad demonstrated all the different activities the device could be used for. By running these studies during the testing phase, marketers can predict the success of their ads and tweak their creative units to prompt the right emotions.
4. Be mobile-first and device-specific
One of the biggest mistakes I see people make in digital is trying to retrofit television branding assets into digital channels. As Carla pointed out during the webinar, the attention spans of today’s consumers are much shorter than they used to be, and a 30-second commercial just isn’t going to hold someone’s attention when they can simply X out of the window and do something else. In order to be successful, marketers must develop experiences tailored to the environments where people will see them.
For instance, a video experience designed specifically for mobile could be shot vertically to account for the ways people naturally hold their phones. In addition, brands can make their ads even stickier by taking advantage of the interactive features consumers can only enjoy on their mobile devices. When Gatorade rolled out a sponsored Snapchat filter that allowed users to drench themselves in a bucket of virtual electrolytes during last year’s Super Bowl, the ad didn’t just take off because it was a clever idea. Rather, a large part of the filter’s appeal was that it was completely different from the ads people see when they consume traditional media.
Moving forward, marketers need to devote additional resources to developing this kind of outside-the-box, mobile-first ad experience. As more and more people adopt their smartphones as their number one media consumption hub, it’s no longer feasible for brands to spend all their time and money making TV commercials.
All that’s left for them to do now is to get to work.
Eric Franchi is co-founder and SVP Business Development at Undertone.
Sponsored posts are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. Content produced by our editorial team is never influenced by advertisers or sponsors in any way. For more information, contact [email protected].
The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat.
All rights reserved.
"
|
15,698 | 2,017 |
"The digital advertising strategy no one's told you about: Blockchain | VentureBeat"
|
"https://venturebeat.com/2017/04/09/the-digital-adverstising-strategy-no-ones-told-you-about-blockchain"
|
"Guest The digital advertising strategy no one’s told you about: Blockchain
There’s already one large company out there that’s saved $1 million doing it.
The “it,” in this case? Using a blockchain-based solution for advertising.
To be clear, we’re in the very early stages of this decentralized revolution, in general, and for marketing, in particular.
However, from the way the industry is shaping up, the first technology/market in the cross-hairs of this wave of disruption is ad tech.
The pain in advertising today Recently, I had a great conversation with Stacy Huggins, CMO of MadHive (which focuses on OTT advertising). She tells me that for every $1 an advertiser spends on digital, they get about $.44 of value.
Why? Because of all the middlemen/intermediaries that have become part of the digital ad ecosystem.
MadHive has a plan to help publishers get some of that value back. And, in this solid video about the Basic Attention Token, we learn from Brendan Eich (who hasn’t done much in his life aside from creating JavaScript and Firefox) that we are literally paying nearly $23 per month per person to view ads. (BTW, get the Brave browser.) Oh, by the way, that payment comes with approximately 70 trackers that follow all of our movements.
Brendan thinks we should get paid to watch ads, not the other way around.
Where there’s friction, there’s opportunity Bottom line, there’s a ton of friction and waste in the digital ad ecosystem because of the number of intermediaries.
(If you want a tremendous overview of the existential threat to agencies, see Ben Thompson’s great piece, “Ad Agencies and Accountability.”)
And what up-and-coming technology is inherently designed to remove intermediaries, friction, and cost? Yep, you guessed it: blockchains.
That’s why companies like adChain , NYIAX , and HubDSP are all jumping into the fray. And there will be more.
Still early … Like I said, it’s very early, and there are certainly barriers to adoption.
Ad tech is a market with a ton of entrenched interests and money at stake. Expect the “violent opposition” of Schopenhauer’s second stage. The technology is still new and, let’s be honest, not quite proven.
Even using digital tokens adds another layer of complexity for many publishers. Most people are simply not familiar with the concept yet. That will take time.
… But inevitable The blockchain genie isn’t going back in the bottle.
Whether it’s one of these companies or the next wave that follows them, the efficiency and effectiveness of the solutions make it a foregone conclusion.
Throw in the additional benefits to the brand (i.e. potential to ensure your ads aren’t shown next to controversial content), and you have the makings of a pretty compelling value proposition.
Here we go. … Jeremy Epstein is CEO of Never Stop Marketing and currently works with startups in the blockchain and decentralization space, including OB1/OpenBazaar, Internet of People, & Storj. He advises F2000 organizations on the implications of blockchain technology. Previously, he was VP of marketing at Sprinklr from Series A to “unicorn” status.
VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact.
Discover our Briefings.
"
|
15,699 | 2,020 |
"New Pokémon Snap planned for Nintendo Switch | VentureBeat"
|
"https://venturebeat.com/2020/06/17/new-pokemon-snap-planned-for-nintendo-switch"
|
"New Pokémon Snap planned for Nintendo Switch
Surprise! Pokémon Snap is back, Jack! It only took a bit over 20 years.
The Pokémon Company announced New Pokémon Snap today for Switch. Like the original, it tasks players with taking pictures of Pokémon in the wild as they travel via an on-rails vehicle. You get more points for taking better pictures of the critters. Bandai Namco is developing the sequel. The studio has experience with the franchise, having worked on the Pokkén Tournament fighting game.
The first Pokémon Snap came out for Nintendo 64 back in 1999. It was one of the first major spin-offs for the franchise, giving fans a way to enjoy the series outside of the traditional RPG mechanics.
The announcement did not reveal a release date for the project, simply saying that the game is “under construction.” GamesBeat's creed when covering the game industry is "where passion meets business." What does this mean? We want to tell you how the news matters to you -- not just as a decision-maker at a game studio, but also as a fan of games. Whether you read our articles, listen to our podcasts, or watch our videos, GamesBeat will help you learn about the industry and enjoy engaging with it.
"
|
15,700 | 2,021 |
"Nintendo finally adds online play to a huge Switch hit | VentureBeat"
|
"https://venturebeat.com/2021/04/27/super-mario-party-online-play"
|
"Nintendo finally adds online play to a huge Switch hit
Super Mario Party is a massive hit for Nintendo Switch.
Nintendo is finally updating Super Mario Party with online play. This comes — well, not just in time for stay-at-home orders due to the pandemic (which started more than a year ago). But it’s a welcome addition regardless as people in many countries await the vaccine. Social distancing is still an everyday reality, and online gaming can help people stay connected with friends and family.
Keep the party going! A free update to #SuperMarioParty adds online play to the board game mode, 70 minigames , and the 2 vs 2 Partner Party mode. Available now! Additional details: https://t.co/MdWIZ47w4i pic.twitter.com/U285o371hZ — Nintendo of America (@NintendoAmerica) April 27, 2021 This update was a long time coming, and you can read the patch notes here.
Super Mario Party is one of the best-selling games on the current generation of consoles. It has sold 13.82 million copies worldwide as of the end of 2020, according to Nintendo.
That puts it ahead of Splatoon 2, and it puts it on par with some of the biggest releases from Sony or Electronic Arts.
Of course, Super Mario Party sells to a different audience than something like Ghost of Tsushima, which has sold 6.5 million copies as of March. Ghost of Tsushima sells a lot of copies quickly to a hardcore audience that cannot wait to play it. That’s opposed to Super Mario Party, which sells steadily throughout the generation to new families picking up a Switch for the first time.
That family audience is something that Nintendo dominates, but it is also guilty of taking families for granted. Super Mario Party launched 2.5 years ago, but despite its success, Nintendo never released an update. At least, not until now.
With this patch, Nintendo could begin building loyalty even among casual players. Because eventually, the pandemic will end, and Nintendo should do more to ensure families don’t forget about the Switch when other activities return.
"
|
15,701 | 2,021 |
"Epic Games launches early access for MetaHuman Creator tool to make realistic animated people | VentureBeat"
|
"https://venturebeat.com/2021/04/14/epic-games-launches-early-access-for-metahuman-creator-tool-to-make-realistic-animated-people"
|
"Epic Games launches early access for MetaHuman Creator tool to make realistic animated people
Epic Games is launching early access for its MetaHuman Creator tool, which enables people to create realistic animated human characters within minutes.
Epic Games unveiled the tool in February, and it had an overwhelming response from people who tried it and created two MetaHuman sample characters with it. Now, early access users will be able to use it in Unreal Engine, or for further editing in a DCC application like Autodesk’s Maya.
Due to the cloud-based nature of the application, Epic Games said it will add applicants gradually, so it may take a few days before you get your turn. We’ll have a panel on MetaHumans with Paul Doyle and Vladimir Mastilović of Epic Games at our GamesBeat Summit 2021 event on April 28 and April 29. Wanda Meloni of M2 Insights will moderate the session.
For those who just want to get started with using digital humans right away in Unreal Engine, Epic Games is also providing over 50 ready-made MetaHumans for download and use in projects (they’re available through Quixel Bridge; Quixel is another startup acquired by Epic). You have to download the free application and click on the MetaHumans section.
Using MetaHuman Creator, you can manipulate facial features, adjust skin complexion, edit teeth, and select from a preset range of body types, hairstyles, clothing, and so on. When you finish your character, you can export and download it, rigged and ready to animate, in Unreal Engine.
Above: You can use the MetaHuman Creator tool in Unreal Engine.
MetaHuman Creator combines technology from 3Lateral and Cubic Motion — both now part of Epic Games — and makes it accessible. Previously, it was possible to create assets of this caliber only through extremely time-consuming and expensive in-house methods or outsourcing.
MetaHuman assets must be rendered with Unreal Engine. It requires a Windows or MacOS computer with internet access and a Chrome, Edge, Firefox, or Safari web browser. You will also need an Epic Games account. To download your MetaHumans, you will need to install the free Quixel Bridge application.
MetaHumans require Unreal Engine 4.26.2 or later. MetaHuman source assets can be downloaded for animating in Autodesk Maya. However, you cannot publish MetaHumans as final content unless rendered with Unreal Engine.
Epic Games said that MetaHuman Creator and the MetaHuman assets may not be used for the purpose of building or enhancing any database, training or testing any artificial intelligence, machine learning, deep learning, neural networks, or similar technologies.
This version of the MetaHuman Creator tool itself cannot run in a game; it is an external content creation application. The MetaHuman assets it generates are for real-time use in Unreal Engine, but they are definitely at the resource-hungry end of the spectrum.
Epic Games supports Unreal Engine’s Live Link Face iOS app. It is also working with vendors on providing support for ARKit, DI4D, Digital Domain, Dynamixyz, Faceware, JALI, Speech Graphics, and Cubic Motion solutions.
"
|
15,702 | 2,021 |
"Ratchet & Clank: Rift Apart launches June 11 | VentureBeat"
|
"https://venturebeat.com/2021/02/11/ratchet-clank-rift-apart-launches-june-11"
|
"Ratchet & Clank: Rift Apart launches June 11
The launch window is expanding to June.
Sony Interactive Entertainment revealed today that it is launching Ratchet & Clank: Rift Apart on June 11. This is one of the PlayStation 5’s most anticipated games, and fans will have to wait a few more months to get their hands on it.
You can preorder Ratchet & Clank starting today for $70 or you can get the Digital Deluxe edition for $80.
This is a big game for Sony because it has positioned Ratchet & Clank as one of the few that best shows off the power of the PlayStation 5. But the game is also in a tough position. Unlike many other Sony titles, it is coming exclusively to PS5. Sony has emphasized that this game couldn’t run on a PS4 when pitching it to fans. But with PS5 sales limited by dwindling supply — and with no relief in sight due to a global microprocessor shortage — Ratchet & Clank can only sell to a few million PS5 owners.
But Sony had previously promised that it would launch Rift Apart during the PS5’s launch window, and it is roughly keeping to that schedule.
"
|
15,703 | 2,023 |
"Best VPN Service 2023: VPNs Tested by Our Experts - CNET"
|
"https://www.cnet.com/news/best-vpn"
|
"Anyone who accesses the internet from a computer, tablet or smartphone can benefit from using a VPN. You don't have to be an activist, government dissident or journalist to need a VPN; the rise of third-party data brokers, cross-site advertising trackers, IP address collection and mobile geo-targeting have all combined to create an online browsing environment that poses significant threats to everyday users' basic privacy. Because a VPN encrypts your connection, your browsing data is protected from your internet service provider (and any government entities who request your ISP data), and your network administrator in most cases. A VPN can also shield your private information -- like passwords, usernames and bank or shopping details -- from anyone snooping on your network.
Proton VPN's free tier is the only free VPN we've come across so far that's worth using. It costs a lot of money to operate a VPN, and free VPN services usually make up for the lack of subscription revenue by selling user data. And in addition to being limited in usability and light on security, many free VPNs are fronts for malware distribution, which is why it's generally best to avoid them. However, Proton VPN's unlimited free tier is fast, secure and can be used for most online activities, including streaming Netflix. But if you're on a budget and want access to a premium VPN solution, you can also take a look at our picks for the best cheap VPNs.
A mobile VPN is simply a VPN you can use on your mobile device like your iPhone or Android phone. All of the providers we recommend have mobile versions of their desktop clients. You can use a mobile-focused VPN app to ensure greater data privacy designed for your whole device. Mobile VPNs also generally have a smaller memory footprint, and require less processing power than desktop VPNs, so they tend to yield faster connection speeds and don't eat up your battery as quickly. Keep in mind, however, that most mobile VPN clients will use a lighter form of encryption than a desktop client to achieve those smartphone speeds. So be sure to check your VPN apps' settings to ensure you're using the apps' strongest encryption if your privacy needs are heightened. Our top three VPN picks all have excellent, easy-to-use mobile VPN app options for their services. Some VPNs will only work with one type of mobile platform -- like iOS or Android -- and some are universally compatible. To find the right mobile VPN for you, check out our mobile-specific VPN guides below. We routinely update them with our retesting information, so check back often.
How to Set Up a VPN on Your Smartphone; Best Android VPNs for 2023; Best iPhone VPNs of 2023. VPNs are perfectly legal to use in most countries. There's nothing wrong with taking steps to protect your privacy online, and you shouldn't have to worry that using a VPN as part of that process will get you in any kind of legal trouble.
However, there are countries where VPNs are either banned or outright illegal. If you're using a VPN in a country like China, Iran, Oman, Russia, Turkmenistan, UAE or Belarus, you may find yourself in legal trouble. The irony here is that these are the countries where internet censorship and surveillance are most common. In those countries, you'll need to make sure you use a VPN that provides strong obfuscation so your VPN traffic is disguised as ordinary HTTPS traffic, meaning government entities won't even know you're using a VPN in the first place.
But you won't run into any trouble with the law for using a VPN across most of the world. One important reminder, though: VPNs are legal in most places, but engaging in illegal activity online is still illegal regardless of whether you're using a VPN.
If you live in a country that censors its media or are traveling to one, georestricted content is a pain. You can use a VPN to circumvent censorship or access your home country's normal media content for an online streaming service like Netflix, Hulu, Amazon Prime Video or Disney Plus. Pick a VPN that lets you manually select which country you want to connect through and has something called obfuscation. (Our top three picks offer this.) You don't always need to use the obfuscation feature to unblock Netflix, but since streaming services actively try to block VPN connections, obfuscation can help because it disguises your VPN traffic as regular internet traffic.
If you're looking to try out other VPNs, choose one with a large number of IP addresses, preferably 10,000 or more. This is because one of the ways Netflix and others block VPNs is by blacklisting known VPN IPs -- and if your VPN has tens of thousands of IPs, there's a better chance that you'll be able to connect to an IP address that Netflix hasn't flagged.
Once you have your VPN installed, connect to the country whose content you wish to view, restart your browser and go to the streaming site. If your VPN is working, the site should treat you as a resident of your selected country and serve you content assigned to that audience. If you're still having trouble, you can try using incognito mode on your browser or try clearing your cookies and cache.
Your first and most apparent indication that your VPN is working is that your IP address will change and your location will be registered as that of the VPN server you're connecting through. You can check this on a site like whatismyipaddress.com.
You'll also want to make sure your VPN is protecting your privacy and not leaking any of your data outside of the VPN tunnel, thus exposing it to your ISP and other entities that may be monitoring your online activity. You can check for leaks by going to a site like dnsleaktest.com or ipleak.net. If your location is being registered as the VPN server location, and your leak tests turn up negative, then you know your VPN is working to protect your privacy.
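The IP-change check described above can be sketched in a few lines of Python. This is a rough illustration, not an official tool: it assumes a public IP-echo service (api.ipify.org is used here; any similar plain-text endpoint works) and simply compares the address seen before and after connecting your VPN client.

```python
# Minimal sketch of the "did my IP change?" VPN check described above.
# Assumes the public IP-echo service api.ipify.org (any similar service works).
from urllib.request import urlopen


def fetch_public_ip(url: str = "https://api.ipify.org") -> str:
    """Return the public IP address as seen by an external service."""
    with urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8").strip()


def vpn_masks_ip(ip_before: str, ip_after: str) -> bool:
    """The VPN is masking your address only if the two IPs differ."""
    return ip_before != ip_after


# Example: record your IP, connect the VPN, then compare.
#   before = fetch_public_ip()
#   ... connect your VPN client ...
#   after = fetch_public_ip()
#   print("VPN working:", vpn_masks_ip(before, after))
```

Note that this only covers the IP check; DNS leak testing still requires a dedicated site such as dnsleaktest.com or ipleak.net.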
A remote-access VPN uses public infrastructure like the internet to provide remote users secure access to their network. This is particularly important for organizations and their corporate networks. It's crucial when employees connect to a public hotspot and use the internet for sending work-related emails. A VPN client on the user's computer or mobile device connects to a VPN gateway on the company's network. This gateway will typically require the device to authenticate its identity. It will then create a network link back to the device that allows it to reach internal network resources such as file servers, printers and intranets, as if it were on the same local network.
This is when the VPN technology uses a gateway device to connect the entire network in one location to a network in another location. The majority of site-to-site VPNs that connect over the internet use IPsec. IPsec-based encryption protocols are often considered by VPN specialists to be less secure against modern surveillance. Rather than using the public internet, it is also normal to use multiprotocol label switching clouds as the main transport for site-to-site VPNs.
VPNs are often defined between specific computers, and in most cases, they are servers in separate data centers. However, new hybrid-access situations have now transformed the VPN gateway in the cloud, typically with a secure link from the cloud service provider into the internal network.
The best VPN for you depends on your needs when using a VPN.
VPNs for crucial privacy and security If you're a journalist, a lawyer or a professional in any other privacy-sensitive field, forget about speed and price when choosing a VPN. Focus, instead, entirely on security. Your VPN may be somewhat slower but, for both VPNs and presidential motorcades, speed is always the trade-off for privacy. Avoid free VPNs and browser-based VPNs.
If you're concerned with government monitoring in your current country, choose a VPN headquartered outside of the country you're currently in, and avoid choosing a VPN with a jurisdiction in an allied country. For example, US journalists should avoid VPNs with a jurisdiction in the US or other Five Eyes countries.
Keep an eye on encryption: Your VPN should offer a protocol called OpenVPN TCP (for its mobile apps, IKEv2 is fine). Right now, the VPN we recommend most for critical privacy is ExpressVPN.
VPNs for working from home If you're working from home, you may be sharing your internet connection with multiple devices and family members or roommates. That's a lot of simultaneous connections to a VPN and a lot of drag on a network. Pick a VPN that lets you use one subscription on as many devices as possible and has excellent speeds so your Wi-Fi isn't bogged down. If your job involves handling sensitive information like financial or medical records, however, your priority VPN criterion is security. Our top three VPN picks are the most secure we've found, and each has a different number of connections they'll allow for a base-level subscription. Depending on your budget and home office requirements, ExpressVPN, Surfshark and NordVPN are all great options for working from home. There are a few other factors worth considering for a home office VPN, though, so check out our guide to picking the right VPN for working at home.
VPNs for gaming Most VPNs are chosen based on having a good balance of speed, security and cost. But if you want a VPN specifically to connect to game servers in another country, speed is everything. Free VPNs won't be fast enough, but, fortunately, high-end security won't be a cost driver, which gives you more options at modest prices.
Since "_blank\" rel=\"noopener noreferrer nofollow\" href=\"https://www.cnet.com/tech/services-and-software/how-to-speed-up-your-vpn-connection/\">all VPNs reduce speed -- many by half or more -- that means picking one from the set that "_blank\" rel=\"noopener noreferrer nofollow\" href=\"https://www.cnet.com/tech/services-and-software/fastest-vpn/\">performed best in our speed tests.
In our latest tests, NordVPN took the lead as the fastest VPN, though you can get excellent speeds through Surfshark via the WireGuard protocol as well as with ExpressVPN. If you're focused on VPNs for game consoles, have a look at our best VPNs for Xbox and our primer on installing them.
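The "speed loss" figures we quote are simply the share of your baseline bandwidth that disappears once the VPN is connected. A quick sketch of the arithmetic (the inputs below are illustrative round numbers, not new measurements):

```python
def speed_loss_pct(base_mbps: float, vpn_mbps: float) -> float:
    """Percentage of the baseline speed lost while connected to the VPN."""
    return round((base_mbps - vpn_mbps) / base_mbps * 100, 1)

# 100 Mbps without the VPN, 90 Mbps with it: a 10% loss,
# comparable to NordVPN's 2023 result.
print(speed_loss_pct(100.0, 90.0))  # 10.0

# Many VPNs cut speeds "by half or more":
print(speed_loss_pct(100.0, 50.0))  # 50.0
```

The lower the percentage, the less you'll notice the VPN during gaming or streaming.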
Before choosing the one right for your needs, visit the VPN's official website to see whether they offer servers specifically aimed at gaming in the countries where you most want to connect to other players.
Best VPN Service 2023: VPNs Tested by Our Experts The best VPNs for private streaming, gaming and torrenting, rated by our expert staff.
Updated Nov. 6, 2023 4:00 a.m. PT Written by Attila Tomaschek, Russell Holly and Rae Hodge. We intensively test each VPN, making sure it meets our standards for privacy, speed and usability.
How we test VPNs Editors' Choice 2023, Best Overall VPN: ExpressVPN -- privacy protection and fast speed. Savings: 49% off with 12-mo plan (+3 free months). Pros: unmatched transparency; top-notch security with no leaks detected; excellent for streaming. Cons: expensive; only five simultaneous connections; owned by Kape Technologies. Price: $13 a month, $60 for six months or $100 for a year. Latest tests: no leaks detected, 18% speed loss in 2023 tests. Network: 3,000-plus servers in 160 locations across 94 countries. Jurisdiction: British Virgin Islands. ExpressVPN is one of the fastest VPNs we've tested, registering an 18% speed loss in our latest tests in February and March of 2023. Its apps for iOS and Android are designed with a streamlined approach aimed at connecting fast without fuss. A single button on its landing screen directs you to connect quickly, with the only accompanying option a drop-down server location selector with your fastest nearby city selected by default.
ExpressVPN's other options -- its security and privacy tools, account and settings options, and support page -- are all kept neatly tucked away under a garden-variety three-bar icon in the screen's top left corner. And they're worth checking into. ExpressVPN has included an on-board IP address checker, along with two leak testers and a password generator.
In the past year, ExpressVPN increased its independent third-party audit count, published details about its TrustedServer deployment process, joined the i2Coalition to call for improved VPN industry ethics, and released the open-source Lightway encryption protocol.
All of our top-rated VPNs have wide compatibility across platforms and operating systems, but ExpressVPN's collection of setup guides, detailed FAQs and troubleshooting articles give it a clear advantage for users. So does its 24/7 customer support, and its no-questions-asked, 30-day money back guarantee.
The company has been in business since 2009, and ExpressVPN has a substantial network of more than 3,000 RAM-only servers spread across 160 locations in 94 countries. ExpressVPN's best plan offers five simultaneous connections for $100 a year (which includes three extra months, for a limited-time deal totaling 15 months of service). You can also opt for a $13 per-month plan, or pay $60 for six months.
Best Cheap VPN: Surfshark -- extensive features at a great price. Savings: $2.23/mo with 24-mo plan (+3 free months). Pros: lots of unique security features; unlimited simultaneous connections; RAM-only server network. Cons: inconsistent speed performance; 14 Eyes jurisdiction (Netherlands); no transparency reports. Price: $48 for the first year (then $60 annually) or $13 a month; two-year plans are $60 for the first two years combined (then $60 annually). Latest tests: no leaks detected, 40% speed loss in 2023 tests. Network: 3,200-plus servers in 100 countries. Jurisdiction: Netherlands. Surfshark boasts an impressive suite of privacy and security features, unlimited simultaneous connections, an easy-to-use interface and an expansive global network of more than 3,200 servers in 100 countries. And it's still significantly cheaper than most of its competitors. That's what helped Surfshark earn CNET's Editors' Choice for Best Value VPN in 2022.
Along with standard VPN features such as a kill switch and DNS leak protection , some of the more notable Surfshark features include camouflage mode (which hides the fact you're using a VPN), split-tunneling, NoBorders mode (which lets you use Surfshark in regions where VPNs are restricted) and multihop VPN connections. You'll also get access to Surfshark's CleanWeb technology, which blocks ads and malware and helps you avoid phishing attacks.
One innovation we're excited to see Surfshark roll out over the next year is its Nexus network, which connects the VPN's entire network of servers together and allows you to choose multiple servers to route your connection through. The functionality is somewhat similar to Tor , but Surfshark says it's faster. With its Dynamic MultiHop, IP Randomizer and IP Rotator functions, the Nexus network can give you a few extra layers of protection while you use the VPN -- which can be particularly beneficial to users with critical privacy needs.
Surfshark says it doesn't log any user activity. And although no-logging claims are virtually impossible to prove with 100% certainty, German cybersecurity firm Cure53 declared Surfshark's security to be "solid" in its 2021 security audit of the VPN. Surfshark passed its first independent no-logs audit in January.
Since February 2022, Surfshark and NordVPN have had the same corporate parent , but Surfshark said it is legally bound not to share any information between the entities that would go against its privacy policy or terms of service.
We didn't find any language in either document that would indicate Surfshark has any obligation to share user data with its parent company or any sibling companies, which include NordVPN.
Surfshark used to consistently rate as one of the fastest VPNs available, but we've been disappointed with its inconsistent speeds recently. The 40% speed loss we measured in February and March of 2023 was more than double the 19% speed loss we measured in 2022. Though speeds through WireGuard were impressive, registering only an 8% speed loss, the 76% speed loss we registered through OpenVPN dragged Surfshark's overall speed rating down considerably. The company tells us that it is working on resolving the OpenVPN speed issues.
In our tests, Surfshark had no problems unblocking Netflix and Amazon Prime Video content, but we did run into a fair bit of trouble accessing Disney Plus. After testing various servers in the US and other countries where Disney Plus is available, we were finally able to access the content when we connected to a server in Boston. You may need to test a few servers yourself before gaining access to Disney Plus content with Surfshark.
Surfshark offers cheaper introductory prices that jump after the first billing cycle. Even so, Surfshark manages to keep its prices lower than most other VPNs -- helping it earn CNET's Editors' Choice for Best Value. The yearly plan starts out at $48 for the first year, then jumps to $60 for any additional years of service. If you opt for the two-year plan, you'll pay $53 up front for the initial two years combined, plus two free months, then $60 per year for any additional years. Surfshark's monthly plan stays constant at $13 a month. If you're not satisfied with the service for any reason, Surfshark offers a 30-day money-back guarantee.
Best Connectivity: NordVPN -- reliable VPN with multi-device use. Pros: among the fastest VPNs; tons of features; diskless RAM-only server infrastructure. Cons: no transparency reports; ambiguous corporate structure; only six simultaneous connections allowed. Price: $79 for the first two years or $60 for the first year (then $100 per year afterwards) or $12 a month. Latest tests: no leaks detected, 10% speed loss in 2023 tests. Network: 5,600-plus servers in 84 locations across 59 countries. Jurisdiction: Panama. NordVPN is one of the most recognized brands in the VPN field. It offers a generous simultaneous connection count, with six simultaneous connections through its network, where nearly all other providers offer five or fewer. NordVPN also offers a dedicated IP option, for those looking for a different level of VPN connection, and the ability to VPN into Tor. More than half of Nord's 5,000-plus server fleet is optimized for peer-to-peer sharing, though Nord has blocked torrenting in 14 countries.
In our latest test rounds we noticed a few hiccups in Nord's kill switch when using its iOS app, which could be a concern for torrenters. However, Nord has a sideloaded iOS app available on its website that it recommends for users. In our most recent speed tests, NordVPN's overall speed performance and consistency helped it climb to the top of our list of fastest VPNs. In February and March of 2023, we lost only 10% of base internet speeds through NordVPN's servers.
NordVPN doesn't accept PayPal payments, but you can purchase a subscription with any major credit or debit card, AmazonPay, Google Pay or ACH transfer. If you'd rather pay anonymously, you can pay with a variety of cryptocurrencies, including bitcoin, ethereum, tether and dogecoin. NordVPN has also partnered with a handful of retail stores like Staples, BestBuy and Walmart where you can even purchase your VPN with cash.
Open-Source VPN: Proton VPN -- the only free plan we recommend. Savings: 50% off with 24-mo plan. Pros: highly transparent; open-source; unlimited free plan. Cons: no live chat support; split tunneling only available on Android and Windows; occasional speed dips. Latest tests: no leaks detected, 9% speed loss in 2020 tests. Network: 1,700-plus servers in 91 locations across 64 countries. Jurisdiction: Switzerland. Proton VPN is a solid choice for VPN power users and anyone with critical security needs, but it's also excellent for casual VPN users who are simply looking to give their online privacy a boost or access geographically restricted content. It's fast, easy to use across all platforms and can unblock streaming services like Netflix, Disney Plus, HBO Max and Amazon Prime Video.
Proton VPN hasn't been around nearly as long as some of its peers like ExpressVPN and NordVPN, but in a few short years, it has earned a sturdy reputation for security and transparency. Much of that reputation was built on the back of Proton Mail's already established strength as a secure email solution, but Proton VPN has become a solid product on its own merit since it launched in 2017.
All of its apps across platforms are fully open-source, making Proton VPN the only provider in our top five to have its software's source code publicly available for anyone to scrutinize. The apps are also routinely audited by third-party cybersecurity professionals who confirmed that "no important security issues were identified" during their latest audit.
Proton VPN has all the standard security features you'd expect from any VPN provider worth its salt, including a kill switch , DNS leak protection and AES 256-bit encryption. The provider also offers additional security protections like an ad/malware blocker, Tor over VPN and a stealth protocol to help cloak your VPN connection and bypass firewalls.
But the pièce de résistance of Proton VPN's security suite is its fleet of Secure Core servers. Essentially, these servers operate in the same way as other VPN providers' multi-hop functionality does, but Proton's Secure Core servers are wholly owned by the company, equipped with hard disk encryption and housed in secure data centers in a defunct military base in Iceland and in underground bunkers in Switzerland and Sweden. Route your traffic through Proton's Secure Core servers first to add a robust layer of physical and technical protection before exiting through another VPN server in a different country.
And if you're looking for a free VPN , look no further than Proton VPN, because its unlimited free tier is truly impressive, and really the only free VPN we've encountered that's worth using. It lacks support for torrenting and doesn't include all the bells and whistles as the paid tiers, but Proton VPN's free tier is secure and doesn't put limits on speed, data or usage time like most other free VPNs do. Free users get access to servers in three countries (US, NL and JP) and can connect one device at a time.
Proton VPN's paid plans cost $72 per year or $10 per month and include access to servers in 67 countries and support for 10 simultaneous connections. Paid plans also include a 30-day, money-back guarantee. Read our Proton VPN review.
Best Cheap Alternative: PIA -- budget-friendly and transparent. Pros: open-source and transparent; budget-friendly; unlimited simultaneous connections; RAM-only server architecture. Cons: buggy features; US jurisdiction; trouble accessing streaming content. Price: $40 per year or $12 per month (3-year plans available at $79 every three years). Latest tests: no leaks detected, 24% speed loss in 2023 tests. Network: 35,000 servers in 91 countries. Jurisdiction: United States. Private Internet Access, often abbreviated as PIA, is a budget-friendly VPN that's one of the best in the business when it comes to transparency. Admittedly, concerns associated with its US jurisdiction may be a dealbreaker for VPN users with critical online privacy needs.
However, the company seems to be doing everything in its power to offset those concerns and appeal to privacy-conscious users. PIA's no-logs policy was independently audited in 2022, and it releases a semiannual transparency report and has had its no-logs claims tested in the wild on more than one occasion. All of PIA's software is also fully open source, meaning that its source code is publicly available for anyone to inspect, which is still a relatively rare degree of transparency for a VPN. Proton VPN is the only other open-source VPN service among CNET's top picks.
PIA offers industry-standard AES 256-bit encryption along with DNS leak protection and a kill switch.
It also offers an advanced kill switch feature that won't let you connect to the internet at all unless you're connecting through the VPN -- a great option for critical situations where you don't want to risk going online without first connecting to the VPN. Additional privacy features include multihop connections, obfuscation and an ad tracker and malware blocker feature called MACE. PIA's RAM-only server architecture also helps ensure user privacy because data is theoretically never stored to a hard disk and is completely wiped whenever the server is turned off or rebooted.
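As a rough illustration of the difference between a standard and an advanced kill switch, here's a hypothetical decision-logic sketch in Python. This is not PIA's actual implementation (real kill switches operate at the firewall and routing layer, not in application code); the function and its parameters are invented for the example.

```python
def allow_traffic(tunnel_up: bool, advanced_mode: bool,
                  vpn_has_connected_before: bool) -> bool:
    """Return True if network traffic should be permitted to leave the device."""
    if tunnel_up:
        return True                       # tunnel is protecting the traffic
    if advanced_mode:
        return False                      # never go online outside the VPN
    # Standard kill switch: only block if an established tunnel has dropped.
    return not vpn_has_connected_before

# Standard mode: traffic flows normally before you ever connect...
print(allow_traffic(False, False, False))  # True
# ...but is cut off the moment the tunnel drops.
print(allow_traffic(False, False, True))   # False
# Advanced mode refuses all traffic until the tunnel is up.
print(allow_traffic(False, True, False))   # False
```

The advanced behavior is the "great option for critical situations" described above: there is no window, even at boot, where traffic can leave unprotected.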
In terms of privacy and transparency, PIA is nearly on the same level as ExpressVPN. While ExpressVPN isn't open source and doesn't publish transparency reports in the traditional sense, it has an extensive Trust Center that thoroughly dives into all of its transparency initiatives, including each of the 12 independent security audits it completed in 2022. PIA told CNET that it's working on expanding its transparency reporting efforts, so hopefully we'll start seeing a regular cadence with PIA's third-party audit reports in the future.
If you're looking for a speedy VPN, PIA does fairly well. It's not the fastest VPN we've tested, but its 24% speed loss is still respectable and good for third place behind NordVPN (10% speed loss) and ExpressVPN (18% speed loss). Considering that most VPNs will cut your regular internet speeds by 50% or more, PIA's speeds trend toward the faster end of the spectrum and are plenty fast enough for just about any online activity.
While PIA does most things well, it struggles in a few areas, particularly related to streaming. We had trouble unblocking certain streaming services on both desktop and through its Amazon Fire TV Stick app. OpenVPN connections through the Fire TV Stick app never successfully unblocked content during our testing (although WireGuard connections were successful). And PIA's split tunneling feature may conflict with certain services on your Windows machine and fail to work properly. So if you're looking to PIA to unblock all of your streaming objectives, you may be left disappointed. Other top VPNs like ExpressVPN, Surfshark and NordVPN performed far better during CNET's streaming tests, especially on Amazon's Fire TV Stick.
However, if you're on a strict budget, but need a privacy-focused VPN that can capably handle most of what you'd want to accomplish online, then PIA is still a solid choice. It's one of the cheapest VPNs , available at $40 a year, $12 a month or $79 every three years. That makes it the most budget-friendly VPN in CNET's list of the best VPNs -- even cheaper than Surfshark, CNET's Editors' Choice best value VPN. While PIA doesn't quite stack up to Surfshark in terms of features and performance, it's a good alternative for people who are looking for a low-cost VPN. Each PIA subscription plan comes with a 30-day money-back guarantee, in case you decide within 30 days that the service isn't for you.
Best Beginner VPN: IPVanish -- simple, newbie-friendly interface. Savings: 66% off with 12-mo plan. Pros: unlimited simultaneous connections; simple, user-friendly interface; 24/7 customer support with live chat and phone support. Cons: IPVanish identified during DNS leak tests; US jurisdiction; buggy features with platform limitations. Price: $12 a month or $54 for the first year (then $90 annually). Latest tests: no leaks detected, 26% speed loss in 2023 tests. Network: 2,000-plus servers in 75-plus locations across 52 countries. Jurisdiction: United States. A big win for IPVanish is its fun, configurable interface, which makes it an ideal client for those who are interested in learning how to understand what a VPN does under the hood.
If you're looking for the ability to do some precision tuning to your VPN connection, IPVanish is a solid bet. With a bevy of switches controlling things like the kill switch, split tunneling, VPN protocol and LAN connection allowance, IPVanish is an app for the methodical tech tweaker who enjoys having exact control over their mobile internet traffic.
IPVanish has long been geared toward peer-to-peer traffic and is a solid choice for torrenters who are looking for a VPN that comes with a SOCKS5 proxy.
With its newly redesigned apps for Windows and Android , IPVanish manages to pack the same extensive suite of digital knobs and dials into a refreshingly clean mobile interface to impressive effect.
Its multiplatform flexibility is also ideal for people focused on finding a Netflix-friendly VPN.
The 26% speed loss we measured through IPVanish in our most recent speed tests is a significant improvement over the 58% speed loss we measured in 2022. However, we noticed that IPVanish's Quick Connect feature still doesn't always connect you to the best available server, so if you want to optimize your speeds, you may need to connect manually to a server showing a lighter load by selecting the Locations option in the app.
Charging $11 for its monthly plan , IPVanish is trying to move you toward its yearly program, which costs $48 for the first year, but then jumps to $90 for subsequent years of service. The provider offers a 30-day money-back guarantee, but only if you purchase the yearly plan -- which could be a disappointment to anyone who purchased a monthly subscription and decided they didn't like it.
That said, the company gets kudos for allowing unlimited simultaneous connections. We also liked its connection kill-switch feature, a must for anyone serious about protecting their privacy while surfing.
What is a VPN? A VPN, or virtual private network, is an online service that provides a mobile app, desktop app or other software that encrypts your internet traffic to help boost your privacy online. A VPN also prevents your internet service provider from tracking which websites or apps you're using and stops most of those websites and apps from seeing your actual geographic location, allowing you to bypass content blocks in some countries to access critical news and educational information, while also opening up your streaming entertainment options. The best VPN delivers a strong level of privacy protection without compromising on performance. We strongly recommend using a good VPN for everyday use as well as for work, particularly if your work involves handling sensitive information.
At CNET, we rigorously test each virtual private network across major platforms to find the ones that provide exceptional privacy, reliability, speed and value. This list is constantly being updated as we actively test VPNs and look at the latest research, so expect this guide to change throughout the year as we put each VPN through its paces. This year, we've retested our top picks for connection speed and retested ExpressVPN from the ground up to make sure it still deserved our Editors' Choice as best overall VPN. We also recently published a full review of Private Internet Access , and we're taking a fresh look at the best VPNs for iPhone this month.
What is the best VPN in 2023? ExpressVPN retained the CNET Editors' Choice Award for best overall VPN after its 2023 review. It maintains its position among other virtual private network services thanks to its dedication to privacy and strong speeds.
Surfshark is a close second among our picks. In 2022, it also earned a CNET Editors' Choice Award as our VPN value pick, thanks to its low first-year price and support for unlimited devices.
NordVPN , our third choice, is a die-hard heavy hitter. It costs more than Surfshark but less than Express, has an enormous network that's constantly getting faster and more secure, and is easily the most reliable service we've tested.
Each VPN service in the list below has excellent value for a specific use case, and we point out the ideal user for each one. The array of options available means there's a VPN service suited to your needs, whether your privacy needs are casual or critical.
Also, consider jumping on one of these VPN deals , which many of our top picks are offering.
":"8l47o0iocakehal","type":"heading"}"="" data-id="toc-dab293a4-bef1-4902-86b6-3a4565bc524d-item-3"> Other VPNs we've tested Not every VPN can be a favorite. These are ones we reviewed, but they're not full-throated recommendations for one reason or another, including limited features and concerns over adequately hiding your identity.
":"4k1zuouu6gcdtbi","type":"heading"}"=""> Hotspot Shield Hotspot Shield VPN's TLS-based Hydra Catapult protocol, US jurisdiction, 128-bit AES encryption support and large percentage of virtual servers might strip away our trust in its ability to provide more privacy protections than its competitors -- but those are all key components to its ability to achieve the blazing speeds it delivered during its most recent speed tests.
It's the second-fastest VPN I've tested , effortlessly delivers smooth-streaming media and can dance between server connections without missing a beat, no matter how many interruptions you throw at it. A 26% speed loss puts it in second place, falling behind Surfshark -- which lost just 16.9% of its speed the last time I tested it -- and knocking ExpressVPN down to third place with a 51.8% speed loss at last measurement. Speed losses on UK connections were under 8%. Gaming, torrenting, browsing, streaming -- these speed-dependent services won't be slowed down for Hotspot Shield users.
We're not excited about Hotspot's privacy and security, though. Since the service uses a closed-source proprietary Catapult Hydra protocol, instead of the more transparent open-source OpenVPN protocol, we'd like to see Hotspot give the public more third-party audits -- a necessary step to bring Hotspot up to speed with routinely audited VPNs like TunnelBear.
As recently as April 2021, review site VPNMentor discovered a DNS leak in Hotspot Shield's plug-in for Google Chrome. Hotspot acknowledged the issue at the time and aimed to improve the product.
We're also not thrilled about the amount of user data Hotspot collects, and its privacy policy. With its premium product, it gathers and retains much more information about users than most other VPNs. And if you're using the free version of its product, it shares that information -- along with even more finite data, including your MAC address and specific phone identifier -- with advertising companies.
While its interface is user-friendly and its speeds are thrilling, spending time with Hotspot is going to leave your wallet a little lighter than you might prefer. Its current price is higher than its nearest competitors, its speeds slightly slower and its privacy more questionable. If you're looking for a VPN purely on the grounds of speed, we still recommend passing on Hotspot until it improves.
Read more: Hotspot Shield VPN Review: This Speedster Costs More Than Faster, More Private Competitors. Quick take -- Servers: 1,800-plus in 80-plus locations. Country/Jurisdiction: US (Five Eyes member). Platforms: Windows, Android, MacOS, iOS, Linux, Amazon Fire TV. Price: $8 per month or $95.88 billed annually; month-to-month plan at $13. TunnelBear TunnelBear has gotten a lot of hype in the last couple of years. But when we looked under its hood and compared it with its VPN competitors, our excitement waned.
TunnelBear's speeds are reasonable. We lost nearly 63% of internet speed overall when we used it, which is about average for a VPN. TunnelBear's speeds have steadily improved over the years as measured by other review and testing sites, though, and the US scores we recorded saw a speed loss of only 54%.
On the plus side, TunnelBear is holding its own in the transparency competition among VPNs by publishing the results of its independent security audits and annual transparency reports.
No IP address, DNS or other potentially user-identifying data leaks were detected during our testing, but in the past TunnelBear was observed to have been leaking WebRTC information. TunnelBear's VPN encryption is standard AES-256 and it supports Perfect Forward Secrecy.
However, it's also a Canadian business owned by US-based McAfee, so if you're looking for subpoena-proof international online privacy, you're playing with fire. It holds a paltry 23 server locations from which you can't manually choose your VPN server or even a city. It doesn't offer Tor-over-VPN, it offers split tunneling only on Android and it can't even unblock Netflix.
On a per-month breakdown, the least expensive TunnelBear plan is its $120, three-year plan. You can also go month to month for $10, or pay $60 up front for a single year. Either way, TunnelBear accepts payment via credit card and bitcoin. Unlike other VPNs, it doesn't take PayPal. Also unlike other VPNs, it doesn't support Amazon Fire Stick or Android TV.
Read more: TunnelBear VPN Review: The Overpriced Ursine Has Trouble Living Up to the Hype. Quick take -- Average speed loss: 63%. Number of countries: 48-plus. Jurisdiction: Canada, with US parent company. Price: $3.33 per month, or $120, for a 3-year plan. CyberGhost VPN In CNET's previous coverage of virtual private networks, we've praised CyberGhost for its roster of competitive features. Our in-depth review of CyberGhost in 2019 included speed testing, security verification and an analysis of its full suite of privacy tools. Since then, the VPN company has increased its number of servers and is prepared to roll out new privacy tools, all while remaining one of the cheapest VPNs we've reviewed -- at $2.03 per month for a two-year plan.
As we've bolstered our approach to VPN reviews, however, CyberGhost has raised some red flags. Its parent company's history warrants skepticism; our previous tests have shown it to expose your VPN use to your ISP; its website and app trackers are more numerous than warranted; and its ad blocker uses an untrustworthy method of traffic manipulation no VPN should even think about. Its low price previously made it worth considering if you needed to change the appearance of your location online, but not if you wanted best-in-class security.
While CyberGhost's connection speed and security features appear to be improving, we don't currently recommend using the VPN service provider if you're in a country where VPNs are illegal.
We also recommend that anyone in the US review CyberGhost's parent company before deciding whether to pay for a subscription.
On the plus side, however, CyberGhost is still faster than Norton Secure VPN and was less taxing on the processing power of our devices. It also offers split tunneling in its Windows client and has its servers neatly organized into categories: NoSpy servers, servers geared for torrenting, servers best for streaming and servers best for use with a static IP address. CyberGhost imposes no data caps, allows unlimited server switching and offers a 45-day money back guarantee on subscription plans of a year or more.
Read more: CyberGhost VPN review: Competitive Features, but Its Parent Company Concerns Me
Quick Take: Number of servers: over 8,000 worldwide in 91 countries. Number of server locations: 111. Jurisdiction: Romania, with UK parent company. Number of simultaneous connections: 7. Price: $2.03 a month, or $60 for a two-year plan (plus four free months); month-to-month plan at $13.
Norton Secure VPN
NortonLifeLock, long known for excellence in security products, has a relatively limited offering in its VPN product.
Norton Secure VPN does not support P2P or BitTorrent, Linux, routers or set-top boxes. Its Netflix and streaming compatibility is somewhat limited. Even worse, during testing, we experienced privacy-compromising data leaks.
During CNET's testing, Norton Secure VPN speeds were comparable to other midtier VPNs but not particularly competitive. Although its VPN is only available on four platforms -- Mac, iOS, Windows and Android -- Norton gets points for its 24/7 live customer service phone support and 60-day money back guarantee.
Norton Secure VPN's pricing structure is a bit different than what you typically find in the industry. Pricing is tiered based on how many simultaneous connections you want with your account. For a single device, you'll pay $30 for the first year and $50 for any subsequent years, or $4.99 a month for the monthly plan. For five simultaneous connections, the price jumps to $40 for the first year and $80 for subsequent years, or $8 a month for the monthly plan. If you want up to 10 simultaneous connections, the price is $60 for the first year and $100 for subsequent years, or $10 a month for the monthly plan.
Read more: Norton Secure VPN Review: Why We Don't Recommend It
Quick Take: Number of countries: 30. Number of servers: 1,500 (1,200 virtual). Number of server locations: 200 in 73 cities. Country/jurisdiction: US. Price: $40 for the first 12 months for five devices.
Mullvad
Mullvad is an independent and open source VPN provider that is focused on building trust through transparency and its commitment to protecting the privacy and security of its users. Although there are other VPNs that are considerably more well-known in the industry, Mullvad's offering overall is just as polished and easy to use as many of the bigger players in the market.
Mullvad's primary focus is on security. Like most other top VPN providers , Mullvad employs industry-standard AES 256-bit encryption to secure users' connections. Mullvad's kill switch feature and DNS leak protection are enabled by default and cannot be disabled. During our testing , the kill switch worked as expected and we detected no leaks of any kind. The company says it doesn't keep any logs of its users' activity, and is, for the most part, pretty transparent about how it operates and what it does to protect user privacy. Mullvad is unique in that it doesn't require any personal information at signup. While most VPN providers ask users to provide an email address and enter a username, Mullvad generates a random 16-digit account number to activate each new user account. You don't even need to provide any payment information since Mullvad accepts cash sent via mail.
Mullvad's source code being entirely open source is a testament to the company's transparency, but we'd still like to see Mullvad issue an annual transparency report to give the public a view of how many legal requests the company gets and where they're coming from. Though Mullvad tells us a new security audit is forthcoming, the company's 2020 security audit (conducted by German cybersecurity firm Cure53) concluded at the time that the VPN "does a great job protecting the end user from common PII leaks and privacy-related risks." With servers in 68 locations across 38 countries, Mullvad's VPN server network is comparatively small. Even so, the network covers the most in-demand locations and is pretty well spread out across the globe. And what its network may lack in size, it makes up for in speed. In our latest round of speed testing, we measured just a 23% drop in average speeds (most VPNs will slow you down 50% or more), easily making it one of the fastest VPNs we've tested. Though Mullvad's speeds are fantastic, it's not the best for geographically restricted content. We were able to access Netflix without any issues, but were denied access to Disney Plus when connected to Mullvad's US servers.
However, Mullvad's straightforward approach to pricing is a breath of fresh air, especially with so many other VPN providers concocting ever-more convoluted pricing structures. Mullvad costs about $5 a month, whether you want to use it for a month, a year or a decade -- and you're never locked into a long-term subscription plan. If you're not satisfied with the service, you can get a refund within 30 days of purchase.
Read more: Mullvad Review: Solid Security and Privacy, but Swedish Jurisdiction Is Concerning
Quick Take: Number of servers: 840. Server locations: 68 in 38 countries. Number of simultaneous connections: 5. Jurisdiction: Sweden. Price: $5 a month.
Other VPNs our experts are reviewing
Below you'll find some additional VPNs. We're in the process of re-evaluating them in the coming months.
PureVPN
PureVPN says it doesn't log connection information. The company joined the "no log" movement in 2018, and underwent a third-party audit by Althius IT (albeit one commissioned and paid for by PureVPN).
We like that PureVPN offers a 31-day refund policy and supports bitcoin payments. We also like that PureVPN has both Kodi and Chromebook apps available. In addition, PureVPN was the first VPN service we noted to fully implement GDPR compliance.
Quick Take: Number of servers: 6,500-plus. Number of countries: 78-plus. Country/jurisdiction: Hong Kong. Price: $3.24 a month for a one-year plan, or $1.99 a month for a two-year plan (plus three free months).
StrongVPN
StrongVPN blasts onto our list with excellent infrastructure and a decent price. StrongVPN has a strong no-logging policy, and picks up kudos for its large base of IP addresses. It has a solid collection of servers and worldwide locations. For those of you who need a dedicated IP, you can get one from the company but you'll need to contact customer support to get help setting it up.
One of StrongVPN's strengths is the company's network. It owns and operates its entire network infrastructure, which means it has no externally dictated limits on bandwidth or the type of internet traffic allowed on the network.
StrongVPN's regular monthly price of $10.99 is in the middle of the pack, but its regular yearly price of $80 is among the lowest of our contenders.
Our hands-on testing and review process is designed to cut through that hype. When we look at each VPN service, we're not just examining them for their technical weaknesses, but we're also scrutinizing their individual performance strengths. We want to know what each service does best. We test each VPN across over 20 factors, and we're continually improving our methodology as we learn more.
We test VPNs for browsing and streaming speed in multiple countries as well as their connection stability and even the smallest potential privacy leaks. By testing across multiple devices and platforms, we're able to assess which VPNs are best for gaming versus those best for streaming, torrenting or sharing sensitive information. Most importantly, we focus on doing the deep-dive research necessary to vet each VPN's historical credibility and its ownership in a notoriously murky market.
The VPNs on this list earn our recommendation for more than just boosting their digital privacy strengths -- they enable easy streaming to overcome geoblocked media, have torrenting-friendly servers, and are fast enough to support gaming globally. Based on those continued evaluations, you'll see a few bullet points on each entry in our list, highlighting each VPN's strengths and the uses we recommend it for most. And because we strive to keep on top of a fast-changing market, you'll notice that the rank of each VPN service changes as we learn more and retest.
This table shows the speeds we experienced in our testing. Your speeds will vary depending on factors like your internet service plan and connection type. The percentage of speed lost is intended as a general indicator of how much the VPN slows down your connection -- lower numbers represent a faster overall connection.
Picking a VPN requires knowing two basic things to start with: What you want to use it for, and what you're willing to pay. The range of VPN offerings is vast, but those two things will help you find a VPN that has the right blend of speed, security and cost.
Below, you'll find specific FAQ sections on picking a VPN based on the most common needs: gaming, streaming media, working from home and privacy-critical professions. But in general, you'll want a VPN that provides sufficient encryption, doesn't log your activity, offers essential security features like DNS leak protection and a kill switch, has server locations where you need them and can give you fast connection speeds. Our top five VPNs have all these features, although connection speeds will vary based on your internet provider and the server you connect to.
For a deeper dive, check our detailed walk-through of how we evaluate and review VPNs.
If you're looking for some quick pointers, here are universally applicable advice guides for beginners:
Don't use free VPN services: With the exception of Proton, you'll find only paid VPN options on this list above because they're the only ones we can recommend.
Look for a no-logs VPN, but understand the caveats: The best VPNs keep as few logs as possible and make them as anonymous as possible, so there's little data to provide should authorities come knocking. But even "no-logs" VPNs aren't 100% anonymous.
There are limits to the privacy VPNs currently provide to iOS users: Recent independent research suggests iPhones and iPads running iOS 14 or later may be vulnerable to device-only VPN leaks, regardless of which VPN is used. Apple users concerned with potential leaks can take extra precautions by installing their VPN on a home router to ensure their entire Wi-Fi network is encrypted. Some iOS users may potentially reduce the likelihood of leaks while outside of a home network by enabling their VPN's kill switch and selecting OpenVPN protocols. You can also try closing all apps, activating your VPN, and then enabling and disabling Airplane Mode before using your device normally. Apple advises users to activate their device's Always On VPN profile for additional protection.
VPN transparency is important, but warrant canaries are only the beginning: Many services use "warrant canaries" as a way to passively note to the public whether or not they've been subpoenaed by a government entity, as many investigations from national security agencies can't be actively disclosed by law. But -- like the no-logging issue -- warrant canaries aren't always as straightforward as they seem. You should spend more time investigating whether your prospective VPN has cooperated with authorities in the past, and how and when it's disclosed that fact.
Think twice about using a US-based VPN: The Patriot Act is still the law of the land in the US, and that means US-based VPNs have little recourse if and when the feds show up with subpoenas or national security letters in hand demanding access to servers, VPN user accounts or other data. Yes, they may have little data to access if the service has a strong no-logs policy, but why not just choose a service that's based outside Uncle Sam's jurisdiction? (If this is a concern for you, you'll want to avoid countries that the US has intelligence-sharing agreements with, too.)
Quick Take (StrongVPN): Number of servers: 950-plus. Number of server locations: 59 in 30 countries. Price: $3.66 a month (67% discount) for a one-year plan.
Read more: StrongVPN in-depth review and hands-on testing (ZDNet)
VPN FAQs
In today's hyper-connected world, online privacy and security are increasingly critical. From online banking to communicating remotely with colleagues, we're transferring more data on our computers and smartphones than ever before. Much of that data is confidential information that we need to keep safe from hackers and snoops, so VPN use is on the rise as people take steps to secure their digital lives.
Do I need a VPN? Anyone who accesses the internet from a computer, tablet or smartphone can benefit from using a VPN. You don't have to be an activist, government dissident or journalist to need a VPN; the rise of third-party data brokers, cross-site advertising trackers, IP address collection and mobile geo-targeting have all combined to create an online browsing environment that poses significant threats to everyday users' basic privacy. Because a VPN encrypts your connection, your browsing data is protected from your internet service provider (and any government entities who request your ISP data), and your network administrator in most cases. A VPN can also shield your private information -- like passwords, usernames and bank or shopping details -- from anyone snooping on your network.
What is the best free VPN? Proton VPN's free tier is the only free VPN we've come across so far that's worth using. It costs a lot of money to operate a VPN, and free VPN services usually make up for the lack of subscription revenue by selling user data. And in addition to being limited in usability and light on security, many free VPNs are fronts for malware distribution, which is why it's generally best to avoid them. However, Proton VPN's unlimited free tier is fast, secure and can be used for most online activities, including streaming Netflix. But if you're on a budget and want access to a premium VPN solution, you can also take a look at our picks for the best cheap VPNs.
What is a mobile VPN? A mobile VPN is simply a VPN you can use on your mobile device like your iPhone or Android phone. All of the providers we recommend have mobile versions of their desktop clients. You can use a mobile-focused VPN app to ensure greater data privacy designed for your whole device. Mobile VPNs also generally have a smaller memory footprint, and require less processing power than desktop VPNs, so they tend to yield faster connection speeds and don't eat up your battery as quickly. Keep in mind, however, that most mobile VPN clients will use a lighter form of encryption than a desktop client to achieve those smartphone speeds. So be sure to check your VPN apps' settings to ensure you're using the apps' strongest encryption if your privacy needs are heightened. Our top three VPN picks all have excellent, easy-to-use mobile VPN app options for their services. Some VPNs will only work with one type of mobile platform -- like iOS or Android -- and some are universally compatible. To find the right mobile VPN for you, check out our mobile-specific VPN guides below. We routinely update them with our retesting information, so check back often.
Read more: How to Set Up a VPN on Your Smartphone; Best Android VPNs for 2023; Best iPhone VPNs of 2023
Are VPNs legal? VPNs are perfectly legal to use in most countries. There's nothing wrong with taking steps to protect your privacy online, and you shouldn't have to worry that using a VPN as part of that process will get you in any kind of legal trouble.
However, there are countries where VPNs are either banned or outright illegal. If you're using a VPN in a country like China, Iran, Oman, Russia, Turkmenistan, UAE or Belarus, you may find yourself in legal trouble. The irony here is that these are the countries where internet censorship and surveillance are most common. In those countries, you'll need to make sure you use a VPN that provides strong obfuscation so your VPN traffic is disguised as ordinary HTTPS traffic, meaning government entities won't even know you're using a VPN in the first place.
But you won't run into any trouble with the law for using a VPN across most of the world. One important reminder, though: VPNs are legal in most places, but engaging in illegal activity online is still illegal regardless of whether you're using a VPN.
How do I use a VPN for Netflix? If you live in a country that censors its media or are traveling to one, georestricted content is a pain. You can use a VPN to circumvent censorship or access your home country's normal media content for an online streaming service like Netflix, Hulu, Amazon Prime Video or Disney Plus. Pick a VPN that lets you manually select which country you want to connect through and has something called obfuscation.
(Our top three picks offer this.) You don't always need to use the obfuscation feature to unblock Netflix, but since streaming services actively try to block VPN connections, obfuscation can help because it disguises your VPN traffic as regular internet traffic.
If you're looking to try out other VPNs, choose one with a large number of IP addresses, preferably 10,000 or more. This is because one of the ways Netflix and others block VPNs is by blacklisting known VPN IPs -- and if your VPN has tens of thousands of IPs, there's a better chance that you'll be able to connect to an IP address that Netflix hasn't flagged.
Once you have your VPN installed, connect to the country whose content you wish to view, restart your browser and go to the streaming site. If your VPN is working, the site should treat you as a resident of your selected country and serve you content assigned to that audience. If you're still having trouble, you can try using incognito mode on your browser or try clearing your cookies and cache.
How do I know if my VPN is working? Your first and most apparent indication that your VPN is working is that your IP address will change and your location will be registered as that of the VPN server you're connecting through. You can check this on a site like whatismyipaddress.com.
You'll also want to make sure your VPN is protecting your privacy and not leaking any of your data outside of the VPN tunnel, thus exposing it to your ISP and other entities that may be monitoring your online activity. You can check for leaks by going to a site like dnsleaktest.com or ipleak.net. If your location is being registered as the VPN server location, and your leak tests turn up negative, then you know your VPN is working to protect your privacy.
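The IP comparison above is easy to script. As a rough sketch (the ipify endpoint and the helper names here are illustrative assumptions, not part of any VPN's official tooling), you can record your public IP before connecting and compare it to a reading taken afterward:

```python
import urllib.request


def fetch_public_ip(url="https://api.ipify.org"):
    """Fetch the current public IP as plain text (illustrative endpoint)."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8").strip()


def vpn_appears_active(baseline_ip, current_ip):
    """A VPN is likely working if the public IP changed after connecting.

    This only checks the IP swap; it says nothing about DNS or WebRTC
    leaks, which still require a dedicated leak test.
    """
    return bool(baseline_ip) and bool(current_ip) and baseline_ip != current_ip


# Typical flow:
#   before = fetch_public_ip()   # note your IP with the VPN off
#   ...connect your VPN...
#   after = fetch_public_ip()    # read it again through the tunnel
#   vpn_appears_active(before, after)  # True if the IP changed
```

If the two readings match after connecting, the tunnel usually isn't engaged, or a split-tunneling rule is excluding the app you tested with.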
What is a remote-access VPN? A remote-access VPN uses public infrastructure like the internet to provide remote users secure access to their network. This is particularly important for organizations and their corporate networks. It's crucial when employees connect to a public hotspot and use the internet for sending work-related emails. A VPN client on the user's computer or mobile device connects to a VPN gateway on the company's network. This gateway will typically require the device to authenticate its identity. It will then create a network link back to the device that allows it to reach internal network resources such as file servers, printers and intranets, as if it were on the same local network.
What is a site-to-site VPN? This is when the VPN technology uses a gateway device to connect the entire network in one location to a network in another location. The majority of site-to-site VPNs that connect over the internet use IPsec. IPsec-based encryption protocols are often considered by VPN specialists to be less secure against modern surveillance. Rather than using the public internet, it is also normal to use multiprotocol label switching clouds as the main transport for site-to-site VPNs.
VPNs are often defined between specific computers, and in most cases, they are servers in separate data centers. However, new hybrid-access scenarios have moved the VPN gateway into the cloud, typically with a secure link from the cloud service provider into the internal network.
What's the best VPN? The best VPN for you depends on your needs when using a VPN.
VPNs for crucial privacy and security If you're a journalist, a lawyer or a professional in any other privacy-sensitive field, forget about speed and price when choosing a VPN. Focus, instead, entirely on security. Your VPN may be somewhat slower but, for both VPNs and presidential motorcades, speed is always the trade-off for privacy. Avoid free VPNs and browser-based VPNs.
If you're concerned with government monitoring in your current country, choose a VPN headquartered outside of the country you're currently in, and avoid choosing a VPN with a jurisdiction in an allied country. For example, US journalists should avoid VPNs with a jurisdiction in the US or other Five Eyes countries.
Keep an eye on encryption: Your VPN should offer a protocol called OpenVPN TCP (for its mobile apps, IKEv2 is fine). Right now, the VPN we recommend most for critical privacy is ExpressVPN.
VPNs for working from home If you're working from home, you may be sharing your internet connection with multiple devices and family members or roommates. That's a lot of simultaneous connections to a VPN and a lot of drag on a network. Pick a VPN that lets you use one subscription on as many devices as possible and has excellent speeds so your Wi-Fi isn't bogged down. If your job involves handling sensitive information like financial or medical records, however, your priority VPN criteria is security. Our top three VPN picks are the most secure we've found, and each has a different number of connections they'll allow for a base-level subscription. Depending on your budget and home office requirements, ExpressVPN, Surfshark and NordVPN are all great options for working from home. There are a few other factors worth considering for a home office VPN, though, so check out our guide to picking the right VPN for working at home.
VPNs for gaming Most VPNs are chosen based on having a good balance of speed, security and cost. But if you want a VPN specifically to connect to game servers in another country, speed is everything. Free VPNs won't be fast enough, but, fortunately, high-end security won't be a cost driver, which gives you more options at modest prices.
Since all VPNs reduce speed -- many by half or more -- that means picking one from the set that performed best in our speed tests.
In our latest tests, NordVPN took the lead as the fastest VPN, though you can get excellent speeds through Surfshark via the WireGuard protocol as well as with ExpressVPN. If you're focused on VPNs for game consoles, have a look at our best VPNs for Xbox and our primer on installing them.
Before choosing the one right for your needs, visit the VPN's official website to see whether they offer servers specifically aimed at gaming in the countries where you most want to connect to other players.
"
|
15,704 | 2,020 |
"We need to democratize data literacy | VentureBeat"
|
"https://venturebeat.com/2020/08/22/we-need-to-democratize-data-literacy"
|
"Guest: We need to democratize data literacy
We’ve all heard the maxim that data is king. Since the early 2000s, the power of data has ballooned unchecked as our economies hurtled towards digitization. Brands and corporations that recognized the opportunity have amassed untold wealth, while legislators are still scrambling to retrofit rules to govern its use.
Yet although the tussles between governments and Big Tech dominate the headlines, we’re overlooking a wider societal shift in which data is playing the starring role. As some market players deepen their understanding and increase their power, those who don’t have a handle on their own data fall further behind. As “traditional” businesses disintegrate and digital tightens its grip, we’re at risk of creating a new hierarchy of power: where the data literates reign over the data illiterates.
Being data illiterate doesn’t mean you don’t have access to data. Few companies these days operate in a data free zone (the mass panic over GDPR is testament to that). Rather, data illiteracy results from a lack of the skills, time or resources needed to properly understand and utilize insights. As data illiterates fall further behind, their economic potential diminishes. For those desperate to catch up, many end up outsourcing their data needs — thereby funneling more power to the already powerful and pushing comprehension of their own data further out of reach.
This cycle is picking up speed. The gap between those capitalizing on data and those unable to is widening. This divide transcends individuals, companies, and geographies, leaving some at permanent risk of marginalization. And, as is often the way, it's public sector organizations that are among those most at risk.
Take Britain’s National Health Service (NHS). Responsible for the health of nearly every one of the UK’s 66 million citizens, the organisation has access to unparalleled amounts of data. But they’re not great at using it. This is causing a litany of problems and consequently boosting prospects for corporate giants.
Firstly, underutilization of data prevents the health service from being able to identify problems and innovate effectively to solve them. Silos remain, staffing issues fail to be addressed, patients have disjointed experiences. With each unaddressed issue, the whole institution lags further and further behind the curve.
To cure systemic ills, NHS managers seek outside help. And this can often work out well. Data literates can come in, find and process the data in a way the NHS can’t, and fix things. Companies like IBM, Microsoft , and AWS see healthcare institutions as a massive growth opportunity for this reason. But when public institutions outsource data literacy, not all the third parties getting involved are in it for the right reasons. In many cases, it puts power into the hands of those who understand how to leverage it but have no interest in empowering their clients alongside them. Instead of the involvement of data literates creating an opportunity for the data illiterates to better understand their own insights, reliance on non-collaborative outsourcing grows and the motivation to achieve a level playing field diminishes.
This is also a piecemeal approach. Without an internally-driven push to digitize, customer experience — in this case that of patients and staff — can vary wildly. Despite the NHS offering incredible care, free at the point of use, it can be a frustrating, anachronistic system to navigate. As users lose patience, user-friendly, tech-first services like Babylon, Livi, or Doctorly come along and fill the gap. Data literates offering faster, more accessible services are so far ahead of the game — iterating, investing, expanding — that public sector institutions can’t compete.
We’re currently on the cusp of a new phase of data supremacy and it could go one of two ways: Either data literate challengers will disrupt and overwhelm traditional offerings, or they will become partners to help organizations and individuals better manage, understand, and leverage their own data. If everyone is to benefit from the era of big data, we must push to make the latter a reality.
Driving up data fluency requires investment and education. This upskilling needs to encompass an understanding of what data is, how it’s collected, and how the insights it provides can be leveraged. Embedding this at all levels — from the classroom all the way through to organizations’ training strategies — should become a priority for all businesses and policy makers. Perhaps bank startup loans could be conditional on the completion of a course in data analytics. Students could be offered a short-course Data Literacy certificate. The government could offer free data training to those receiving job-seeker benefits.
To translate this education into a level playing field, we then need to champion tech companies who want to partner with public sector institutions, traditional companies, and individuals to help them take control of their own data. We should fund initiatives that bring them together, enabling expertise to be pooled and showcase how, with a little help, data illiterate organizations can develop their own fluency; combining in-house control and external expertise to create the most favorable outcome for citizens.
Because this isn’t just a public sector issue. Scores of small, independent businesses are being squeezed by competitors with superior data capabilities. Retail behemoths like ASOS and BooHoo built their strategies around data. They use it to find, track, understand, and upsell their market. Every click, eyeball, and abandoned basket is scrutinized, the learnings fed back into a well-oiled digital machine. Independent retailers, with far fewer resources, are too busy keeping their heads above water to get a handle on the myriad of data points they should be leveraging.
The old argument goes that of course these forward-thinking companies are outstripping laggard rivals. That’s capitalism, right? But if we accept this line of reasoning, we risk sleep-walking towards a monopolistic economy — one in which Amazon’s stranglehold sucks oxygen from local businesses the world over, where BooHoo strips the market down to a skeleton, or where Google controls everything from how we learn to what we cook. We need to arm smaller, slower, or less well financed organizations with the tools they need to stay competitive.
We cannot write off data illiteracy as a choice. It’s an economics issue. The poorer farmer doesn’t grow less delicious tomatoes, but she’ll grow fewer of them than her neighbor who had the capital to invest in data tools and made micro-adjustments to her farming practices as a result, improving her yield. From healthcare to agriculture, we need to lay stronger foundations when it comes to data literacy if we are to achieve parity of opportunity. Not all companies will use data as smartly, or benefit from it as much, but they should have access to the same tools. Likewise, the data literates should find new ways to partner with less literate organizations — ways that don’t require the less literate organization to cede all control, and where mutual gain rather than dominance is the objective.
Improving data literacy across society will create stronger, more empowered communities that can compete in our digitized world. Unless we invest in making this happen, we risk handing over the keys to the kingdom to a very small number of very big companies.
Dr. Anas Nader is a doctor in the UK’s National Health Service and co-founder of healthtech platform Patchwork Health.
VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact.
Discover our Briefings.
The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat.
All rights reserved.
"
|
15,705 | 2,015 |
"Zendesk acquires cloud business intelligence startup BIME Analytics for $45M | VentureBeat"
|
"https://venturebeat.com/2015/10/13/zendesk-acquires-cloud-business-intelligence-startup-bime-analytics-for-45m"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Zendesk acquires cloud business intelligence startup BIME Analytics for $45M Share on Facebook Share on X Share on LinkedIn Rachel Delacour, chief executive and cofounder of BIME Analytics Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Publicly traded customer service software company Zendesk today announced that it has acquired We Are Cloud, the parent company of BIME Analytics , a startup with a cloud-based business intelligence (BI) tool.
Zendesk paid $45 million in the deal, according to a press release.
“BIME Analytics will become the core technology powering Zendesk’s customer data platform, enabling Zendesk to further integrate data analytics capabilities across its products,” Zendesk said in the release.
The cloud BI space has been heating up quite a bit lately. Amazon Web Services announced its QuickSight tool last week. Salesforce came out with the Analytics Cloud tool last year. The latter is probably more important here, because Zendesk competes with Salesforce in the service desk software market. Now Zendesk will have BI capability, just like Salesforce.
Meanwhile, other privately held cloud BI providers, like Birst and Domo, have taken on funding rounds in the past several months.
Zendesk will provide more information on the deal on its November 3 earnings call, according to the release.
BIME Analytics started in 2009 and was based in Montpellier, France. Alven Capital invested $4 million in the startup in 2013. BIME Analytics customers included Cars.com, H&R Block, Lenovo, McAfee, Pizza Hut, Shell, and Sodexo.
"
|
15,706 | 2,019 |
"Vectra: Ransomware attacks are spreading to cloud, datacenter, and enterprise infrastructure | VentureBeat"
|
"https://venturebeat.com/2019/08/07/vectra-ransomware-attacks-are-spreading-to-cloud-data-center-and-enterprise-infrastructure"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Vectra: Ransomware attacks are spreading to cloud, datacenter, and enterprise infrastructure Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Ransomware exploded in 2017, when hackers were breaking into hospital systems and holding patient data ransom in exchange for cryptocurrency. But these types of attacks are now spreading to wider targets, such as cloud, datacenter, and enterprise infrastructure, according to a report by security firm Vectra.
The Vectra 2019 Spotlight Report on Ransomware finds that the most significant ransomware threat — in which hackers steal your data and hold it for ransom — is malicious encryption of shared network files in cloud service providers. San Jose, California-based Vectra released the report ahead of the Black Hat 2019 security conference in Las Vegas this week.
Cybercriminals are targeting organizations that are most likely to pay larger ransoms in order to regain access to files encrypted by ransomware. The costs of downtime due to operational paralysis, inability to recover backed-up data, and reputational damage are particularly catastrophic for organizations that store their data in the cloud.
Above: Some of the key targets of ransomware in Europe and the Middle East.
“The fallout from ransomware attacks against cloud service providers is far more devastating when the business systems of every cloud-hosted customer are encrypted,” said Chris Morales, head of security analytics at Vectra, in a statement. “Today’s targeted ransomware attacks are an efficient, premeditated criminal threat with a rapid close and no middleman.”

Ransomware makes for a fast and easy attack with a bigger payout than stealing and selling credit cards or personally identifiable information (PII), both of which have perishable values as time elapses after their theft. Factor in cryptocurrency as the ransom payment — an anonymous, hard-to-trace currency — and it’s easy to see why cybercriminals like ransomware’s clean, no-fuss business model.
The report disclosed that cybercriminals’ most effective weapon in a ransomware attack is the network itself, which enables the malicious encryption of shared files on network servers, especially files stored in infrastructure-as-a-service (IaaS) cloud providers.
Above: Ransomware attacks by industry.
“Fifty-three percent of organizations say they have a ‘problematic shortage’ of cybersecurity skills today, and the ramifications of it are very evident with fast-moving ransomware attacks,” said analyst Jon Oltsik of Enterprise Strategy Group in a statement. “The industry simply doesn’t have enough trained security folks scanning systems, threat hunting, or responding to incidents. This Spotlight Report offers important insights into the weaponization, the shift from opportunistic to targeted attacks, and the industries targeted by ransomware, that can help organizations be better prepared.”

Attackers today can easily evade network perimeter security and perform internal reconnaissance to locate and encrypt shared network files. By encrypting files that are accessed by many business applications across the network, attackers more quickly achieve an economy of scale that is far more damaging than encrypting files on individual devices.
Vectra said artificial intelligence can detect subtle indicators of ransomware behaviors and enable organizations to prevent widespread damage. When organizations recognize these malicious behaviors early in the attack lifecycle, they can limit the number of files encrypted by ransomware, stop the attack from propagating, and prevent a disastrous business outage.
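The early-warning idea described above can be illustrated with a toy behavioral detector: flag any client that writes an unusually large number of high-entropy (encrypted-looking) files to shared storage within a short window. This is a hypothetical sketch, not Vectra's Cognito logic; the event format and the thresholds (`window`, `min_files`, `min_entropy`) are assumptions chosen for illustration.

```python
import math
from collections import defaultdict

def shannon_entropy(data: bytes) -> float:
    """Bits per byte; random or encrypted data approaches 8.0."""
    if not data:
        return 0.0
    counts = defaultdict(int)
    for b in data:
        counts[b] += 1
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def flag_suspect_clients(events, window=60, min_files=20, min_entropy=7.0):
    """events: iterable of (timestamp, client, filename, payload_bytes).
    Flag clients that write >= min_files encrypted-looking files
    within any `window`-second span."""
    per_client = defaultdict(list)
    for ts, client, _name, payload in events:
        if shannon_entropy(payload) >= min_entropy:
            per_client[client].append(ts)
    flagged = set()
    for client, times in per_client.items():
        times.sort()
        for i in range(len(times)):
            # count encrypted-looking writes inside the sliding window
            j = i
            while j < len(times) and times[j] - times[i] <= window:
                j += 1
            if j - i >= min_files:
                flagged.add(client)
                break
    return flagged
```

A workstation writing dozens of near-random files in under a minute would trip this check long before an entire file share is encrypted, which is the "limit the damage early" outcome the report describes.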
The report is based on observations and data from the 2019 Black Hat Edition of the Attacker Behavior Industry Report, which reveals behaviors and trends in networks from a sample of over 350 opt-in Vectra customers. The Attacker Behavior Industry Report provides statistical data on the behaviors motivated attackers use to blend in with existing network traffic behaviors and mask their malicious actions.
Above: Network file encryption attacks are common.
From January to June, the Vectra Cognito threat-detection and response platform monitored enriched metadata collected from network traffic between more than 4 million workloads and devices in customer clouds, datacenters, and enterprise environments. The analysis of this metadata provides a better understanding of attacker behaviors and trends, as well as business risks.
The Ryuk ransomware strain, one of the more successful strains observed in the past year, sets the ransom according to the victim’s perceived ability to pay. First seen in August 2018, Ryuk has targeted more than 100 U.S. and international businesses, including cloud service providers like DataResolution.net.
The Cognito platform works by accelerating network detection and response, using AI to collect, enrich, and store network metadata with the right context to detect, hunt, and investigate hidden threats in real time. The company says its platform scales efficiently to the largest organizations’ networks, with a distributed architecture using a mix of cloud, virtual, and physical sensors that provide 360-degree visibility across cloud, datacenter, and user and IoT networks, leaving attackers with nowhere to hide.
"
|
15,707 | 2,020 |
"EvilQuest Mac ransomware impersonates Google, Apple OS processes | VentureBeat"
|
"https://venturebeat.com/2020/06/30/evilquest-mac-ransomware-impersonates-google-apple-os-processes"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages EvilQuest Mac ransomware impersonates Google, Apple OS processes Share on Facebook Share on X Share on LinkedIn Are you looking to showcase your brand in front of the brightest minds of the gaming industry? Consider getting a custom GamesBeat sponsorship.
Learn more.
Viruses are uncommon enough on Apple’s platforms that users generally don’t worry about them, but security researchers this week discovered a rarity — Mac ransomware that’s both spreading in the wild and potentially dangerous because of the way it hides on an infected machine. Disclosed by Dinesh Devadoss, Patrick Wardle, and Malwarebytes’ Thomas Reed, the EvilQuest ransomware appears to be spreading through pirated macOS apps, disguising its background processes as Apple’s CrashReporter or Google Software Update.
Downloaded alongside an app such as the packet sniffer Little Snitch or Mixed in Key 8 DJ software, EvilQuest masks itself first as an innocuous “patch” file within the Mac installer, then renames itself to blend in with system tasks that would be running thanks to macOS or Google’s Chrome browser.
If the ransomware works, it spreads around the computer’s hard drive, then locks infected files behind a demand for $50 within three days, and a threat that the files will remain encrypted.
However, there are questions as to how well EvilQuest actually functions on its own, and what the full extent of its capabilities are. A key logger has been discovered within the ransomware, but the encryption system is still somewhat unknown.
For the time being, it appears that the only way to infect a Mac with EvilQuest is to download certain pirated applications, which provides a simple mechanism to stop the ransomware from spreading: Don’t pirate software. Users who think they might be infected can use Malwarebytes’ Mac app to remove it, and the firm suggests keeping “at least two backup copies of all important data,” one detached from the Mac at all times to avoid attacks on connected drives.
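Because EvilQuest hides by borrowing the names of legitimate crash-reporting and update processes, one crude defensive check is to compare a process's name against the path its binary actually runs from. The sketch below is illustrative only: the expected-location table is an assumption for demonstration, not an authoritative allowlist of where these processes live on macOS.

```python
# Expected install prefixes for the process names EvilQuest imitates.
# NOTE: illustrative assumptions, not an authoritative macOS allowlist.
EXPECTED_PREFIXES = {
    "CrashReporter": ("/System/Library/",),
    "Google Software Update": ("/Library/Google/",),
}

def masquerading(processes):
    """processes: iterable of (name, executable_path) pairs.
    Return processes whose name matches a sensitive entry but whose
    binary lives outside every expected location for that name."""
    suspects = []
    for name, path in processes:
        prefixes = EXPECTED_PREFIXES.get(name)
        if prefixes and not any(path.startswith(p) for p in prefixes):
            suspects.append((name, path))
    return suspects
```

A "CrashReporter" running out of a temporary directory, for example, would be flagged, while the same name under a system path would pass. Real endpoint tools combine this with code-signing checks, which name-and-path matching alone cannot replace.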
Update on July 7 at 11:00 a.m. Pacific: The researchers have subsequently renamed EvilQuest to ThiefQuest, and now say further examination of ThiefQuest’s code suggests that it’s an exfiltration virus rather than ransomware. According to the researchers, ThiefQuest can transfer a Mac’s files over the internet, as well as logging keystrokes and opening a back door for remote control, but its ransom-related code does not appear to be fully functional. Previously identified tools are still believed to be effective at removing the virus, apparently leaving the Mac undamaged.
"
|
15,708 | 2,021 |
"Government and industry to combat ransomware with Bitcoin regulation | VentureBeat"
|
"https://venturebeat.com/2021/04/29/government-and-industry-to-combat-ransomware-with-bitcoin-regulation"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Government and industry to combat ransomware with Bitcoin regulation Share on Facebook Share on X Share on LinkedIn Confidential computing increases security for cryptocurrency Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
( Reuters ) — Government and industry officials confronting an epidemic of ransomware, where hackers freeze the computers of a target and demand a payoff, are zeroing in on cryptocurrency regulation as the key to combating the scourge, sources familiar with the work of a public-private task force said.
In a report on Thursday, the panel of experts is expected to call for far more aggressive tracking of bitcoin and other cryptocurrencies. While those have won greater acceptance among investors over the past year, they remain the lifeblood of ransomware operators and other criminals who face little risk of prosecution in much of the world.
Ransomware gangs collected almost $350 million last year, up threefold from 2019, two members of the task force wrote this week. Companies, government agencies, hospitals and school systems are among the victims of ransomware groups, some of which U.S. officials say have friendly relations with nation-states including North Korea and Russia.
“There’s a lot more that can be done to constrain the abuse of these pretty amazing technologies,” said Philip Reiner, chief executive of the Institute for Security and Technology, who led the Ransomware Task Force. He declined to comment on the report before its release.
Just a week ago, the U.S. Department of Justice established a government group on ransomware. Central bank regulators and financial crime investigators worldwide are also debating if and how cryptocurrencies should be regulated.
The new rules proposed by the public-private panel, some of which would need Congressional action, are mostly aimed at piercing the anonymity of cryptocurrency transactions, the sources said. If implemented, they could temper enthusiasm among those who see cryptocurrencies, which have surged past $1 trillion in total capitalization, as a refuge from national monetary policies and government oversight of individuals’ financial activities.
The task force included representatives from the FBI and the United States Secret Service as well as major tech and security companies. It will recommend steps such as extending “know-your-customer” regulations to currency exchanges; imposing tougher licensing requirements for those processing cryptocurrency; and extending money-laundering rules to facilities such as kiosks for converting currency.
It also calls for the creation of a special team of experts within the Justice Department to facilitate seizures of cryptocurrency, a process currently fraught with logistical and legal challenges.
Some of the ideas echo those proposed by the Financial Crimes Enforcement Network, which would expand disclosure rules for transactions worth more than $10,000.
Federal investigators said a proposal to register accounts would be especially helpful for identifying drug smugglers, human traffickers and terrorists as well as ransomware groups.
“That would be huge,” said a senior Homeland Security official, who spoke on condition of anonymity to discuss emerging policy proposals. “This is a world that was created exactly to be anonymous, but at some point, you have to give up something to make sure everyone’s safe.”

Governments are already using the blockchain ledger that documents all bitcoin transactions to bring some charges. Last week, authorities arrested a man in Los Angeles and accused him of laundering more than $300 million through a service that combines transactions from multiple cryptocurrency wallets to obscure who is paying whom.
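The reason the public ledger helps investigators is that payments form a directed graph: starting from a flagged address, every downstream recipient can be enumerated by walking outgoing transactions. A minimal sketch of that idea, using a hypothetical in-memory transaction list rather than real blockchain data:

```python
from collections import deque

def downstream_addresses(transactions, start):
    """transactions: iterable of (sender, receiver, amount) tuples.
    Return every address reachable from `start` by following payments
    (breadth-first walk over the payment graph)."""
    outgoing = {}
    for sender, receiver, _amount in transactions:
        outgoing.setdefault(sender, []).append(receiver)
    seen, queue = {start}, deque([start])
    while queue:
        addr = queue.popleft()
        for nxt in outgoing.get(addr, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen - {start}
```

Mixing services frustrate exactly this traversal by pooling many senders' funds into shared addresses, so every downstream hop becomes ambiguous — which is why the laundering case above targeted a transaction-combining service.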
Records from the U.S. Marshals Service show that more than $150 million in crypto assets were seized last year and offered to the public at auction. Last week, the Marshals Service signed a $4.5 million deal with BitGo, a California-based exchange, to hold and sell more forfeited cryptocurrency.
But many of the exchanges, which conduct the critical operation of turning cryptocurrency into dollars or other widely accepted currencies, are in countries outside the reach of U.S. regulators.
The Institute for Security and Technology’s Reiner said that international cooperation will be critical, and that pressure could be brought by allies with similar regulations, which could help push exchanges into countries where Americans will hesitate to send their funds.
“However much crypto markets think they have created their own networks, they still rely on existing financial markets,” Reiner said.
"
|
15,709 | 2,021 |
"Intel signals aggressive market share push in wake of improved Q1 | VentureBeat"
|
"https://venturebeat.com/2021/04/22/intel-signals-aggressive-market-share-push-in-wake-of-improved-q1"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Intel signals aggressive market share push in wake of improved Q1 Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Buoyed by strong continuing demand for PC semiconductors, Intel today announced non-GAAP revenue of $18.6 billion, flat year over year and $1 billion above its previous guidance. Overall, Intel reported net income of $5.7 billion, down 6% year over year.
In addition to increased demand for PC semiconductors, which remain in short supply, Intel reported that the decline in demand for semiconductors used in enterprise servers has bottomed out. Intel is forecasting increased server semiconductor sales in the second half of the year as the COVID-19 pandemic subsides to the point that IT organizations begin investing in datacenters again. Intel launched its 3rd Gen Intel Xeon Scalable processors, code-named Ice Lake, this quarter and is banking on them to drive server refreshes in the second half of 2021.
In addition, Intel is expecting to see increased demand from cloud service providers that are currently working through the massive amount of inventory they accumulated in 2020.
Regaining dominance over rivals The bulk of the revenue Intel is forecasting will be generated by 10-nanometer class processors in 2021. Intel is increasing its cadence for transitioning to 7-nanometer processors as part of an effort to regain processing power supremacy over rivals, Intel CEO Pat Gelsinger said. This was Gelsinger’s first call with industry analysts since returning to Intel after several years of leading VMware as its CTO and then CEO.
“It’s amazing to be back at Intel, and Intel is back,” Gelsinger said.
Intel also launched an Integrated Device Manufacture (IDM) 2.0 initiative this quarter to address the current processor shortage.
The company is opening foundries to partners that build substrates and other components it depends on to build processors. Additionally, Intel is codesigning processors with cloud service providers. It expects cloud service providers to begin increasing orders from next quarter.
In the meantime, strong demand for notebook PCs, in particular, has enabled Intel to weather the economic downturn brought on by the COVID-19 pandemic, as well as Apple’s decision to abandon Intel in favor of an M1 system-on-chip (SoC) architecture. The new M1 SoC architecture combines ARM CPUs with GPUs and other accelerators to deliver twice as much processing power as an x86 platform.
The push to gain market share As demand for other classes of processors starts to increase, along with PC components, Gelsinger has promised Intel will be very aggressive at the expense of rivals that can’t match its manufacturing muscle. In addition, Gelsinger notes that Intel processors are now optimized for new classes of workloads based on AI models that need to first be trained by processing massive amounts of data and then deployed using inference engines that require maximum processor performance.
It’s not clear to what degree enterprise IT organizations are going to invest in 10-nanometer processor platforms when they know that systems based on next-generation 7-nanometer processors will become increasingly available in the second half of this year. Cloud service providers are also now making greater use of a wide array of processors to run workloads that might previously have been deployed on x86-based servers. Regardless of past missteps, Gelsinger said Intel is now better prepared to fight for control of every processor core being employed.
Overall, Intel is now forecasting $17.8 billion in revenue in the second quarter. This is despite the efforts of rivals such as AMD and Nvidia, which are unable to meet demand for processing horsepower now being driven by everything from gaming sites to digital business transformation initiatives that continue to multiply as the global economy improves.
"
|
15,710 | 2,021 |
"Taiwan predicts its chip industry will weather global shortage | VentureBeat"
|
"https://venturebeat.com/2021/04/23/taiwan-predicts-its-chip-industry-will-weather-global-shortage"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Taiwan predicts its chip industry will weather global shortage Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
( Reuters ) — Taiwan’s key semiconductor industry has years of growth ahead of it with no worries about oversupply despite a massive capital investment program and only a few competitors in the next decade or so, a senior government minister said on Friday.
Kung Ming-hsin, the head of Taiwan’s economic planning agency, the National Development Council, told Reuters the business opportunities presented by the global transformation to a digital economy were “very, very enormous”.
Kung also sits on the board of Taiwan Semiconductor Manufacturing Co as a representative of the largest shareholder, the government’s National Development Fund, which holds around 6% of the stock of the world’s most valuable semiconductor company.
He said between now and 2025, Taiwan companies have planned more than T$3 trillion ($107 billion) in investment in the semiconductor sector, citing expansion plans from chip giants including TSMC and Powerchip Semiconductor Manufacturing.
“Once they are built, Taiwan’s competitors in semiconductors in the next decade will be very few,” Kung said in an interview in his office building, which overlooks the presidential office.
Taiwan’s semiconductor firms are ramping up production to tackle a global chip shortage , which has affected everything from carmakers to consumer products, and meet booming demand following the work-from-home trend during the COVID-19 pandemic.
Soaring demand is set to continue, driven by 5G, artificial intelligence and electric vehicles, Kung said.
“In the next decade or even longer there won’t be oversupply for semiconductors,” he added, when asked if the massive investment plans could have a downside.
Taiwan is currently in the grip of its worst drought in more than half a century, but Kung said the impact on chip firms was limited at present, citing the amount of water they are able to recycle and the location of their main factories in Hsinchu in northern Taiwan, and in the island’s south.
“These two places are okay at the moment. So the impact on semiconductors is not bad.” Still, Taiwan does face other challenges, not least from China where President Xi Jinping has made semiconductors a strategic priority.
Kung named Samsung Electronics as Taiwan’s most serious competitor and also able to match TSMC’s advanced chipmaking, but said U.S. tech restrictions had for now blunted the Chinese threat.
Intel — both a TSMC client and competitor — last month announced a $20 billion plan to expand its advanced chip making capacity.
Kung said there was perhaps room for TSMC to cooperate with Intel, but “what’s important is really how you upgrade yourself”.
To that end, the government is helping the industry develop the next generation of semiconductor manufacturing technology like 1 nanometre and beyond with funding support and talent recruitment programmes in the works, he added.
($1 = 28.1070 Taiwan dollars) VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact.
Discover our Briefings.
The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat.
All rights reserved.
"
|
15,711 | 2,021 |
"EU's plan to woo chip manufacturers won't work, Taiwan says | VentureBeat"
|
"https://venturebeat.com/2021/04/28/eus-plan-to-woo-chip-manufacturers-wont-work-taiwan-says"
|
"EU’s plan to woo chip manufacturers won’t work, Taiwan says
( Reuters ) — Taiwan Economy Minister Wang Mei-hua on Wednesday played down the prospect of Taiwan tech firms making advanced semiconductors in the European Union, noting industry leader TSMC has insisted it will focus its most advanced technology on the island.
Wang’s comments come amid overtures from the EU to persuade a global chipmaker to build a major fabrication plant on EU territory that would help the European Commission achieve a strategic goal of securing the most advanced chip production technology over the next decade.
Speaking to reporters, Wang pointed to previous comments by TSMC — Taiwan Semiconductor Manufacturing Company — on its production base.
“TSMC has said repeatedly that the most advanced technology will definitely be (produced) mainly in Taiwan,” she said. “As for how Taiwan and the EU can cooperate, companies have their arrangements and considerations, and it can be further discussed.” The European Commissioner responsible for the EU’s internal market, Thierry Breton, is scheduled to hold discussions with the chief executive of U.S.-based Intel and the president of TSMC’s Europe division, Maria Marced, on Friday.
Sources in Brussels say Breton is keener to reel in TSMC, which is widely regarded as the undisputed industry leader and has a better command of the most advanced manufacturing processes.
TSMC did not immediately respond to a request for comment.
Breton’s push for technology ‘sovereignty’ for the EU comes as a surge in demand for everything from consumer electronics to cars has disrupted global supply chains and exposed the continent’s reliance on chips made in Asia.
Wang added that she had had no direct talks with the EU on the subject “over the last week or two.” “But in our talks with the EU, industrial cooperation has always been an important topic. It’s not just limited to semiconductors, it’s on broader industrial cooperation.” TSMC has said it intends to keep the focus of its production in Taiwan, though it also has major plants in China and announced plans last year to build its own $12 billion factory in the U.S. state of Arizona.
"
|
15,712 | 2,021 |
"How the global chip shortage could impact your business | VentureBeat"
|
"https://venturebeat.com/2021/04/30/how-the-global-chip-shortage-could-impact-your-business"
|
"How the global chip shortage could impact your business
In its quarterly report this week, Apple cautioned that although it had a terrific quarter, it may not be able to keep up with growing demand for its products given the worldwide shortage of computer chips.
Apple is certainly not alone here – others have called out concerns about the shortage, including Intel, Nvidia, AMD, and Samsung. And it’s not just mobile and consumer devices that could be hit by this shortage – cloud and enterprise data centers could be impacted too, and automakers have already had to reduce production due to chip shortages. The skyrocketing demand of new equipment to enable 5G networks worldwide is also placing pressure on this market. In this “smart” and highly connected world, there is little produced that doesn’t need some embedded computer brains.
Is the shortage real?
Some have wondered if this shortage is overblown. My opinion is that the chip shortage is real, but it’s not affecting everyone the same way. The biggest purchasers of chips (e.g., Qualcomm, Nvidia, AMD, Apple, Samsung, etc.) have placed high volume standing orders from the contract fabs primarily in the Far East (the largest being TSMC, but also producers like GlobalFoundries, Samsung, etc.) and as a result get priority due to the size and long duration production of their purchases (long-duration, high-volume products are where the fabs make their profits). They order products over a long time period and maintain inventory on hand; they don’t just place spot orders when needed as some companies are prone to do (e.g., the automakers who don’t “order to inventory” but instead order for just-in-time manufacturing). Meanwhile, “lower priority” customers who can fill in chip line production gaps in normal times have to wait in an increasingly long queue when they do place new or additional orders, so they don’t get the product they need in what they consider a timely fashion. Even Intel, which makes most of its own chips in house and can therefore more or less control its own manufacturing destiny, has seen an inability to ramp up production fast enough to meet increased market demands, especially in the red-hot PC space. Of course, Intel also buys a substantial number of chips from TSMC and Samsung for several of its product offerings, so it is affected by the outsourced fabs’ capacity constraints as well.
There’s no room to increase manufacturing volumes
The major problem Apple and similar very high volume mobile players (such as Samsung) are facing is that the fabs are already running at maximum capacity. If demand suddenly goes up, as it has in the pandemic for not only consumer products but also for servers in cloud and enterprise installations, it’s very difficult for the contract fabs to increase output. And there’s no quick fix. Building a new fab can take 2-3 years and can cost $10 billion-$20 billion, so even with all the announcements lately of companies committing to new fabs (e.g., Intel, Samsung, TSMC) and even with government incentives from the US and other countries, you can’t simply turn on a new manufacturing line in short order. The issue is further compounded by the fact that much of the increased demand is in new-generation chips that can’t readily be run on older production lines that may have capacity to spare; and those lines would be too costly and take too long to retrofit. There is talk of expanding production sites to other areas of the world (e.g., India, mainland China), but starting up totally new production from scratch, and with companies that are new to the game, is a slow process.
How long will this be an issue?
So will all of this have a long-term effect on companies like Apple and Samsung, and potentially Google, AWS, Microsoft, and other cloud providers, as well as cutting edge companies like Nvidia? It will likely take at least 18-24 months to stabilize the supply chain, unless there is a sudden case of markets shrinking due to some catastrophe or economic collapse (that’s unlikely but possible). Until this gets resolved, we can expect to see many companies negatively affected (some more, some less) by the chip shortage, even while all the chip makers race to add capacity. But it’s not all gloom for semiconductor-related companies; it’s a great time to be a chip manufacturing equipment supplier!
What’s an enterprise to do?
The effect on enterprise customers will be varied. In the short term, you can expect to see shortages of some products like PCs, and even Chromebooks, as well as some high end mobile devices. They’ll be available, but perhaps not in the numbers or at the discounted prices many enterprises are used to. In data center servers, organizations can expect to see increased delivery times, so getting orders in ahead of the curve of when you’ll actually need them would be a wise move. Public clouds (e.g., AWS, Google Cloud Platform, Microsoft Azure) should not be dramatically affected as they have a good deal of capacity and can usually get new supplies of computers in a priority fashion, but some of the custom chip solutions they are deploying (e.g., AWS Graviton) may be impacted. Finally, enterprises should be aware that “business as normal” in procuring computing systems may not return to normal for several quarters at least. Plan accordingly.
Jack Gold is the founder and principal analyst at J.Gold Associates, LLC., an information technology analyst firm based in Northborough, MA., covering the many aspects of business and consumer computing and emerging technologies. Follow him on Twitter @jckgld or LinkedIn at https://www.linkedin.com/in/jckgld.
"
|
15,713 | 2,019 |
"India is turning its back on Silicon Valley | VentureBeat"
|
"https://venturebeat.com/2019/02/16/india-is-turning-its-back-on-silicon-valley"
|
"India is turning its back on Silicon Valley
Indian Prime Minister Narendra Modi speaks during the official opening of the Hannover Messe industrial trade fair in Hanover, central Germany on April 12, 2015.
In 2014, a year after Amazon launched its shopping site in India, CEO Jeff Bezos made a splash visit to the nation. Dressed in formal Indian wedding attire, he climbed up a brightly painted truck with an oversized $2 billion check.
Bezos couldn’t contain his excitement about India, home to the second-largest internet population. “We have surpassed our highest expectations for how fast Amazon India has grown,” he said in a televised interview around the time. When asked what he thought of the local regulations in the country, Bezos responded, “India is a very good place to do business in.” For years, India has wanted foreign companies to thrive in the country. When the Bharatiya Janata Party (BJP) took power in 2014, one of its early major pushes was to formulate plans and structure incentives to attract foreign investment. In 2015, Prime Minister Narendra Modi unveiled plans to liberalize the foreign investment rules.
“India is unstoppable on the path of economic progress [and] … wants the world to see the tremendous opportunities it offers,” he said, ahead of a trip to the U.K. He also visited the U.S. and met with top Silicon Valley executives , nearly all of whom subsequently expanded their commitments in India.
Above: India’s Prime Minister Narendra Modi with Microsoft’s Satya Nadella, Google’s Sundar Pichai, Cisco’s John Chambers, Adobe’s Shantanu Narayen, and Qualcomm’s Paul Jacobs in San Jose, California on September 26, 2015. (Image: Press Information Bureau) Later, the government was accused of turning a blind eye to the creative ways Amazon was violating FDI norms (ecommerce policies) in India. It further introduced lofty incentives to encourage companies to participate in Make in India and Digital India , a set of state-run initiatives to drive job growth in the nation.
Change of mood
But over the past year, in the run-up to the general elections in May, the Indian government has unveiled — and in many cases, enforced — a wave of sweeping changes. It now dictates how foreign companies handle and make use of Indian user data and other aspects of how ecommerce platforms operate, and it is working on introducing greater oversight for technology platforms.
In April, the Indian government issued a regulatory directive that required U.S. payment firms to store financial data of Indian users locally. The government said it needed the U.S. giants to comply to ensure “better monitoring” and added that “it is important to have unfettered supervisory access to data stored with these system providers, as also with their service providers / intermediaries / third-party vendors and other entities in the payment ecosystem.” MasterCard and Visa, the two biggest card networks in the U.S., as well as lobby groups that represent Google and Facebook — both of which offer payment solutions in India — spent months fiercely opposing the directive. They were joined by as many as 30 U.S. senators, who urged India to rethink its stand on data localization.
“We see this (data localization) as a fundamental issue to the further development of digital trade, and one that is crucial to our economic partnership,” the senators wrote in August.
Despite intense lobbying and public outcry from various corners of the U.S. government, India refused to extend the six-month deadline. With no choice left, American giants complied with the directive in October of last year. Little did these companies know, the Indian government was just getting started.
In late December, India revised its ecommerce policies to levy new restrictions on how Amazon and Walmart-owned Flipkart sell products in India. The two ecommerce companies are still scrambling to minimize the damage these revised policies have inflicted on their businesses.
Tens of thousands of items — if not more — disappeared from Amazon and Flipkart overnight when the revised policies went into effect earlier this month.
And these products remained unavailable as the companies began to cut ties with sellers in which they had a financial stake (per the new policies, a foreign player cannot be affiliated with sellers it conducts business with) and worked to restore some items.
Amazon and Walmart, which made a massive $16 billion bet on India last year , are not the only Silicon Valley companies to have found themselves on the receiving end of what many describe as increasingly hostile regulations introduced by the government.
Lobby groups that represent U.S. companies and industry watchers say they see an extreme shift from the “warm, welcoming, collaborative” approach the government exhibited in 2014. “In the past year or so, the engagement has been combative, with abrupt, disruptive policy changes that are being held without consultation, and, unusually, with absolutely no room for negotiation or even deadline extensions — as we saw with data localisation and FDI in ecommerce,” Prasanto K Roy, a technology and policy analyst, told VentureBeat.
Last month, Aruna Sundararajan, secretary of India’s Telecommunications department, told a group of Indian startups that the government was working to formulate a new “national champion” policy to encourage the “rise of Indian companies.” A secretary of Sundararajan declined to comment on this upcoming policy.
This week, the Indian government began finalizing a regulatory directive detailing how it wants intermediaries (internet service providers, websites, apps, and services that rely on users to generate content) to operate in the nation. Any entity that has more than 5 million users in India will have to set up a local office, appoint leaders in the nation who would be deemed responsible for any trouble their platforms incur in the nation, and build automated tools to identify and remove harassing, hateful, and harmful content. (This would significantly limit or end the so-called safe harbor laws in India that Silicon Valley companies enjoy in many nations. Under these laws, a tech platform isn’t held liable for any issue it creates as long as it takes some actions in good faith to fight bad actors. Some critics have likened India’s move to censorship in China.)
Nationalism à la China
Nationalism is the theme tying together all the policy changes the Indian government has unveiled in the past year. “All these moves are aligned with rising nationalism in the run-up to May 2019 and are often further aligned with, possibly driven by, specific lobbies who are simply riding the nationalism wave,” Roy said.
Above: Mukesh Ambani, chairman and managing director of Reliance, speaks at Vibrant Gujarat Global Summit, at Mahatma Mandir Exhibition cum Convention Centre, on January 18, 2019 in Gandhinagar, India.
Perhaps no one benefits from these moves more than Mukesh Ambani, the richest man in India and owner of Reliance Industries, the largest industrial house in the nation. As Amazon and Walmart were meeting with Indian government officials to ask for an extension for the revised ecommerce policy deadline, Ambani, an ally of Modi, announced that Reliance Retail, the largest retailer in the nation, is entering the ecommerce space.
In his announcement before a group of merchants and government officials, including Modi, Ambani said India needs to “collectively launch a new movement against data colonization,” similar to the movement Mahatma Gandhi led against political colonization of India. “For India to succeed in this data-driven revolution, we will have to migrate the control and ownership of Indian data back to India — in other words, Indian wealth back to every Indian,” he said, adding: “Honorable Prime Minister, I am sure you will make this one of the principal goals of your Digital India mission.” The changes in the ecommerce policy, which put restrictions only on foreign capital, have huge implications for the $670 billion retail market.
“It’s about making it difficult for foreign companies to operate in this market. Whereas, if you look at Ambani or Kishore Biyani [founder and CEO of Future Group, one of the largest brick-and-mortar retailers in India] or any other Indian who wants to operate in the market, they can continue to play with whatever norms they want. They are not governed by any of these concerns,” Nikhil Narendran, partner at law firm Trilegal, told VentureBeat.
The idea that the Indian government should favor domestic companies has been years in the making. Executives with Indian smartphone vendors, which once ruled the local market , urged the government two years ago to help them fight the onslaught from Chinese vendors.
Around the same time, Sachin Bansal of Flipkart and Bhavish Aggarwal of ride-hailing firm Ola, suggested that India should replicate China’s model.
“What we need to do is what China did (15 years ago) and tell the world ‘We need your capital, but we don’t need your companies’,” said Bansal. Many have argued that India (and other countries) should focus on building its own ecosystem of companies, instead of giving it all away to Silicon Valley giants. Vivek Wadhwa, a distinguished fellow at Carnegie Mellon University and Harvard Law School, recently argued such a case.
Wadhwa said India’s recent moves are “steps in the right direction,” though he added that the policies need some fine-tuning.
Some of the Indian government’s recent asks — including data localization — are arguably fair. “The government needs to have data in a reasonably fast manner,” Narendran said. But the way the Indian government wants to get there would require tech companies to reconfigure their global infrastructure, he said.
In the past, “such needs were balanced by reasonable protections, such as the intermediary liability safe harbor,” Roy explained. “We’re seeing that counterbalance of reason being lost to rising nationalism and fear-mongering.” Facebook-owned WhatsApp, which will also have to abide by any intermediary regulation, has been facing some crucial challenges in India, its biggest market, for more than a year. The Indian government has been pushing WhatsApp to bring “traceability” to its platform so that it can find the origin of questionable content on the platform. At a media briefing in New Delhi earlier this month , Carl Woog, WhatsApp’s head of communications, reiterated that WhatsApp is committed to offering end-to-end encryption to users in India.
Hundreds of millions of users, but little revenue
As China continued to put up barriers to U.S. companies, India emerged as one of the last great markets for Silicon Valley companies. For years, Google, Facebook, and others have looked to India for their next billion users, Kunal Shah, founder of mobile payment service FreeCharge and finance service CRED, said at an event last month.
“All the global companies love to come to India because it is the farm of MAUs [monthly active users]. Facebook and Google love to give free internet here, because it creates these large MAUs that they can monetize on the global market. They don’t care about ARPUs (average revenue per user). And ARPU of a person in this country is nothing next to that of a user in the global market,” he added.
Silicon Valley companies have invested billions of dollars in the nation. In terms of user counts, their bets seem to be paying off. Both Facebook and Google have more than 250 million users in India. These companies, along with Amazon, identify India as their fastest-growing market.
But revenue-wise, India’s contribution to their bottom line is a blip, at best. Google generated $1.4 billion in revenue in India in the year that ended March 2018, compared to the $110.9 billion it generated globally. (The India-specific financial details are based on figures provided to us by Paper.vc, a research firm in India that tracks regulatory filings.) During this period, Facebook posted $78 million in India revenue, compared to the $39.9 billion it reported globally. Amazon’s revenue in India, where it has invested $5.5 billion to date, stood at $754.2 million, compared to $177 billion globally.
Roy cautioned that the new regulatory push by the Indian government could make the nation less compelling to foreign companies. “If companies and their investment dollars are drawn in with promises of a progressive, competitive regime and then the laws suddenly change to stack the deck against them, perhaps supporting local companies, then clearly it’s less appealing for those companies to invest further — or for newer companies to invest,” he said.
And if India becomes less compelling for foreign companies, the resulting conditions could hurt Indian companies — contrary to the Indian government’s hopes. “Startups are not getting their funds from Ambani or Biyani. A large portion of their funding is coming from foreign VC firms.
Moving forward, startups in the nation will have to become more reliant on domestic funding,” Narendran said.
In the coming months, we will find out how badly Silicon Valley companies want to win in India.
"
|
15,714 | 2,020 |
"How ISPs are using AI to address the coronavirus-driven surge in traffic | VentureBeat"
|
"https://venturebeat.com/2020/03/27/how-isps-are-using-ai-to-address-the-coronavirus-driven-surge-in-traffic"
|
"How ISPs are using AI to address the coronavirus-driven surge in traffic
This month, under the strain millions of people self-quarantined by COVID-19 have placed on broadband infrastructure, Facebook, Disney, Microsoft, Sony, Netflix, and YouTube agreed to temporarily reduce download speeds and video streaming quality in countries around the world. Nearly 90 out of the top 200 U.S. cities saw internet speeds decline in the past week, according to BroadbandNow. And Akamai found that global traffic on March 18 was running 67% higher than the typical daily average.
As a result of government and employer mandates to “shelter in place” and work remotely from home, internet subscribers are consuming more bandwidth than during the holidays and sporting events like the Super Bowl. At the same time, ISPs are under regulatory and consumer pressure to maintain a baseline quality of service. According to new research from Parks Associates, 76% of households say it would be difficult to go without broadband. And in March, FCC chair Ajit Pai introduced the Keep Americans Connected Pledge, a telecom industry measure that asks companies to prioritize connectivity for essential services.
Internet service providers have taken steps to ensure that internet demand doesn’t overwhelm capacity. Beyond capital improvements, some — including Verizon, AT&T, Vodafone, Cox, and Telstra — are employing AI and machine learning to service networks strained by the traffic surges. Others aren’t — when reached for comment, Comcast, CenturyLink, and Fiber said they’re not using AI for network management.
Verizon
Verizon told VentureBeat that it taps AI and machine learning to respond to shifts in usage, like the 75% increase in gaming traffic it saw from March 10 to 17.
“Analyzing patterns found in performance data, sensors, and alerting functions across all network platforms helps us identify performance issues before they impact the customer,” said a spokesperson via email. “For instance, based on analyzing patterns of performance, we are able to determine when parts of the network may need maintenance or replacement before a failure occurs and are able to roll a tech [person] to the scene or schedule that work into upcoming planned maintenance, saving on an extra trip.” Verizon’s predictive algorithms monitor more than 4GB of data streaming every second from millions of network interfaces spanning everything from customers’ routers to sensors gathering temperature and weather data. The carrier’s analytics infrastructure allows it to predict 698 “customer-impacting” events before they happen and take steps to prevent them from occurring. On home networks, Verizon automates testing on a sample of over 60,000 in-home routers every two hours, ensuring that customers receive the speed of service they pay for — even with gaming, VPN, web traffic, and video traffic increasing 75%, 34%, 20%, and 12% week-over-week, respectively.
“We’re in an unprecedented situation,” said Verizon chief technology officer Kyle J. Malady. “We expect these peak hour percentages to fluctuate, so our engineers are continuing to closely monitor network usage patterns 24/7 and stand ready to adjust resources as changing demands arise. We continually evaluate peak data usage times and build our networks to stay ahead of that demand. While it is not clear yet how having millions of additional people working from home will impact usage patterns, we remain ready to address changes in demand, if needed.”
AT&T
AT&T didn’t respond to VentureBeat for comment, but it previously said it uses a range of software-defined networking and network function virtualization technologies to mitigate spikes in network usage. One of these is what the company calls Enhanced Control, Orchestration, Management & Policy (ECOMP), which represents 8.5 million lines of code and supports over 100 different virtual network functions at all layers of AT&T’s network.
Like Verizon, AT&T applies predictive algorithms to its network to anticipate when hardware could potentially suffer downtime in the next days, weeks, or months. Historical analysis and pattern recognition also help optimize and route or reroute traffic. Separately, AT&T uses AI to manage its third-party cloud arrangements, such as with Microsoft and within its internal cloud and hybrid clouds. And on the mobile side, AI is aiding company technicians charged with spotting damage in cell towers from drone footage.
AT&T Labs vice president of advanced technology systems Mazin Gilbert says AT&T’s network-level AI can pick up signals indicating oncoming failures from vehicles in its repair fleet. In the future, he expects it’ll play a bigger role, potentially laying the groundwork for self-repairing systems.
“The network can’t be just software,” said Gilbert at the TM Forum Action Week conference in September 2019. “The network needs to be autonomous and pretty much zero-touch. It needs intelligence to know when it repairs itself, when it secures itself. The network needs to be contextual, personalized … [W]e have built these templates of intelligent agents. These are nothing more than closed-loop systems — closed-loop systems that capture data that can be configured for different problems. We push those in our network to collect data.”
Vodafone
Across the pond, Vodafone uses a cloud-based system called Neuron to generate network insights in real time. It’s built on top of Google Cloud with centralized access to data from over 600 servers in 11 countries, and it allows management to make decisions and take automated actions to improve service. For instance, Neuron can automatically assign more capacity in busy parts of the network while reducing capacity in parts that don’t require it.
Neuron is an evolution of a trial system Vodafone deployed to its mobile network in Germany with Huawei in 2017, dubbed Centralized Self-Organized Network (C-SON). C-SON identified the optimal settings to deliver voice over LTE services across 450 mobile cell sites chosen at random in four hours, a task that would have taken an engineer 2.5 months to perform manually. That same year, Vodafone’s Ireland subsidiary and Cisco teamed up to predict locations where 3G traffic will peak in the following hour, resulting in an average 6% improvement in mobile download speed and lower interference at the cell sites. And in Spain, Vodafone piloted a system from Huawei and Ericsson that automatically chose the best frequency or node for each mobile connection.
“Neuron serves as the foundation for Vodafone’s data ocean and the brains of our business as we transform ourselves into a digital tech company,” said Vodafone group head of big data delivery Simon Harris. “Not only [can] we … gain real-time analytics capabilities across Vodafone products and services, [but we can] arrive at insights faster, which can then be used to offer more personalized product offerings to customers and to raise the bar on service.” Vodafone — which reports that some of its networks have seen a 50% traffic uptick from the beginning of March — intends to use Neuron and other diagnostic tools to increase capacity where it’s needed and absorb new usage patterns. “Vodafone will be expanding capacity to manage this demand as much as possible,” said the company in a statement.
“We also want to ensure that any congestion in the network does not negatively impact mission-critical and other essential communications during this period, such as for voice and digital access to health and education, or the ability for people to work from home.”
Cox
Cox, which serves 3.5 million internet subscribers in the U.S., says its management and service assurance strategy includes virtualizing portions of its network to “proactively and reactively” solve customer and network issues. The company’s software-defined networking capabilities tap AI to drive traffic optimization in the network backbone, delivering efficiencies in routing, latency, and resiliency in failover events.
“[W]e’re keeping a close eye at the individual node level to make sure we don’t approach any congestion thresholds and need to make any adjustments,” a spokesperson told VentureBeat. “Our focus is to help keep everyone connected during this unprecedented time, with remote workers and students learning from home top of mind … Similar to our normal process, if we see the network reach or exceed utilization thresholds, we will accelerate network upgrade plans in the impacted areas.”
Telstra
Last year, Australian telecom provider Telstra began tapping AI to predict equipment failures on its network — and the company told IT News that it continues to do so. Telstra’s use of AI and machine learning extends to load balancing insofar as predictive models help reprioritize congested resources, ensuring customers on the network are minimally impacted.
Telstra, which said it would bring forward a $500 million capital expenditure from early 2021 to increase its network capacity during the pandemic, recently lifted data caps on home broadband customers until the end of April. “The data, which will be provided automatically, will help facilitate videoconferencing, voice over Wi-Fi, and cloud connectivity, all important tools when working from home or in isolation,” said CEO Andrew Penn in a statement.
Uncertainty ahead
By and large, ISPs that have deployed AI to manage COVID-19-related traffic surges are optimistic about the future. But they’re in uncharted waters.
According to broadband testing service Ookla, last week broadband speeds declined 4.9% from the previous week. Median download speeds dropped 38% in San Jose, California and 24% in New York. “Streaming platforms, telecom operators, and users, we all have a joint responsibility to take steps to ensure the smooth functioning of the internet during the battle against the virus propagation,” said Thierry Breton, the European Commission’s Internal Market Commissioner, in a statement.
AI might help — and already has helped — with respect to capacity. But it seems likely that for the foreseeable future, networks will be vulnerable to spikes in demand. AI can only do so much when what the world really needs is serious infrastructure investment.
VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact.
Discover our Briefings.
The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat.
All rights reserved.
"
|
15,715 | 2,020 |
"Google announces $10 billion 'digitization fund' for India | VentureBeat"
|
"https://venturebeat.com/2020/07/13/google-announces-10-billion-digitization-fund-for-india"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Google announces $10 billion ‘digitization fund’ for India Share on Facebook Share on X Share on LinkedIn Alphabet CEO and Google CEO Sundar Pichai Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
( Reuters ) — Alphabet’s Google today said it will spend around $10 billion in India over the next five to seven years through equity investments and tie-ups, marking its biggest commitment to a key growth market.
The investments will be done through a “digitization fund,” highlighting Google’s focus on the rapid pace of growth of apps and software platforms in India, one of the world’s biggest internet services markets.
“We’ll do this through a mix of equity investments; partnerships; and operational, infrastructure, and ecosystem investments,” Alphabet CEO Sundar Pichai said on a webcast during the annual “Google for India” event.
“This is a reflection of our confidence in the future of India and its digital economy.” Google has already made some direct and indirect investments in Indian startups, such as local delivery app Dunzo.
Beyond investments via the fund, Google would also focus on areas like artificial intelligence and education in India, Pichai told Reuters in an interview.
Indian-born Pichai joined Google in 2004 and is widely credited with making the Chrome browser. He replaced company cofounder Larry Page as CEO of parent Alphabet last year.
“Sundar Pichai, who is heading Google, is a very powerful symbol of the creative potential of India’s human resource,” India’s technology minister Ravi Shankar Prasad said at the event.
The U.S. tech group, whose Android mobile operating system powers the bulk of India’s roughly 500 million smartphones, will continue to work with manufacturers to build low-cost devices so that more and more people can access the internet, another Google executive said.
(Reporting by Sankalp Phartiyal, additional reporting by Sachin Ravikumar, editing by Jason Neely and Jane Merriman.)
"
|
15,716 | 2,021 |
"Investments are fueling the evolution of IT services | VentureBeat"
|
"https://venturebeat.com/2021/02/27/investments-are-fueling-the-evolution-of-it-services"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Analysis Investments are fueling the evolution of IT services Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
IT service providers, and the vendors they rely on, are scrambling to raise capital in anticipation of a major shift in the way IT will be consumed and managed in the wake of the COVID-19 pandemic. At a time when more organizations than ever are willing to rely on external service providers to reduce their IT costs, these providers of IT services need to accelerate their transition to cloud-based platforms.
That shift requires a significant amount of investment: IT tech support provider Electric AI announced this week it has raised $40 million in series C funding to advance the adoption of a managed IT service for small to medium-size businesses (SMBs).
However, it’s not just IT services providers that are looking for funding. The providers of platforms that many IT services providers rely on are also raising capital.
Atera , a provider of a platform for delivering managed services, this week announced it has raised $25 million from K1 Investment Management.
At the same time, ScienceLogic , a provider of an IT platform employed by both IT services providers and internal IT teams, announced it has raised $105 million as part of an effort to infuse more AIOps capabilities into its platform.
Historically, IT services providers have relied on client/server platforms provided by third-party vendors such as ConnectWise, Kaseya, SolarWinds, and at the higher end of the market, ScienceLogic. In many cases, however, those platforms have proven to be cumbersome not only to master and manage, but also to extend.
Warwick Burns, owner of Warwick Data Solutions in Nashville, Tennessee, opted to rely on Atera’s cloud platform as an alternative to a rival offering from ConnectWise because, as a small provider of IT services, the company doesn’t have the time and resources required to learn and maintain a complex platform. “We learned how to use the Atera platform in a day,” Burns said. “The other platforms are a big clunky mess.” That issue creates a significant opportunity to usurp the incumbent providers of platforms that are widely employed by IT services providers, Atera CEO Gil Pekelman said. The Atera platform is a cloud-based offering that is designed to integrate remote management and monitoring (RMM) and professional services automation (PSA) capabilities that IT service providers require to manage multiple clients in a way that is more accessible, said Pekelman.
In contrast, rivals are stitching together technologies they have acquired to provide similar capabilities using a legacy client/server architecture that they continue to try to extend, Pekelman said. Atera will employ its latest round of funding to provide additional analytics to the data its platform collects to enable IT services providers to become more efficient, said Pekelman. “Our IP is our software and our data,” he said.
In a similar vein, Augmentt has emerged as a startup focused on enabling IT service providers to manage multiple software-as-a-service (SaaS) applications on behalf of their customers. As organizations have shifted toward relying more on SaaS applications in the wake of the COVID-19 pandemic, Augmentt chairman Gavin Garbutt said it became apparent IT services providers needed a platform designed from the ground up to manage SaaS operations. “There was no RMM tool for SaaS applications designed for IT service providers,” Garbutt said.
Electric, based in New York, has pursued a different tack. The IT services provider has poured significant resources into extending IT management platforms from Kaseya and Jamf to provide services for Windows and Apple platforms, respectively. It developed software to streamline workflow processes using its own automation framework to create a self-service framework through which end customers can provision applications with no intervention required from the IT service provider, said Electric CEO Ryan Denehy.
“We’re providing customers with a more modern experience,” Denehy said.
In the case of Electric, the company made the decision to write software to extend existing backend IT management platforms, while Warwick Data Solutions, in the absence of any in-house software development capabilities, opted for a new platform.
Regardless of the platform, IT service providers will also be at the forefront of modernizing the management of IT using, for example, AIOps. Making that shift will require increased reliance on cloud platforms that make the data required to train AI models more accessible. The decision business and IT leaders face essentially comes down to betting on how quickly one IT services provider, given the resources available to it, can move down that path compared to another.
"
|
15,717 | 2,021 |
"IBM's Rob Thomas details key AI trends in shift to hybrid cloud | VentureBeat"
|
"https://venturebeat.com/2021/03/19/ibms-rob-thomas-details-key-ai-trends-in-shift-to-hybrid-cloud"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages IBM’s Rob Thomas details key AI trends in shift to hybrid cloud Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
The last year has seen a major spike in the adoption of AI models in production environments, in part driven by the need to drive digital business transformation initiatives. While it’s still early days as far as AI is concerned, it’s also clear AI in the enterprise is entering a new phase.
Rob Thomas, senior vice president for software, cloud, and data platform at IBM, explains to VentureBeat how this next era of AI will evolve as hybrid cloud computing becomes the new norm in the enterprise.
As part of that effort, Thomas reveals IBM has formed a software-defined networking group to extend AI all the way out to edge computing platforms.
This interview has been edited for brevity and clarity.
VentureBeat: Before the COVID-19 pandemic hit, there was a concern AI adoption was occurring slowly. How much has that changed in the past year? Rob Thomas: We’ve certainly got massive acceleration for things like Watson Assistant for customer service. That absolutely exploded. We had nearly 100 customers that started and then went live in the first 90 days after COVID hit. When you broaden it out, there are five big use cases that have come up over the last year. One is customer service. Second is around financial planning and budgeting. Thirdly are things such as data science. There’s such a shortage of data science skills, but that is slowly changing. Fourth is around compliance. Regulatory compliance is only increasing, not decreasing. And then fifth is AI Ops. We launched our first AI ops product last June and that’s exploded as well, which is related to COVID in that everybody was forced remote. How do we better manage our IT systems? It can’t be all through humans because we’re not on site. We’ve got to use software to do that. I think that was 18 months ago, I wouldn’t have given you those five. I would have said “There’s a bunch of experimentations.” Now we see pretty clearly there are five things people are doing that represent 80% of the activity.
VentureBeat: Should organizations be in the business of building AI or should they buy it in one form or another? Thomas: I hate to be too dramatic, but we’re probably in a permanent and a secular change where people want to build. Trying to fight that is a tough discussion because people really want to build. When we first started with Watson, the idea was this is a big platform. It does everything you need. I think what we’ve discovered along the way is if you componentize to focus where we think we’re really good, people will pick up those pieces and use them. We focused on three areas for AI. One is natural language processing (NLP). I think if you look at things like external benchmarks, we had the best NLP from a business context. In terms of document understanding, semantic parsing of text, we do that really well. The second is automation. We’ve got really good models for how you automate business processes. Third is trust. I don’t really think anybody is going to invest to build a data lineage model, explainability model, or bias detection. Why would a company build that? That’s a component we can provide. If you want them to be regulatory compliant, you want them to have explainability, then we provide a good answer for that.
VentureBeat: Do you think people understand explainability and the importance of the provenance of AI models and the importance of that yet? Are they just kind of blowing by that issue in the wake of the pandemic? Thomas: We launched the first version of what we built to address that around that two years ago. I would say that for the first year we got a lot of social credit. This changed dramatically in the second half of last year. We won some significant deals that were specifically for model management explainability and lifecycle management of AI because companies have grown to the point where they have thousands of AI models. It’s pretty clear, once you get to that scale, you have no choice but to do this, so I actually think this is about to explode. I think the tipping point is once you get north of a thousandish models in production. At that point, it’s kind of like nobody’s minding the store. Somebody has to be in charge when you have that much machine learning making decisions. I think the second half of last year will prove to be a tipping point.
Above: IBM senior VP of software, cloud, and data Rob Thomas
VentureBeat: Historically, AI models have been trained mainly in the cloud, and then inference engines are employed to push AI out to where it’d be consumed. As edge computing evolves, there will be a need to push the training of AI models out to the edge where data is being analyzed at the point of creation and consumption. Is that the next AI frontier? Thomas: I think it’s inevitable AI is gonna happen where the data is because it’s not economical to do the opposite, which is to start everything with a Big Data movement. Now, we haven’t really launched this formally, but two months ago I started a unit in IBM software focused on software-defined networking (SDN) and the edge. I think it’s going to be a long-term trend where we need to be able to do analytics, AI, and machine learning (ML) at the edge. We’ve actually created a unit to go after that specifically.
VentureBeat: Didn’t IBM sell an SDN group to Cisco a long time ago now? Thomas: Everything that we sold in the ’90s was hardware-based networking. My view is everything that’s done in hardware from a networking at the edge perspective is going to be done in software in the next five to seven years. That’s what’s different now.
VentureBeat: What differentiates IBM when it comes to AI most these days? Thomas: There are three major trends that we see happening in the market. One is around decentralization of IT. We went from mainframes that are centralized to client/server and mobile. The initial chapter of public cloud was very much a return to a centralized architecture that brings everything to one place. We are now riding the trend that says that we will decentralize again in the world that will become much more about multicloud and hybrid cloud.
The second is around automation. How do you automate feature engineering and data science? We’ve done a lot in the realm of automation. The third is just around getting more value out of data. There was this IDC study last year that 90% of the data in businesses is still unutilized or underutilized. Let’s be honest. We haven’t really cracked that problem yet. I’d say those are the three megatrends that we’re investing against. How does that manifest in the IBM strategy? In three ways. One is we are building all of our software on open source. That was not the case two years ago. Now, in conjunction with the Red Hat acquisition, we think there’s room in the market for innovation in and around open source. You see the cloud providers trying to effectively pirate open source rather than contribute. Everything we’re doing from a software perspective is now either open source itself or it’s built on open source.
The second is around ecosystem. For many years we thought we could do it ourselves. One of the biggest changes we’ve made in conjunction with the move to open source is we’re going to do half of our business by making partners successful. That’s a big change. That’s why you see things like the announcement with Palantir. I think most people were surprised. That’s probably not something we would have done two years ago. It’s kind of an acknowledgment that all the best innovation doesn’t have to come from IBM. If we can work with partners that have a similar philosophy in terms of open source, that’s what we’re doing.
The third is a little bit more tactical. We announced earlier this year that we’ve completely changed our go-to-market strategy, which is to be much more technical. That’s what we’ve heard customers want. They don’t want a salesperson to come in and read them the website. They want somebody to roll up their sleeves and actually build something and co-create.
VentureBeat: How do you size up the competitive landscape? Thomas: Watson components can run anywhere. The real question is why is nobody else enabling their AI to run anywhere? IBM is the only company doing that. My thesis is that most of the other big AI players have a strategy tax. If your whole strategy is to bring everything to our cloud, the last thing you want to do is enable your AI to run other places because then you’re acknowledging that other places exist. That’s a strategy advantage for us. We’re the only ones that can truly say you can bring the AI to where the data is. I think that’s going to give us a lot of momentum. We don’t have to be the biggest compute provider, but we do have to make it incredibly easy for companies to work across cloud environments. I think that’s a pretty good bet.
VentureBeat: Today there is a lot of talk about MLOps, and we already have DevOps and traditional IT operations. Will all that converge one day or will we continue to need a small army of specialists? Thomas: That’s a little tough to predict. I think the reason we’ve gotten a lot of momentum with AI Ops is because we took the stuff that was really hard in terms of data virtualization, model management, model creation, and automated 60-70% of that. That’s hard. I think it’s going to be harder than ever to automate 100%. I do think people will get a lot more efficient as they get more models in production. You need to manage those in an automated fashion versus a manual fashion, but I think it’s a little tough to predict that at this stage.
VentureBeat: There’re a lot of different AI engines. IBM has partnered with Salesforce. Will we see more of that type of collaboration? Will the AI experience become more federated? Thomas: I think that’s right. Let’s look at what we did with Palantir. Most people thought of Palantir as an AI company. Obviously, they associate Watson with AI. Palantir does something really good, which is a low-code, no-code environment so that the data science team doesn’t have to be an expert. What they don’t have is an environment for the data scientist that does want to go build models. They don’t have a data catalog. If you put those two together, suddenly you’ve got an AI system that’s really designed for a business. It’s got low code, no code, it’s got Python, it’s got data virtualization, a data catalog. Customers can use that joint stack from us and will be better off than had they chosen one or the other and then tried to fix the things themselves. I think you’ll probably see more partnerships over time. We’re really looking for partnerships that are complementary to what we’re doing.
VentureBeat: If organizations are each building AI models to optimize specific processes in their favor, will this devolve into competing AI models simply warring with one another? Thomas: I don’t know if it’ll be that straightforward. Two companies are typically using very different datasets. Now maybe they’re both joining with an external dataset that’s common, but whatever they have is first-party data or third-party data that is probably unique to them. I think you get different flavors, as opposed to two things that are conflicting or head to head. I think there’s a little bit more nuance there.
VentureBeat: Do you think we’ll keep calling it AI? Or will we get to a point where we just kind of realize that it’s a combination of algorithms and statistics and math [but we] don’t have to necessarily call it AI? Thomas: I think the term will continue for a while because there is a difference between a rules-based system and a true learning machine that gets better over time as you feed it more data. There is a real distinction.
VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact.
Discover our Briefings.
The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat.
All rights reserved.
"
|
15,718 | 2,014 |
"This is what Esri's doing with Geoloqi's location tech | VentureBeat"
|
"https://venturebeat.com/2014/02/20/this-is-what-esris-doing-with-geoloqis-location-tech"
|
"This is what Esri’s doing with Geoloqi’s location tech
With mapping firm Esri’s new software development kit, launched Wednesday, developers can create complex, polygonal “geofences” that trigger actions when people enter or leave them.
A restaurant chain could, for example, use its app to send a push notification to customers that walk close to one of its locations. Or it could notify employees when a customer who ordered delivery arrives to pick up her meal.
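The enter/leave logic behind such a polygonal geofence can be sketched with a standard ray-casting point-in-polygon test. This is a minimal illustration, not Esri's SDK; the fence coordinates and the callback are made up:

```python
# Sketch: firing an action when a user crosses into a polygonal geofence.
# Coordinates are (lon, lat) pairs; fence and callback are illustrative.

def point_in_polygon(point, polygon):
    """Ray-casting test: True if `point` lies inside `polygon` (vertex list)."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a ray cast east from the point cross this edge?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside

def check_trigger(prev_pos, new_pos, fence, on_enter):
    """Fire `on_enter` only on the outside-to-inside transition."""
    was_in = point_in_polygon(prev_pos, fence)
    now_in = point_in_polygon(new_pos, fence)
    if not was_in and now_in:
        on_enter(new_pos)
    return now_in

# A square fence around a hypothetical restaurant location
fence = [(-122.42, 37.77), (-122.41, 37.77), (-122.41, 37.78), (-122.42, 37.78)]
events = []
check_trigger((-122.43, 37.775), (-122.415, 37.775), fence, events.append)
print(events)  # one enter event recorded
```

A production service would also debounce GPS jitter near the boundary, which is part of what an SDK like this abstracts away.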
Esri snapped up location-focused startup Geoloqi in late 2012. The new “Geotrigger” SDK is the fruit of that acquisition: It enables iOS and Android developers to add more sophisticated location-based features to their mobile applications.
“Geotrigger Service opens up a whole world of use cases, from stores wanting to engage customers to cities wanting to release an app to send civic alerts, local event information, or tourism info,” said Amber Case, Geoloqi’s founder and head of Esri’s R&D center, in a statement. “Create an invisible button on a map, and when your phone gets within that button — that invisible region — something will happen. Your phone could even turn the lights on in your home as you pull into the driveway, and turn them off when you leave.” Esri also promises that its SDK minimizes the amount of time GPS and cellular chipsets need to be active, cutting battery usage. Battery drain has been a barrier to the adoption of location-based features in mobile apps.
The service costs developers a tiny amount per geotrigger event. For more details, check out the video below.
"
|
15,719 | 2,016 |
"5 ways developers can exploit geospatial tech in 2016 | VentureBeat"
|
"https://venturebeat.com/2015/12/26/5-ways-developers-can-exploit-geospatial-tech-in-2016"
|
"5 ways developers can exploit geospatial tech in 2016
Since the rise of geospatial technology, applications like Facebook, Uber, and Grindr (where I work), have enabled users to engage with their surroundings to connect with friends, book a room, or set up a date.
But the application of geospatial technologies is really in its infancy. End users demand fast access and precise services that fit their interests, leaving most app developers playing catch up. Here are five tips to leverage the power of geospatial technologies to meet user demands in 2016: 1. Connect data science with geospatial services.
Many app developers settle at engaging users with sponsored content based on broad high-level editorial decisions. This practice can end up serving content that is poorly targeted, or even worse, turning users off. Core user data like what interests they have, which restaurants they hit last Friday, and which events they avoided over the weekend, offers huge data mining potential. Be smart: Prioritize investments in machine learning and hire data scientists to perform deep analysis. You’ll end up knowing more about your users’ interests than they know themselves. Connect this data with local services to provide a baseline for serving content that is best suited to your users.
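Connecting mined interest data with nearby services can be as simple as weighting each candidate venue by the user's affinity for its category and discounting by distance. A minimal sketch; the interest weights, venue list, and scoring formula are all illustrative assumptions:

```python
# Sketch: ranking nearby venues by mined user interests, discounted by
# distance. All data and the scoring formula are made-up illustrations.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in km."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def rank_venues(user_loc, interests, venues, radius_km=2.0):
    """Return venue names within radius, best interest/distance score first."""
    scored = []
    for v in venues:
        d = haversine_km(user_loc[0], user_loc[1], v["lat"], v["lon"])
        if d > radius_km:
            continue
        affinity = interests.get(v["category"], 0.0)
        scored.append((affinity / (1.0 + d), v["name"]))
    return [name for score, name in sorted(scored, reverse=True)]

interests = {"ramen": 0.9, "museum": 0.2}  # learned from user behavior
venues = [
    {"name": "Noodle Bar", "category": "ramen", "lat": 34.05, "lon": -118.25},
    {"name": "Art House", "category": "museum", "lat": 34.051, "lon": -118.251},
    {"name": "Far Ramen", "category": "ramen", "lat": 34.20, "lon": -118.25},
]
print(rank_venues((34.05, -118.25), interests, venues))
```

In practice the affinity weights would come from the machine learning models described above rather than a hand-written dictionary.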
2. Optimize for wearable technology.
Apple and Google have made huge investments into wearable technologies, but are end users really getting the most out of them? What seems to be a cool fitness monitor or a convenient way of checking and prioritizing email is actually a unique stream of behavioral data input for developers and a localized engagement output for end users. Think of it as another powerful geospatial data stream, from detecting and measuring user location patterns and cross-referencing user preferences to delivering recommendations that make perfect sense to the user’s current time and place. Make your marketing folks super happy; you won’t find nearly as many competitive offerings in wearable marketplaces as you will in overcrowded app stores.
VB Event The AI Impact Tour Connect with the enterprise AI community at VentureBeat’s AI Impact Tour coming to a city near you! 3. Minimize sophisticated security threats.
DDoS attacks are a breeze compared to someone using your geospatial technology to segment, pinpoint, and even engage your users for malicious purposes. Whether it’s a lone wolf or a well-financed group, understand and accept that these forces will try to figure out and use geospatial capabilities to take advantage of your users.
Invest in a two-pronged security approach: First secure the door by enhancing network security and data encryption to block attempts. Second, identify malicious user signatures by analyzing and detecting similar behavioral characteristics over time.
4. Curb spam and phishing abuse.
Geospatial capabilities are a dream come true for spammers and phishers, who use artificial intelligence to create real-life profiles, then select, engage, and quickly lure in victims. The end result: a spiraling volume of users clicking and engaging with competitive services, dodgy scams, and fraudulent offers. Build intrusion detection systems that collect and analyze data points from users (i.e. who do they talk to and when) and correlate geospatial and phone sensor data (i.e. phone orientation and accelerometer input).
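One cheap behavioral signature such a system can correlate is implied travel speed: a fake profile that "appears" in two distant cities within an hour is physically impossible. A minimal sketch, with the speed threshold and event format chosen arbitrarily for illustration:

```python
# Sketch: flagging accounts whose reported locations imply impossible
# travel speed. Threshold and event shape are assumptions, not a
# production rule set.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in km."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def impossible_travel(events, max_kmh=1000.0):
    """events: list of (timestamp_seconds, lat, lon), sorted by time."""
    for (t1, la1, lo1), (t2, la2, lo2) in zip(events, events[1:]):
        dt_h = (t2 - t1) / 3600.0
        if dt_h <= 0:
            return True  # duplicate or out-of-order timestamps are suspicious
        if haversine_km(la1, lo1, la2, lo2) / dt_h > max_kmh:
            return True
    return False

# New York to London in one hour: flagged
print(impossible_travel([(0, 40.71, -74.01), (3600, 51.51, -0.13)]))
```

Real systems would combine this with the sensor correlations mentioned above (accelerometer, orientation) to cut false positives from GPS glitches.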
5. Harness the two-way value street of iOS and Android.
Many app developers use an iOS-centric design form for all UX/UI. That is, they copy, paste, and patch iOS design schemes and styles onto any mobile device OS. Android has many unique and powerful functions like tactile animations and floating buttons. Hire dedicated experts in Android and iOS design and development and enable each team to complement (not copy from) the other. While you’re at it, focus your UX/UI to accommodate more one-handed use — it’s happening more and more with the use of larger phones.
2016 will be all about predicting and anticipating consumer behavior. In order to take advantage of the engagement potential that geospatial technologies offer, you’ll need more and more data, both in user volume and metrics, to gain and maintain a competitive advantage. Go big with investments in machine learning and hire the smartest data scientists out there to analyze data sets to understand exactly what your users want, at the perfect time, and the perfect place.
Lukas Sliwka is the CTO at Grindr LLC. Follow him: @LukasRepublic.
"
|
15,720 | 2,016 |
"Locate this! The battle for app-specific maps | VentureBeat"
|
"https://venturebeat.com/2016/02/13/locate-this-the-battle-for-app-specific-maps"
|
"Locate this! The battle for app-specific maps
It’s no longer easy to get lost. Quite the opposite, we expect and rely on maps for our most common Internet tasks from basic directions to on-demand transportation, discovering a new restaurant or finding a new friend.
And the battle is on between the biggest public and private companies in the world to shore up mapping data and geo-savvy engineering talent. From there, the race continues to deliver the best mapping apps.
In August, a consortium of the largest German automakers including Audi, BMW, and Daimler (Mercedes) bought Nokia’s Here mapping unit, the largest competitor to Google Maps, for $3.1 billion.
A New York Times story on the deal noted that, “amid a general scramble for talent, Google, the Internet search company, has undergone specific raids from unicorns for engineers who specialize in crucial technologies like mapping.” Wrapping our planet in mobile devices gave birth to a new geographic landscape, one where location meets commerce and maps play a critical role. In addition to automakers like the German consortium having a stake in owning and controlling mapping data and driver user experiences, the largest private companies, like Uber and Airbnb, depend on maps as an integral part of their applications.
That’s one reason purveyors of custom maps like Mapbox have emerged to handle mapping applications for companies like Foursquare, Pinterest, and Mapquest.
Mapbox raised $52.6 million last summer to continue its quest.
Mapbox and many others in the industry have benefitted from the data provided by OpenStreetMap , a collection of mapping data free to use under an open license. Of course some of the largest technology companies in the world besides Google maintain their own mapping units including Microsoft (Bing Maps) and Apple Maps.
Investment in the Internet of Things combined with mobile device proliferation are creating a perfect storm of geolocation information to be captured and put to use. Much of this will require an analytics infrastructure with geospatial intelligence to realize its value.
In a post titled, Add Location to Your Analytics, Gartner notes: The Internet of Things (IoT) and digital business will produce an unprecedented amount of location-referenced data, particularly as 25 billion devices become connected by 2020, according to Gartner estimates.
And more specifically: Dynamic use cases require a significantly different technology that is able to handle the spatial processing and analytics in (near) real time.
Of course, geospatial solutions have been around for some time, and database providers often partner with the largest private geospatial company, Esri, to bring them to market. In particular, companies developing in-memory databases like SAP and MemSQL have showcased work with Esri. By combining the best in geospatial functions with real-time, in-memory performance, application makers can deliver app-specific maps with unprecedented level of consumer interaction.
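The core primitive those in-memory systems expose is a fast radius query over a constantly updating point set. A toy in-memory grid index makes the idea concrete; the cell size, approximate planar distance, and sample data are all illustrative choices, not any vendor's implementation:

```python
# Sketch: a minimal in-memory grid index for near-real-time radius
# queries. Cell size and sample data are illustrative assumptions.
from collections import defaultdict
import math

CELL = 0.01  # grid cell size in degrees (~1 km of latitude)

class GridIndex:
    def __init__(self):
        self.cells = defaultdict(list)

    def _key(self, lat, lon):
        return (int(lat // CELL), int(lon // CELL))

    def insert(self, name, lat, lon):
        self.cells[self._key(lat, lon)].append((name, lat, lon))

    def near(self, lat, lon, radius_km):
        """Scan only cells the radius can touch, then filter exactly."""
        r_deg = radius_km / 111.0  # ~111 km per degree of latitude
        ky, kx = self._key(lat, lon)
        span = int(r_deg / CELL) + 1
        out = []
        for dy in range(-span, span + 1):
            for dx in range(-span, span + 1):
                for name, plat, plon in self.cells.get((ky + dy, kx + dx), []):
                    # Equirectangular approximation is fine at city scale
                    dlat = (plat - lat) * 111.0
                    dlon = (plon - lon) * 111.0 * math.cos(math.radians(lat))
                    if math.hypot(dlat, dlon) <= radius_km:
                        out.append(name)
        return out

idx = GridIndex()
idx.insert("Cafe", 34.05, -118.25)
idx.insert("Outlet", 34.30, -118.25)  # ~28 km north, outside a 5 km query
print(idx.near(34.05, -118.25, 5))
```

Production databases replace the grid with R-trees or geohash indexes, but the two-phase pattern (coarse bucket scan, then exact filter) is the same.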
Google’s balloons and Facebook’s solar powered drones may soon eliminate the dead zones from our planet, perhaps removing the word “lost” from our vocabulary entirely. Similarly, improvements in interior mapping technology guarantee location specific details down to meters. As we head to this near-certain future, maps, and the rich, contextual information they provide, appear to be a secret weapon to delivering breakout application experiences.
Gary Orenstein is chief marketing officer at MemSQL.
"
|
15,721 | 2,020 |
"How machine learning is identifying and tracking pandemics like COVID-19 | VentureBeat"
|
"https://venturebeat.com/2020/08/27/how-machine-learning-is-identifying-and-tracking-pandemics-like-covid-19"
|
"How machine learning is identifying and tracking pandemics like COVID-19
Presented by AWS Machine Learning
In 2003, the SARS outbreak took the world by surprise. While it pales in comparison to the current pandemic, in short order the disease incapacitated countries around the world, infecting more than 8,000 people, and causing billions of dollars in damage.
“For me, the SARS outbreak was an eye-opening event,” says Dr. Kamran Khan, infectious disease physician, professor of medicine and public health at the University of Toronto, and founder and CEO of BlueDot. “I recognized that we’d never seen anything like it before, but there would be more outbreaks like this again in the future.” Khan spent the next 10 years studying infectious disease spread, looking for a way to better detect and respond to threats like SARS and the ones that followed.
By 2013, machine learning technology had advanced to the point where he was able to put his vision of a digital global warning system into action — and BlueDot was born. Now, the company’s machine learning algorithms use billions of data points from across a broad spectrum of sources to detect potential outbreaks, track current ones, and predict how the disease will continue to spread.
Powered by AWS, their ML platform anticipates the spread and impact of over 150 different pathogens, toxins, and syndromes in near-real time. With this critical information, they’re able to advise governments, public health organizations, and other clients on how to disrupt the threat of pandemics — and help get ongoing disease spread under control.
“Time is everything during an outbreak, and a pandemic is a global emergency,” Khan says. “The ability to quickly generate insights and get those insights out to the rest of the world is essential, and machine learning is key to that ability.” Fast-forward five years, and the world saw the arrival of the latest virus, the one that would change people’s lives on a global scale.
BlueDot first detected the coronavirus outbreak in Wuhan on December 31, 2019. It was just a few hours after the first cases were diagnosed by local authorities. With this early information, they were able to send out alerts almost a week before any official announcements were made by the Chinese government or international health organizations. This was just the beginning of their COVID-19 work.
The pandemic-fighting power of ML Pandemics pose a complex challenge — and the urgency to solve the problem is growing. BlueDot’s outbreak detection solution is unique, and particularly powerful, because of the way it combines public health and medical expertise with advanced data analytics and machine learning on AWS. This enables them to track, contextualize, and anticipate infectious disease risks.
The company’s software consists of a machine learning platform that leverages billions of data points from a vast array of sources in over 65 languages. It’s constantly scanning foreign-language news reports, animal and plant disease networks, official government announcements, and more than 100 datasets with proprietary algorithms to identify new outbreaks.
AWS is key to processing all of this data, using custom machine learning algorithms that rely on natural language processing to make sense of and structure all of the data. Using Amazon Elastic Compute Cloud (EC2), they can process massive amounts of unstructured text data into organized, structured, spatiotemporal pathogen data — identifying the space, time, and name of the pathogen. For instance, the word “plague” might refer to an outbreak, or it might refer to a component of a fantasy video game. This is where subject matter experts have worked with data scientists to train the platform to process all of this information and organize it, so that the algorithm can differentiate the article that’s about the heavy metal band Anthrax from an actual outbreak of anthrax. The algorithm can also eliminate duplicates from among multiple stories being written about an event.
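The disambiguation step can be sketched as context scoring: a pathogen mention only becomes a structured record if disease-related context outweighs conflicting context. This toy version is not BlueDot's pipeline; the keyword lists and rule are assumptions standing in for trained models:

```python
# Sketch: turning raw text into a structured (pathogen, place, date)
# record with crude context disambiguation — a toy stand-in for the
# trained NLP pipeline described above. Keyword lists are assumptions.
import re

PATHOGENS = {"anthrax", "plague", "measles", "dengue"}
DISEASE_CONTEXT = {"outbreak", "cases", "infected", "hospital", "symptoms"}
OTHER_CONTEXT = {"band", "album", "tour", "concert", "game"}

def extract_outbreak(text, place, date):
    """Return a structured record, or None if the mention looks non-medical."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    hits = words & PATHOGENS
    if not hits:
        return None
    # Require disease context to outweigh conflicting context
    if len(words & DISEASE_CONTEXT) <= len(words & OTHER_CONTEXT):
        return None
    return {"pathogen": sorted(hits)[0], "place": place, "date": date}

news = "Officials report 14 infected in a suspected anthrax outbreak."
noise = "The band Anthrax announced a world tour and a new album."
print(extract_outbreak(news, "Springfield", "2020-08-01"))
print(extract_outbreak(noise, "Springfield", "2020-08-01"))  # None
```

The real system does this across 65 languages with learned models rather than keyword sets, but the output shape (pathogen, space, time) is the same.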
“We would need hundreds of people if we did this all manually,” Khan says. “This is where machine learning can allow us to process and make sense of this vast amount of unstructured data in all these various languages to find the metaphorical needles in the haystack.” Once the algorithms extract the place and the time of a potential outbreak, the platform adds context. It cross-references this information with other complementary data, such as how many people live in that area, where are the neighboring airports, are there direct flights out of the region, where do they go, and with how many passengers? What’s the temperature like? And so on, adding private sector data to the analysis.
BlueDot also incorporates anonymized air traffic data to follow the movement of passengers to anticipate where diseases might disperse around the planet, as well as anonymous location data from 400 million mobile devices worldwide.
With respect to the microbe, the algorithm can parse the data to identify what type it is, from flu or measles to dengue fever. And once the pathogen is identified, it can add their own internal knowledge of the disease, such as how it is spread, the clinical manifestation of the disease, whether there’s a vaccine, and what the mortality rate is.
Tracking COVID-19 BlueDot’s machine learning algorithm identified early news of pneumonia of unknown origin from Chinese news reports. The machine learning algorithms translated the text, analyzed the data, and alerted BlueDot scientists that a serious situation was beginning to brew in Wuhan.
The company’s experts in epidemiology, medicine, and public health confirmed that a potential outbreak, similar to the event that started in Guangdong province with the SARS outbreak, was occurring, and posed a legitimate threat. Then the location of the outbreak was cross-referenced using a variety of models that found where the neighboring airports were via spatial models and spatial analytics using GIS (geographic information systems).
It found the locations of the airports, automatically connected all of the flight data, and passenger-level data, and conducted an analysis to find all the potential destinations the disease could be spread to. Machine learning lets them track an outbreak like this continuously, and at scale, Khan says.
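The dispersal step described above reduces to a join: find airports within reach of the outbreak, then union their direct destinations. A minimal sketch with made-up airports, coordinates, and routes (only the overall shape of the analysis is from the article):

```python
# Sketch: connecting an outbreak location to nearby airports and their
# direct destinations. Airports, coordinates, and routes are sample data.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in km."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def exposed_destinations(outbreak, airports, routes, within_km=300):
    """All direct destinations reachable from airports near the outbreak."""
    nearby = [code for code, (lat, lon) in airports.items()
              if haversine_km(outbreak[0], outbreak[1], lat, lon) <= within_km]
    dests = set()
    for code in nearby:
        dests.update(routes.get(code, []))
    return sorted(dests)

airports = {"WUH": (30.78, 114.21), "PEK": (40.08, 116.58)}
routes = {"WUH": ["BKK", "NRT", "SIN"], "PEK": ["LAX"]}
# Outbreak near Wuhan: only WUH is within 300 km
print(exposed_destinations((30.59, 114.30), airports, routes))
```

Weighting each destination by passenger volume, as BlueDot does, would turn this membership list into a ranked risk list like the one in their January 2020 paper.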
Above: Tracking Wuhan final destinations “For every single outbreak that appears in the world every day, we’re able to identify every other location on the planet that may be connected to it and should be aware of that particular event,” he explains. “That way we’re anticipating its potential arrival, not just responding or reacting to it when it shows up.” Concerned about the parallels with the SARS outbreak, the company’s scientists made their insights available to the broader public by publishing a peer-reviewed scientific paper, which appeared on January 13. It identified the places the outbreak could travel to next. Of the 20 cities the paper listed, 12 of those were among the first cities that were impacted by COVID-19. The number-one city on the list was Bangkok, and Bangkok was the first city in the world that had a case of COVID-19 reported as it spread outside of mainland China.
As cities started to go into lockdown, implementing stay-at-home orders to slow transmission of this virus, they were able to use mobile phone data to understand how well social distancing interventions were being adhered to. This allowed public health messages to be strategically targeted to the places most needing the message, and helped fight the disease on as many fronts as possible as countries begin to develop their reopening strategies.
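Measuring adherence from mobility data comes down to comparing trip counts against a pre-lockdown baseline, per region. A simplified sketch; the regions and figures are invented for illustration:

```python
# Sketch: social-distancing adherence as percent reduction in trips
# versus a pre-lockdown baseline, per region. Figures are made up.

def adherence(baseline_trips, current_trips):
    """Percent reduction in trips vs baseline, keyed by region."""
    return {region: round(100.0 * (1 - current_trips[region] / trips), 1)
            for region, trips in baseline_trips.items()}

baseline = {"downtown": 120000, "suburbs": 80000}
current = {"downtown": 36000, "suburbs": 60000}
print(adherence(baseline, current))
# {'downtown': 70.0, 'suburbs': 25.0}
```

A low score flags exactly the regions where the article says targeted public health messaging is needed.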
The future of disease detection “We’re actively researching ways machine learning can better anticipate the spread, impact and consequence of global diseases,” Khan says. “Without a high-performance computing environment, it wouldn’t be possible to make sense of all this information.” Meanwhile, they’re not losing sight of Ebola activity in the Democratic Republic of Congo, or an outbreak of Lassa fever, or other types of diseases that can’t be ignored. This machine learning platform is critical to monitoring threats on an ongoing basis. From early detection, to tracking leaps across continents, to mitigating the spread in airports and local communities, this technology is the most powerful ammunition scientists have.
“We’re deep in the fight against COVID-19 now, but we can’t stop looking at the next threat,” Khan says. “While we turn our attention to mitigating the current pandemic, a machine can keep its eye on everything else happening around the world.” Dig deeper : See more ways machine learning is being used to tackle today’s biggest social, humanitarian, and environmental challenges.
Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. Content produced by our editorial team is never influenced by advertisers or sponsors in any way. For more information, contact [email protected].
"
|
15,722 | 2,011 |
"HP to acquire enterprise player Autonomy for $10.3B (updated) | VentureBeat"
|
"https://venturebeat.com/2011/08/18/report-hp-bidding-10b-to-acquire-enterprise-player-autonomy"
|
"HP to acquire enterprise player Autonomy for $10.3B (updated)
HP will buy UK-based enterprise software company Autonomy for $10.3 billion, the company announced during today’s quarterly earnings call.
Autonomy shareholders will receive $42.11 a share, which is a 64 percent premium over Autonomy’s closing share price yesterday. If the deal is approved by regulators, HP will be making a strong push into enterprise software.
“HP wants to make a dramatic entry into information management,” Whit Andrews, Gartner vice president and analyst, told VentureBeat. “This would be a very visible entry into the enterprise market with a focus on unstructured information.” HP also today confirmed it will explore spinning off its huge personal computing business.
Additionally, HP said it would kill off its webOS device unit , specifically the HP TouchPad tablet that has failed to sell many units.
The $10.3 billion Autonomy acquisition, if it goes through, would be one of the largest deals in HP’s history. In the past year, HP bought 3PAR for $2.35 billion as well as ArcSight for $1.5 billion.
“HP has been sitting on the sidelines partnering with software companies rather than being an enterprise software company for several years now,” Melissa Webster, IDC vice president of content and digital media technologies told VentureBeat. “With Autonomy, HP can make a strong bid with eDiscovery, archival software, and governance, risk, and compliance solutions.” The bid to buy Autonomy and to spin off its PC business would mean HP wants to be a major software and business solutions player. Bloomberg’s report said HP CEO Leo Apotheker has placed the company’s emphasis on software and services.
“HP has been struggling for years with how to shift into software,” Andrews said. “Buying Autonomy would give them a new direction for software with strong clients, strong software, and a strong position in the market.” HP’s stock price fluctuated wildly on the news, with its share price going up on a day when tech stocks were hit hard. But the price quickly retreated sharply into the negative as traders digested the news, and the share price closed down 5.8 percent. In after-hours trading, HP lost another 5 percent, and its share price will likely continue to fluctuate.
"
|
15,723 | 2,012 |
"Autonomy founder lashes out at HP over failed acquisition | VentureBeat"
|
"https://venturebeat.com/2012/11/27/autonomy-founder-strikes-out-at-hp-over-failed-acquisition"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Autonomy founder lashes out at HP over failed acquisition Share on Facebook Share on X Share on LinkedIn Autonomy founder Michael Lynch and HP, which acquired the company last year, are trading public statements over alleged financial improprieties at Autonomy that led HP to write off $8.8 billion recently.
Michael Lynch, the founder of Autonomy, is publicly trading blows with HP.
HP acquired Autonomy for $11.1 billion in 2011, but wrote down $8.8 billion of that earlier this month after it discovered what it called “ serious accounting improprieties ” in Autonomy’s books.
“I utterly reject all allegations of impropriety,” Lynch wrote in an open letter to the board published today.
Lynch is not the only one defending his reputation. The allegations have sent everyone associated with the deal running for cover. Former HP CEO Léo Apotheker, who drove the deal, said he was “stunned” to learn there were issues.
Lynch’s letter has a string of detailed questions about how things could have gone so badly wrong, whether they were known before the acquisition, and how accounting shenanigans could have accounted for $5 billion of that $8.8 billion write-down, as HP alleges. His questions also cast doubt on HP’s management of Autonomy, post-acquisition, and suggest that he has no idea where HP’s allegations are coming from.
“I have been truly saddened by the events of the past months, and am shocked and appalled by the events of the past week,” Lynch concludes.
HP wasted no time in firing back with a brief statement of its own. In a nutshell: We’ve handed this over to the authorities, and have nothing more to say at this time.
“While Dr. Lynch is eager for a debate, we believe the legal process is the correct method in which to bring out the facts and take action on behalf of our shareholders,” said HP’s statement. “In that setting, we look forward to hearing Dr. Lynch and other former Autonomy employees answer questions under penalty of perjury.” Both parties have everything at stake.
HP, which has been flailing lately, is fighting for its life.
And Lynch, while he may have a lot of cash in the bank from his successful exit, has his professional reputation on the line.
The gloves are off, folks.
A copy of Lynch’s letter is below, via AllThingsD , which published it earlier today.
On 20 November Hewlett-Packard (HP) issued a statement accusing unspecified members of Autonomy’s former management team of serious financial impropriety. It was shocking that HP put non-specific but highly damaging allegations into the public domain without prior notification or contact with me, as former CEO of Autonomy.
I utterly reject all allegations of impropriety.
Autonomy’s finances, during its years as a public company and including the time period in question, were handled in accordance with applicable regulations and accounting practices. Autonomy’s accounts were overseen by independent auditors Deloitte LLC, who have confirmed the application of all appropriate procedures including those dictated by the International Financial Reporting Standards used in the UK.
Having no details beyond the limited public information provided last week, and still with no further contact from you, I am writing today to ask you, the board of HP, for immediate and specific explanations for the allegations HP is making. HP should provide me with the interim report and any other documents which you say you have provided to the SEC and the SFO so that I can answer whatever is alleged, instead of the selective disclosure of non-material information via background discussions with the media.
I believe it is in the interest of all stakeholders, and the public record, for HP to respond to a number of questions:

• Many observers are stunned by HP’s claim that these allegations account for a $5 billion write down and fail to understand how HP reaches that number. Please publish the calculations used to determine the $5 billion impairment charge. Please provide a breakdown of the relative contribution for revenue, cash flow, profit and write down in relation to:
  • The alleged “mischaracterization” of hardware that HP did not realize Autonomy sold, as I understand this would have no effect on annual top or bottom lines and a minor effect on gross margin within normal fluctuations and no impact on growth, assuming a steady state over the period;
  • The alleged “inappropriate acceleration of revenue recognition with value-added resellers” and the “[creation of] revenue where no end-user customer existed at the time of sale”, given their normal treatment under IFRS; and
  • The allegations of incorrect revenue recognition of long-term arrangements of hosted deals, again given the normal treatment under IFRS.
• In order to justify a $5 billion accounting write down, a significant amount of revenue must be involved. Please explain how such issues could possibly have gone undetected during the extensive acquisition due diligence process and HP’s financial oversight of Autonomy for a year from acquisition until October 2012 (a period during which all of the Autonomy finance reported to HP’s CFO Cathie Lesjak).
• Can HP really state that no part of the $5 billion write down was, or should be, attributed to HP’s operational and financial mismanagement of Autonomy since the acquisition?
• How many people employed by Autonomy in September 2011 have left or resigned under the management of HP?
• HP raised issues about the inclusion of hardware in Autonomy’s IDOL Product revenue, notwithstanding this being in accordance with proper IFRS accounting practice. Please confirm that Ms Whitman and other HP senior management were aware of Autonomy’s hardware sales before 2012. Did Autonomy, as part of HP, continue to sell third-party hardware of materially similar value after acquisition? Was this accounted for by HP and was this reported in the Autonomy segment of their accounts?
• Were Ms Whitman and Ms Lesjak aware that Paul Curtis (HP’s Worldwide Director of Software Revenue Recognition), KPMG and Ernst & Young undertook in December 2011 detailed studies of Autonomy’s software revenue recognition with a view to optimising for US GAAP?
• Why did HP senior management apparently wait six months to inform its shareholders of the possibility of a material event related to Autonomy?

Hewlett Packard is an iconic technology company, which was historically admired and respected all over the world. Autonomy joined forces with HP with real hopes for the future and in the belief that together there was an opportunity to make HP great again. I have been truly saddened by the events of the past months, and am shocked and appalled by the events of the past week.
I believe it is in the best interests of all parties for this situation to be resolved as quickly as possible.
I am placing this letter in the public domain in the interests of complete transparency.
And here’s a copy of HP’s full statement, which we received from an HP spokesperson today.
HP has initiated an intense internal investigation into a series of accounting improprieties, disclosure failures and outright misrepresentations that occurred prior to HP’s acquisition of Autonomy. We believe we have uncovered extensive evidence of a willful effort on behalf of certain former Autonomy employees to inflate the underlying financial metrics of the company in order to mislead investors and potential buyers.
The matter is in the hands of the authorities, including the UK Serious Fraud Office, the US Securities and Exchange Commission’s Enforcement Division and the US Department of Justice, and we will defer to them as to how they wish to engage with Dr. Lynch. In addition, HP will take legal action against the parties involved at the appropriate time.
While Dr. Lynch is eager for a debate, we believe the legal process is the correct method in which to bring out the facts and take action on behalf of our shareholders. In that setting, we look forward to hearing Dr. Lynch and other former Autonomy employees answer questions under penalty of perjury.
"
|
15,724 | 2,012 |
"Autonomy now a challenger for worst corporate acquisition ever | VentureBeat"
|
"https://venturebeat.com/2012/12/02/autonomy-worst-merger"
|
"Autonomy now a challenger for worst corporate acquisition ever

HP's failed acquisition of Autonomy is arguably even worse than the AOL-Time Warner merger in 2000. What's more, the company ignored a long list of warning signs about Autonomy ahead of the acquisition.
Reports continue to mount about just how horribly HP botched its acquisition of software company Autonomy.
HP wrote off $8.8 billion in November , $5 billion of which it attributed to accounting improprieties at Autonomy. That’s after writing down $8 billion in goodwill in August, 2012. HP has repeatedly insisted that it had no knowledge of the accounting problems prior to the $11.1 billion acquisition in August, 2011.
But HP’s executives and board ignored a long list of warning signs, a well-researched report from the Mercury News explains. And once it announced the deal, HP proceeded to close it in spite of a loud outcry, both within and without the company, as described in a detailed recap of the acquisition’s high and low points in the New York Times.
“Our concern was the organic growth that Autonomy was reporting was overstated,” said Dan Mahoney, research director at CFRA, in the Merc’s report. Mahoney’s firm focuses on companies where it believes publicly reported financial results don’t accurately portray what’s going on; it began researching and issuing reports on Autonomy as early as 2007.
Others joined the chorus of negativity from 2009 to 2011, as detailed in the Merc’s story:
• Paul Morland of London brokerage Peel Hunt issued a warning in August, 2009, stating that its financial reports were “wrong and misleading” and that its growth was “exaggerated.”
• James Chanos of the hedge fund Kynikos Associates shorted Autonomy in 2010, explaining in August of that year that “the accounting was absolutely dreadful, a disaster.”
• J.P. Morgan analyst David Khan wrote critically about the company in September, 2010.
• Deutsche Bank analyst Marc Geall wrote “things are looking bad” regarding the company’s sales, in March, 2011.
Autonomy, founded by Michael Lynch in 1996, makes search software that can help identify trends latent in unstructured data, such as customer-service reports and social media. The company’s technology has widely been praised for over a decade, but it has also been dogged by allegations that it’s difficult and expensive to integrate with corporations’ existing technology infrastructures.
HP’s acquisition was meant to shore up the company’s flagging fortunes in hardware sales, adding a high-margin software business to its tepid server business, and giving it purchase in the increasingly-trendy “big data” market, in which companies seek to mine large volumes of unstructured data for insights into their customers and markets.
But many started calling the wisdom of HP’s acquisition into question as soon as the company announced its plans. Perhaps most prominent was the voice of Catherine A. Lesjak, HP’s chief financial officer (and briefly its interim CEO in 2010), according to the Times report. Lesjak spoke passionately against the merger to the board, with such forcefulness that she was reportedly fearful for her job. She also canceled a scheduled appearance to defend the deal.
The merger, the Times reports, ran into trouble almost immediately. “We tried really hard to make this work,” Lynch is quoted as saying. “Instead of doing it the Autonomy way, which is to sweep problems out of the way and move full steam ahead, we got bogged down in H.P. process.” The stock market punished HP for its planned deal almost immediately, dropping the share price precipitously and initiating a long downward slide from which it has never recovered.
HPQ data by YCharts Before a year was out, HP was forced to admit, in May, 2012, that Autonomy’s revenues had missed expectations. And then, in November, it dropped the bomb about writing off billions in value.
Former chief executive Léo Apotheker, who lost his job just one month after sealing the Autonomy deal, also said he was stunned to learn of the problems , and Lynch has also recently claimed innocence , as well as casting doubt on HP’s claims.
Even Whitman, who was on the board at the time of the merger and since assumed the CEO’s chair, has admitted disappointment over Autonomy’s performance and regret over the merger.
Now, according to Toni Sacconaghi, a Sanford Bernstein analyst quoted by the Times, the Autonomy deal may “go down as the worst, most value-destroying deal in the history of corporate America.” Although the deal was smaller than the previous contender, AOL’s acquisition of Time Warner in 2000, it’s arguably worse because it destroyed a larger percentage of the acquiring company’s value. AOL stock dropped almost 50 percent, from $300 billion to $159 billion, between January, 2000 and December, 2001.
But HP’s value has dropped from $61 billion at the time it announced the Autonomy acquisition to $25 billion today, a drop of almost 60 percent.
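The comparison rests on simple arithmetic, which can be checked with a short sketch (the market-cap figures and dates below are the ones quoted in the article, not fresh market data):

```python
# Percentage of the acquirer's market value lost, per the figures cited above.
def pct_drop(before_bn, after_bn):
    """Percent decline from before_bn to after_bn (both in $ billions)."""
    return (before_bn - after_bn) / before_bn * 100

aol_drop = pct_drop(300, 159)  # AOL, Jan 2000 to Dec 2001
hp_drop = pct_drop(61, 25)     # HP, Autonomy announcement to late 2012
print(f"AOL: {aol_drop:.0f}%  HP: {hp_drop:.0f}%")
```

This reproduces the article's figures: AOL's drop works out to about 47 percent ("almost 50 percent"), HP's to about 59 percent ("almost 60 percent").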
And even at a 10-year low, many analysts still think HP is too expensive.
Titanic stamp image: catwalker / Shutterstock.com
"
|
15,725 | 2,017 |
"AI cybersecurity startup Darktrace raises $75 million, now valued at $825 million | VentureBeat"
|
"https://venturebeat.com/2017/07/11/ai-cybersecurity-startup-darktrace-raises-75-million-now-valued-at-825-million"
|
"AI cybersecurity startup Darktrace raises $75 million, now valued at $825 million
Darktrace , a cybersecurity platform that detects threats in real time using machine learning that adapts and learns over time, has raised $75 million in a series D round of funding led by Insight Venture Partners, with participation from Summit Partners, KKR, and TenEleven Ventures.
The latest raise comes exactly a year after the U.K. outfit announced a $64 million funding round, which included participation from Japanese telecom giant Softbank. Darktrace has raised $180 million in total since its inception, and the company now claims a post-funding valuation of $825 million.
Founded in 2013 by mathematicians from the University of Cambridge, England, Darktrace touts its “enterprise immune system,” which sits on a company’s network to create “unique behavioral models for every user and device.” This in turn enables the platform to leverage artificial intelligence (AI) to identify patterns, spot subtle changes, and thwart cyberattacks before they happen.
Cyber shortage

The global cybersecurity workforce is expected to be short 1.8 million people by 2022, according to some reports, so automation and AI could prove pivotal in helping protect companies — especially when a growing number of attacks are themselves automated.
Indeed, we will likely see more “machine versus machine” battles in the online security realm, which is why AI-infused cybersecurity technology is ripe for investment. Just last month, Microsoft confirmed it was snapping up Hexadite to bring AI to Windows 10 enterprise security. Other notable companies operating in the space include Cylance, which raised $100 million last year.
Now with headquarters in both Cambridge (U.K.) and San Francisco, Darktrace claims it has 3,000 customers and says over the past year its workforce has doubled to 500 employees working across 24 global offices.
"
|
15,726 | 2,018 |
"Deliveroo lost $284 million in 2018 amid bruising food delivery battle | VentureBeat"
|
"https://venturebeat.com/2019/10/02/deliveroo-lost-284-million-in-2018-amid-bruising-food-delivery-battle"
|
"Deliveroo lost $284 million in 2018 amid bruising food delivery battle

Bikers working for Deliveroo wait for their instructions at one of the first Deliveroo Editions in France kitchens on July 3, 2018.
Deliveroo’s revenues and losses grew rapidly in 2018, demonstrating both the soaring popularity of food delivery services and the financial fragility of these disruptive businesses.
London-based Deliveroo reported today that sales rose to $584 million in 2018, up 72% from $340 million in 2017. But losses also jumped to $284 million, up from $244 million.
The good news for Deliveroo is that the loss margin fell from 71% to 48%. The bad news is that loss is still dizzying and comes amid intensifying competition and some consolidation among rivals with immensely deep pockets.
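The loss-margin figures above follow directly from the reported revenue and loss numbers (a quick arithmetic sketch; the dollar amounts are the article's, in millions):

```python
# Loss margin = net loss as a share of revenue, using the article's figures.
def loss_margin(loss_m, revenue_m):
    return loss_m / revenue_m * 100

margin_2017 = loss_margin(244, 340)  # loss was ~71% of 2017 revenue
margin_2018 = loss_margin(284, 584)  # loss was ~48% of 2018 revenue
```

So even though the absolute loss grew, revenue grew faster, which is why the margin improved.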
For the moment, Deliveroo remains optimistic.
“Deliveroo is growing from strength-to-strength and expanding across our markets as more and more people want amazing food delivered straight to their door,” said Deliveroo CEO Will Shu in a statement. “We’re focused on our mission of becoming the definitive food company, and we’ve continued to invest heavily in expansion, technology and new products to meet this ambition.” Food delivery services have exploded in recent years by offering greater convenience to customers while allowing local restaurants to connect with new markets. But it largely remains unclear whether these can be sustainable businesses for the long run.
That poses a huge risk not only to investors who have pumped billions into these services, but also to the food service industry, which is broadly reorganizing itself to adapt and survive this sweeping transformation. If these delivery companies collapse before they can figure out the economics, then many restaurants will be left in the lurch.
U.S.-based Grubhub has managed to make money for several years, though its profit margin slipped from 14% in 2017 to 8% in 2018. But thanks to massive investments and aggressive pricing, Uber Eats has managed to surpass Grubhub in revenues.
Uber Eats is likely losing money , though we don’t know for sure since it’s not broken out in Uber’s financials.
Meanwhile, Netherlands-based Takeaway.com has been busy rolling up competition, including buying Delivery Hero’s German operations last year for about $1 billion and then merging this summer with London-based Just Eat in a deal valued at $5.7 billion.
The rationale is to slash costs, with the goal of becoming profitable next year.
To fend off this competition, Deliveroo raised $575 million last May in a round led by Amazon , bringing its total raised to a staggering $1.5 billion in six years. That followed reports that Amazon had made repeated attempts to acquire the company. Still, Deliveroo also retreated from the German market this year in the face of cutthroat competition.
For now, Deliveroo said its growth last year was fueled by the addition of a marketplace that allows restaurants with their own delivery drivers to join the platform. Going forward, it’s hoping to expand its presence in the 13 markets where it currently operates. In 2019, Deliveroo has added 50 towns in the U.K. alone, and it’s trying to compete for delivery personnel by offering free insurance.
The company is also doubling its investment in technology while launching new B2B services that help restaurants negotiate better prices for ingredients.
As investor tolerance for companies with no clear path to profitability wanes (at least for the moment), how quickly these strategies help Deliveroo narrow its losses could determine whether it can remain an independent company.
"
|
15,727 | 2,021 |
"Survey finds 96% of execs are considering adopting 'defensive AI' against cyberattacks | VentureBeat"
|
"https://venturebeat.com/2021/04/08/survey-finds-96-of-execs-are-adopting-offensive-ai-against-cyberattacks"
|
"Survey finds 96% of execs are considering adopting ‘defensive AI’ against cyberattacks
“Offensive AI” will enable cybercriminals to direct attacks against enterprises while flying under the radar of conventional, rules-based detection tools. That’s according to a new survey published by MIT Technology Review Insights and Darktrace, which found that more than half of business leaders believe security strategies based on human-led responses are failing.
The MIT and Darktrace report surveyed more than 300 C-level executives, directors, and managers worldwide to understand how they perceive the cyberthreats they’re up against. A high percentage of respondents (55%) said traditional security solutions can’t anticipate new AI-driven attacks, while 96% said they’re adopting “defensive AI” to remedy this. Here, “defensive AI” refers to self-learning algorithms that understand normal user, device, and system patterns in an organization and detect unusual activity without relying on historical data.
Sixty-eight percent of the executives surveyed expressed concern about attacks employing AI for impersonation and phishing, while a smaller majority said they’re worried about more effective ransomware (57%), misinformation and the undermining of data integrity (56%), and the disruption of remote workers by targeting home networks (53%). Of the respondents, 43% underlined the damaging potential of deepfakes , or media that takes a person in an existing image, audio recording, or video and replaces them with someone else’s likeness using AI.
As the report’s coauthors write, when offensive AI is thrown into the mix, “fake email” could become nearly indistinguishable from trusted contact messages. And with employees working remotely during the pandemic — without the security protocols of the office — organizations have seen successful phishing attempts skyrocket. Google has registered over 2 million phishing websites since the start of 2020, when the pandemic began — a 19.91% increase compared with 2019.
Businesses are increasingly placing their faith in defensive AI to combat the growing cyberthreats. Known as an autonomous response, defensive AI can interrupt in-progress attacks without affecting day-to-day business. For example, given a strain of ransomware an enterprise hasn’t encountered in the past, defensive AI can identify the novel and abnormal patterns of behavior and stop the ransomware even if it isn’t associated with publicly known compromise indicators (e.g., blacklisted command-and-control domains or malware file hashes).
According to the survey, 44% of executives are assessing AI-enabled security systems and 38% are deploying autonomous response technology. This agrees with findings from Statista. In a 2019 analysis, the firm reported that around 80% of executives in the telecommunications industry believe their organization wouldn’t be able to respond to cyberattacks without AI.
Reflecting the pace of adoption, the AI in cybersecurity market will reach $38.2 billion in value by 2026, Markets and Markets projects.
That’s up from $8.8 billion in 2019, representing a compound annual growth rate of around 23.3%.
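The quoted rate is simply the compound annual growth rate implied by those two market-size figures (a quick sketch; amounts in $ billions, with 2019 to 2026 counted as seven compounding years):

```python
# CAGR implied by growth from $8.8B (2019) to $38.2B (2026), i.e. 7 years.
def cagr_pct(start, end, years):
    return ((end / start) ** (1 / years) - 1) * 100

rate = cagr_pct(8.8, 38.2, 7)  # ~23.3% per year, matching the projection
```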
“With the onset of AI-powered attacks, organizations need to reform their strategies quickly, be prepared to defend their digital assets with AI, and regain the advantage over this new wave of sophisticated attacks,” the report’s coauthors wrote. “By automating the process of threat detection, investigation, and response, AI augments human IT security teams by stopping threats as soon as they emerge, so people have the time to focus on more strategic tasks at hand.”
"
|
15,728 | 2,020 |
"Canalys: Cloud spending hit record $31 billion in Q1 2020, but growth continues to slow | VentureBeat"
|
"https://venturebeat.com/2020/05/01/canalys-cloud-spending-hit-record-31-billion-in-q1-2020-but-growth-continues-to-slow"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Canalys: Cloud spending hit record $31 billion in Q1 2020, but growth continues to slow Share on Facebook Share on X Share on LinkedIn Are you looking to showcase your brand in front of the brightest minds of the gaming industry? Consider getting a custom GamesBeat sponsorship.
Learn more.
Companies spent a record $31 billion on cloud infrastructure services in Q1 2020, though the full impact of COVID-19 is unlikely to be realized until the second quarter.
New figures from Canalys indicate cloud spending grew 34.5% year-on-year (YoY) from $23.1 billion in the corresponding period last year. However, rising cloud spending was an established trend, with Q1 2019 figures revealing a 39.3% YoY increase and Q4 2019 showing a 37.2% YoY increase. These numbers show that while overall dollar spend continues to rise, the rate of growth is slowing.
The numbers indicate that cloud spending grew only 2.6% quarter-on-quarter through the end of March 2020, or around $800 million in absolute terms.
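The growth rates above can be reproduced from the rounded figures in the article (the small YoY discrepancy comes from rounding to $31 billion; the helper name is illustrative):

```python
def growth(previous: float, current: float) -> float:
    """Period-over-period growth rate."""
    return current / previous - 1

# Rounded Canalys figures: $23.1B in Q1 2019, ~$31B in Q1 2020
print(f"YoY: {growth(23.1, 31.0):.1%}")  # 34.2% from rounded inputs (Canalys reports 34.5%)

# Working backwards from the reported 2.6% QoQ growth gives the dollar increase
q4_2019 = 31.0 / 1.026
print(f"QoQ increase: ${31.0 - q4_2019:.2f}B")  # $0.79B, i.e. around $800 million
```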
Above: Canalys: Cloud spending: Q1 2020 Moreover, while Canalys attributes the growth in cloud infrastructure services spend to the sudden shift to remote working as COVID-19 hit, much of the global workforce didn’t begin working from home until March. That said, China — the second biggest cloud services market after the U.S. — embraced remote working earlier as it was first to feel the effects of the pandemic.
It’s no secret that consumers and businesses have been flocking to cloud-based services and remote working tools. Netflix doubled its expected signups for new customers for the last quarter, while Microsoft announced that it grew its daily active user base for Teams by 70% to 75 million.
Videoconferencing platform Zoom, meanwhile, saw its user base jump from 10 million in December to more than 200 million in March, and Google Meet passed 100 million daily active users — with 3 million new users joining daily.
But while cloud usage has certainly been up, service providers have experienced a downside — large enterprise projects such as SAP migrations, hybrid cloud deployments, and “other transformational projects” have been put on the back burner. This conservative approach to new spending may counter some of the growth seen elsewhere in the cloud services realm. Indeed, some of the industries hit worst by the global pandemic — such as hospitality, tourism, and construction — have cut or delayed planned cloud spending.
“We saw an unprecedented surge in demand and use of cloud-based applications primarily driven by remote working — not just collaboration tools, but also cloud security, which was also added to by increased ecommerce and other online activity,” Canalys chief analyst Matthew Ball told VentureBeat. “On the reverse, we saw an immediate slowdown of large enterprise consultative-led projects.”
In the cloud
All the major cloud service companies released their Q1 2020 figures this week, with global leader Amazon’s AWS passing the $10 billion milestone for the first time, a 33% YoY increase — however, its growth is also slowing. In Q2 2019, AWS saw its first sub-40% growth since Amazon began reporting AWS figures — growth dropped to 37%, followed by 35% in Q3 2019 and 34% in Q4 2019.
Elsewhere, Microsoft reported a 59% YoY revenue increase for Azure in the last quarter, compared to 62% for the preceding quarter and 73% for the corresponding period last year.
And while Google only recently started breaking out its Cloud figures, we know its revenues in Q1 2020 were up 52% YoY to $2.78 billion , compared to the 53% YoY rise it reported in Q4 2019.
It’s difficult to read too much into the impact COVID-19 has had on the cloud services and infrastructure market so far — data for the next quarter should be much more revealing.
What is clear, however, is that cloud services will likely only grow in demand if remote working continues in the future. All the major providers were already investing heavily in their cloud infrastructure, with Amazon recently opening its first African datacenters , in addition to a new region in Italy , while Google added a new region in Las Vegas. Alibaba, meanwhile, revealed plans to invest $28 billion in cloud infrastructure over the next three years after a surge in uptake of its various services during the COVID-19 outbreak led to service issues.
"
|
15,729 | 2,020 |
"Slice raises $43 million to help pizzerias go online | VentureBeat"
|
"https://venturebeat.com/2020/05/12/slice-raises-43-million-to-help-pizzerias-go-online"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Slice raises $43 million to help pizzerias go online Share on Facebook Share on X Share on LinkedIn A Slice pizzeria Are you looking to showcase your brand in front of the brightest minds of the gaming industry? Consider getting a custom GamesBeat sponsorship.
Learn more.
Slice , an online ordering platform designed to help small pizzerias compete with large chains, has raised $43 million in a series C round of funding led by KKR.
The cash injection comes as U.S. states continue to ease lockdown measures that were introduced in March to combat the COVID-19 crisis — while expecting brick-and-mortar stores to adapt their operations to a “new normal.” This partial lifting of restrictions emphasizes social distancing through increased support for deliveries and pickups and opens the door to platforms that help businesses go online.
Founded out of New York in 2009, Slice provides all the technological infrastructure, payment systems, customer service, and marketing to help restaurants capitalize on the $46 billion U.S. pizza market. While Slice offers its own branded app that lets customers search for pizzerias in their area, it also helps outlets set up individual websites to process orders.
A dedicated dashboard gives restaurants data about all their orders, with additional tools for managing promo codes, tweaking menus, setting opening hours, and so on.
Above: Slice restaurants’ own dashboard The company charges a set $2.25 commission plus 4% processing fee per order, compared with the 20% or more other ordering platforms may take. This can work out to quite a substantial difference if the average price of an order is over $30. Slice also makes money from additional add-on services, including a new Slice Delivery service that’s available in beta just now for some outlets that currently offer pickup only.
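To see why that fee structure matters on a roughly $30 order, here is a minimal sketch (function names are illustrative; the $2.25 flat fee, 4% processing fee, and 20% benchmark come from the paragraph above):

```python
def slice_fee(order_total: float) -> float:
    """Slice's per-order charge: flat $2.25 plus a 4% processing fee."""
    return 2.25 + 0.04 * order_total

def marketplace_fee(order_total: float, rate: float = 0.20) -> float:
    """A typical ordering platform's percentage commission (20% as the benchmark)."""
    return rate * order_total

# On a $30 order: Slice takes $3.45 vs. $6.00 for a 20% commission
for total in (20.0, 30.0, 50.0):
    print(f"${total:.0f} order -> Slice ${slice_fee(total):.2f} vs. 20% platform ${marketplace_fee(total):.2f}")
```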
“Because of COVID-19, the restaurant landscape is rapidly evolving,” a Slice spokesperson told VentureBeat. “Local pizzerias with delivery are seeing 3 times the orders than pickup-only shops. We’re looking to roll this [Slice Delivery] out broadly in the very near future.” Slice had previously raised around $30 million, including a $15 million tranche a year ago. According to the company’s founder and CEO Ilir Sela, Slice has seen a “significant surge in demand” across its platform due to the COVID-19 crisis. As restrictions shift in many states, Slice is well-financed to target pizzerias eager to get business back up and running. Indeed, Slice currently claims more than 12,000 pizzeria customers across the U.S., and it plans to expand its coverage to more restaurants domestically and internationally.
Going online
Social distancing measures enforced by the lockdown have led to a sizable uptick in online purchasing, and it’s unclear whether this trend will continue once life eventually returns to normal. However, there’s no guarantee when that will be, and sit-down restaurants could remain closed or heavily restricted for quite a while.
Uber Eats has sought to capitalize on this situation through a number of new initiatives, including enabling telephone orders for those without a smartphone, expanding support for corporate customers , and making it easier for people to place orders for friends and family. Meanwhile, payment processing giant Square last week launched a new online checkout product that makes it easy for anyone to accept payments online, and Google is now enabling retailers to advertise if they offer curbside pickups.
"
|
15,730 | 2,020 |
"Canalys: Cloud infrastructure spending grew 32% to $39.9 billion in Q4 2020 | VentureBeat"
|
"https://venturebeat.com/2021/02/03/canalys-cloud-infrastructure-spending-grew-32-to-39-billion-in-q4-2020"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Canalys: Cloud infrastructure spending grew 32% to $39.9 billion in Q4 2020 Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
The global cloud services infrastructure market grew to $39.9 billion in Q4 2020, a 32% year-on-year (YoY) increase and 10% quarter-on-quarter (QoQ) increase.
The figures from Canalys come in a seven-day period when the “big three” cloud providers revealed their quarterly earnings, with Amazon’s AWS , Microsoft’s Azure , and Alphabet’s Google Cloud reporting record sales, though growth slowed in some cases.
While cloud services infrastructure was already a major growth industry, the pandemic has accelerated this upward trend over the past year , with each quarter recording sharp inclines. The last three-month period once again represented the “largest quarterly expansion in dollar terms,” according to Canalys, driven by remote work and consumer services such as online gaming and music streaming.
Looking at the full calendar year, Canalys’ figures show that cloud spending hit $142 billion, up one-third from $107 billion the year before.
This growth has hugely benefited companies that operate on the cloud, including virtual video conferencing and remote collaboration tools like Zoom that have been on a tear.
Up front
Amazon’s AWS continues to lead the field, with revenues hitting $12.7 billion in Q4, up 28% on the corresponding period a year earlier. Compared to its 34% growth in Q4 2019, things aren’t moving as quickly as they once did, but analyst Patrick Moorhead says Amazon still “had a great quarter,” even if its growth is slowing.
“It’s important to note that it grew more in one quarter, $2.78 billion, larger than the entire annual revenue of many cloud plays,” Moorhead said. “AWS is well on its way to creating an annualized, $50 billion revenue company. This makes AWS larger than Salesforce.com and SAP combined. Equally impressive is that AWS delivered over half of the company’s operating profit.” Above: Q4 2020 cloud infrastructure services spend AWS now commands 32% of total cloud infrastructure spend, compared to 20% for Microsoft Azure, 7% for Google Cloud, and 6% for Alibaba.
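Moorhead's run-rate claim and the share figure can be reproduced from the quarterly numbers cited above (a simple sketch; variable names are illustrative):

```python
q4_aws_revenue = 12.7            # AWS Q4 2020 revenue, $B
run_rate = 4 * q4_aws_revenue    # simple annualized run rate
print(run_rate)  # 50.8 -> "well on its way to ... $50 billion"

total_market = 39.9              # Canalys Q4 2020 total cloud infrastructure spend, $B
print(f"AWS share: {q4_aws_revenue / total_market:.0%}")  # 32%, matching Canalys
```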
"
|
15,731 | 2,021 |
"Gartner predicts public cloud spending to reach $332B in 2021 | VentureBeat"
|
"https://venturebeat.com/2021/04/21/gartner-predicts-public-cloud-spending-to-reach-332b-in-2021"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Gartner predicts public cloud spending to reach $332B in 2021 Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Worldwide spending on public cloud services is expected to reach $332.3 billion in 2021, an increase of 23.1% from 2020, according to the latest Gartner forecast. Public cloud spending in 2020 was $270 billion.
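The headline forecast is consistent with the stated growth rate (figures from the paragraph above):

```python
spend_2020 = 270.0   # 2020 public cloud spending, $B
growth_rate = 0.231  # Gartner's forecast increase for 2021
projected_2021 = spend_2020 * (1 + growth_rate)
print(round(projected_2021, 1))  # 332.4, in line with the $332.3B forecast
```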
Cloud spending is getting a boost because emerging technologies such as containerization, virtualization, and edge computing are becoming more mainstream, Sid Nag, research vice president at Gartner, said in the forecast. Software-as-a-service (SaaS) remains the largest market segment and is forecast to reach $122.6 billion in 2021, spurred by the demand for composable applications.
“The events of last year allowed CIOs to overcome any reluctance of moving mission critical workloads from on-premises to the cloud,” Nag said. There was an expectation that some workloads had to stay on-premises, but the past year showed those concerns were unfounded. But even if there hadn’t been a pandemic , Nag said there was a “loss of appetite” for datacenters.
As organizations mobilize for a massive global effort to produce and distribute COVID-19 vaccinations, SaaS-based applications that enable essential tasks such as automation and supply chain management are critical. The fact that these applications demonstrate reliability in scaling vaccine management will help CIOs validate the ongoing shift to cloud, Gartner said.
Growth across all areas
Desktop-as-a-service (DaaS) will see the highest growth in 2021, growing 67.7% to reach $2 billion, followed by infrastructure-as-a-service (IaaS) at 38.5% to reach $82 billion. CIOs will boost IaaS and DaaS spending as they continue to face pressure to scale infrastructure that supports complex workloads and meets the demands of a hybrid workforce, Gartner said. Growth in IaaS and DaaS will slow in 2022, with spending reaching $106.8 billion and $2.7 billion, respectively.
Cloud spending will grow across all areas in 2021, Gartner said in its forecast. Business process services (business-process-as-a-service, or BPaaS) is forecast to reach $50.1 billion, and application infrastructure services (platform-as-a-service, or PaaS) will reach $59.4 billion. Cloud management and security services will reach $16 billion.
Cloud drives innovation
The public cloud market fared well over the past year as enterprises relied on cloud services to maintain business continuity in light of business disruptions and the shift to a remote workforce. However, cloud spending will look different in 2021 and 2022 as enterprises shift away from infrastructure and application migration and toward innovative applications combining cloud with technologies such as AI, internet of things, and 5G.
“Cloud will serve as the glue between many other technologies that CIOs want to use more of, allowing them to leapfrog into the next century as they address more complex and emerging use cases,” Nag said.
Companies depend on the cloud to adopt emerging technologies at scale, and the resulting applications and workloads encourage more cloud spending. For example, a company building an IoT application could use virtual machines and containers to build at scale, and then buy more cloud services to manage the data that is generated.
Gartner is not the only one noting that emerging technologies such as edge computing are growing rapidly. International Data Corporation (IDC) said the worldwide edge computing market is expected to reach $250.6 billion in 2024. The services market should account for 46.2% of all edge spending by 2024, followed by hardware at 32.2% and edge software at 21.6%, IDC said.
Earlier this month, Gartner predicted worldwide IT spending will reach $3.8 trillion in 2021, an increase of 4% from 2020. The biggest growth is expected in enterprise software, fueled by enterprises accelerating their digital transformation plans to deliver virtual services such as distance learning, telehealth, and automation. Other areas of spending include datacenter systems, communications, IT services, and devices.
PC spending is also up this year.
"
|
15,732 | 2,021 |
"Microsoft beats Q3 revenue expectations, spurred by strong cloud sales | VentureBeat"
|
"https://venturebeat.com/2021/04/27/microsoft-beats-q3-revenue-expectations-spurred-by-strong-cloud-sales"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Microsoft beats Q3 revenue expectations, spurred by strong cloud sales Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
( Reuters ) — Microsoft on Tuesday met analysts’ quarterly sales expectations and beat profit estimates, but its shares fell slightly as investors hoped for an even stronger performance after a year-long rally to a massive market valuation.
The Redmond, Washington company, which rode the booming market for cloud computing, has become one of the world’s most valuable companies, worth close to $2 trillion after its stock jumped 50% over the past year.
Microsoft has remained a household name during the pandemic through its Teams collaboration software. Sales have even boomed for its Windows operating system for PCs, which had waned for years as smartphones proliferated.
Microsoft’s Azure cloud service is gaining ground on market-share leader Amazon Web Services, and it is doubling down on productivity software used by businesses worldwide.
Revenue and adjusted earnings per share for the third quarter ended March 31 were $41.7 billion and $1.95 per share, above analysts’ estimates of $41.03 billion and $1.78 per share, according to data from Refinitiv.
Shares initially fell as much as 3.2% after the results were released, but they pared losses to 1.7%, at $257.50, after Microsoft executives gave a better-than-expected forecast during a conference call with investors.
“One-off tax and currency advantages have boosted Microsoft’s third-quarter numbers, and as a result the market isn’t being quite as welcoming of expectation-beating numbers as you might expect,” said Nicholas Hyett, equity analyst at Hargreaves Lansdown.
“That is the danger of trading on the kind of valuation Microsoft enjoys, 32.8 times next year’s earnings. Disappoint even a little and the market will be unforgiving.” Sales for what Microsoft calls its “commercial cloud” – which contains server infrastructure such as Azure along with cloud-based versions of its Office software – were up 33% at $17.7 billion. Sales for Dynamics 365, which competes directly with Salesforce.com, rose 45% and the business version of Office 365 added 15% more users.
“That’s the fourth consecutive quarter of 15% seat growth on a very large base,” Microsoft Chief Financial Officer Amy Hood said of the Office 365 results for commercial customers.
Microsoft has continued to double down on cloud-based software and said earlier this month it would buy artificial intelligence software firm Nuance Communications Inc for $16 billion, excluding net debt, to bolster its healthcare business.
Microsoft said Azure, its closely watched cloud computing business that competes with Amazon.com Inc’s Amazon Web Services and Alphabet’s Google Cloud, grew 50% in the quarter, or 46% when adjusted for currency variations. This is down from a currency-adjusted 48% the quarter before but in line with analysts’ expectations of 46.3% growth, according to data from Visible Alpha.
Overall sales at Microsoft’s “intelligent cloud” unit that contains Azure were $15.1 billion, above analysts’ estimates of $14.92 billion, according to Refinitiv data.
Microsoft Teams has 145 million daily users, up from 115 million in October, Microsoft said. Sales for Microsoft’s productivity software unit, which includes Office and Teams, were $13.6 billion, compared with estimates of $13.49 billion, according to Refinitiv.
Sales for its LinkedIn social network were up 23% on a currency adjusted basis, slightly above Visible Alpha estimates of 21.9%, as revenue continued to recover from a sharp decline in job listings and hiring at the onset of the pandemic.
Microsoft’s personal computing unit, which contains its Windows operating system and Xbox gaming console, had $13.0 billion in sales, compared with analysts’ expectations of $12.57 billion, according to Refinitiv data. Sales of Windows to PC makers were up 10%, compared to a 1% rise the quarter earlier.
On a call with investors, Microsoft forecast fiscal fourth-quarter productivity segment revenue with a midpoint of $13.93 billion, above Refinitiv estimates of $13.57 billion. Its sales forecasts for its intelligent cloud and personal computing businesses had midpoints of $16.32 billion and $13.80 billion, respectively, above estimates of $16.0 billion and $13.26 billion, according to Refinitiv data.
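A hypothetical helper makes the beat-versus-estimate comparisons above easier to scan (actuals and Refinitiv estimates taken from this article; the function name is illustrative):

```python
def surprise(actual: float, estimate: float) -> float:
    """Fraction by which a reported figure beat (positive) or missed (negative) the estimate."""
    return actual / estimate - 1

# (actual, Refinitiv estimate); $B except EPS in $
results = {
    "revenue": (41.7, 41.03),
    "adjusted EPS": (1.95, 1.78),
    "intelligent cloud": (15.1, 14.92),
    "productivity": (13.6, 13.49),
    "personal computing": (13.0, 12.57),
}
for name, (actual, estimate) in results.items():
    print(f"{name}: {surprise(actual, estimate):+.1%}")
```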
"
|
15,733 | 2,021 |
"AI-powered construction project platform OpenSpace nabs $55M | VentureBeat"
|
"https://venturebeat.com/2021/04/28/ai-powered-construction-project-platform-openspace-nabs-55m"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages AI-powered construction project platform OpenSpace nabs $55M Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
OpenSpace , a platform that helps construction companies track building projects through AI-powered analytics and 360-degree photo documentation, has raised $55 million in a series C round of funding led by Alkeon Capital Management.
The raise comes amid a cross-industry digital transformation boom, spurred in large part by the pandemic. Construction has often lagged behind other sectors in terms of efficiency , but tech such as robotics, artificial intelligence (AI), and remote collaboration tools has helped get the $11 trillion industry back on track.
Visibility
Founded out of San Francisco in 2017, OpenSpace leans on AI to create 360-degree photos of construction sites that are captured by builders or site managers who traverse an area with cameras strapped to their hats. All the imagery is sent to the cloud, where computer vision and machine intelligence tools arrange, stitch, and map the captured visuals to the associated project plans. It’s all about documenting activities on each site so stakeholders can check on progress remotely or resolve discrepancies by reviewing a visual history of the project.
Above: OpenSpace: site comparisons OpenSpace also offers AI-powered analytics, including a “progress tracking” feature that uses computer vision to analyze site images and automatically figure out how much of the scheduled work has been completed. Elsewhere, “object search” enables site managers to select an object from a scene and find similar objects elsewhere on the site.
Above: OpenSpace: object search Other notable players in the space include SiteAware and Buildots, both of which have raised sizable VC investments over the past nine months, highlighting the growing demand for digital technologies in the construction space.
OpenSpace had previously raised around $34 million, including a $15.9 million series B round last July.
With its latest cash injection, the company is ready to capitalize on its rapid growth over the past year, when it claims revenue tripled and its customer count rose by 150%. With another $55 million in the bank, OpenSpace said it will double down on its suite of analytics products and expand into areas such as safety management and quality control.
"
|
15,734 | 2,021 |
"Amazon posts record profits as AWS hits $54B annual run rate | VentureBeat"
|
"https://venturebeat.com/2021/04/30/amazon-posts-record-profits-as-aws-hits-54b-annual-run-rate"
|
"Amazon posts record profits as AWS hits $54B annual run rate
( Reuters ) — Amazon.com, one of the biggest winners of the pandemic, posted record profits on Thursday and signaled that consumers would keep spending in a growing U.S. economy and converts to online shopping are not likely to leave.
Since the start of the coronavirus outbreak, shoppers have relied increasingly on Amazon for delivery of home staples, and the company sees this trend continuing post-pandemic, particularly for groceries.
While brick-and-mortar stores closed, Amazon has now posted four consecutive record quarterly profits, attracted more than 200 million Prime loyalty subscribers, and recruited over 500,000 employees to keep up with surging demand.
Amazon said it expects operating income for the current quarter to be between $4.5 billion and $8 billion, which includes about $1.5 billion in costs related to COVID-19.
Shares rose 4% in after-hours trade.
Throughout the pandemic, the world’s largest online retailer has been at the center of workplace tumult, with a failed attempt by organized labor to unionize an Amazon warehouse in Alabama and litigation in New York over whether it put profit ahead of employee safety.
Amazon’s business has largely been unfazed by the developments. Michael Pachter, an analyst at Wedbush Securities, said a jump in Prime subscriptions, consumers’ embrace of grocery delivery amid COVID-19 and an improving economy worked to Amazon’s advantage.
“Habit. Good quality grocery. Stimulus checks,” Pachter said. “They’re going to thrive.” Slower sales growth in the current period relative to the last quarter reflected a tougher comparison to last year, when lockdowns were in full swing, Pachter said.
CEO Jeff Bezos touted the results of the company’s cloud computing unit Amazon Web Services (AWS) in a press release, saying, “In just 15 years, AWS has become a $54 billion annual sales run rate business competing against the world’s largest technology companies, and its growth is accelerating.” The plaudits were a nod to Andy Jassy, AWS’s long-time cloud chief who will succeed Bezos as Amazon’s CEO this summer. Amazon announced a deal for Dish Network to build its 5G network on AWS last week, and the division increased revenue 32% to $13.5 billion, ahead of analysts’ average estimate of $13.2 billion, according to IBES data from Refinitiv.
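Bezos’ “$54 billion annual sales run rate” is simply the latest quarterly revenue annualized; the arithmetic checks out against the figures reported above (a back-of-the-envelope sketch, not Amazon’s own accounting):

```python
# Annualize AWS's quarterly revenue to get the run rate Bezos cites.
quarterly_revenue_billions = 13.5  # Q1 2021 AWS revenue, per the article
run_rate = quarterly_revenue_billions * 4
print(f"AWS annual run rate: ${run_rate:.0f}B")  # → $54B

# The reported 32% year-over-year growth implies last year's quarter:
prior_year_quarter = quarterly_revenue_billions / 1.32
print(f"Implied Q1 2020 AWS revenue: ${prior_year_quarter:.1f}B")
```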
Brian Olsavsky, Amazon’s chief financial officer, said businesses increasingly wanted to outsource their technology infrastructure to AWS.
“We expect this trend to continue as we move into the post-pandemic recovery,” he said.
Adding to Amazon’s second-quarter revenue will be Prime Day, the company’s annual marketing blitz. Amazon disclosed the event will take place in June rather than July, as is more typical, to reach customers before they head on vacation.
Grocery sales anchored by Amazon’s subsidiary Whole Foods Market remain a bright spot, too. Olsavsky called grocery “a great revelation during the post-pandemic period.” The company’s first-quarter profit more than tripled to $8.1 billion from a year ago, on sales of $108.5 billion, ahead of analysts’ estimates.
Ad sales growth
Amazon saw its stock price nearly double in the first part of 2020 as it benefited from the pandemic. This year, however, it has underperformed the S&P 500 market index. Its shares were up about 8.5% year to date versus the index’s 13% gain.
Spending on COVID-19 and logistics has chipped away at Amazon’s bottom line. The company has poured money into buying cargo planes and securing new warehouses, aiming to place items closer to customers to speed up delivery. It said Wednesday it planned to hike pay for over half a million employees, costing more than $1 billion — and it is still hiring for tens of thousands more positions.
Olsavsky said Amazon was still working to restore one-day package delivery rates to pre-pandemic levels.
He told reporters the company intends to increase spending on video content this year as well. Consumers have been watching content for more hours on Amazon, Olsavsky said.
While far behind ad sales leaders Facebook Inc and Alphabet Inc’s Google, Amazon is growing its ad business because brands’ placements often result directly in sales, reaching customers who are on Amazon with an intention to shop.
Jesse Cohen, senior analyst at Investing.com, said, “Outside of its core retail and cloud units, advertising revenue is increasingly becoming another substantial growth driver for Amazon.” Amazon said ad and other sales rose 77% to $6.9 billion, ahead of analysts’ estimate of $6.2 billion.
"
|
15,735 | 2,020 |
"Amazon launches its first African AWS datacenters as demand for cloud services surges | VentureBeat"
|
"https://venturebeat.com/2020/04/22/amazon-launches-its-first-african-aws-datacenters-as-demand-for-cloud-services-surges"
|
"Amazon launches its first African AWS datacenters as demand for cloud services surges
Amazon has officially opened its first datacenters in Africa, with its Amazon Web Services (AWS) cloud division launching three new availability zones in Cape Town , South Africa.
While Amazon first announced this new cloud region back in 2018, the timing of the launch is notable.
Demand for cloud services has surged due to the COVID-19 crisis, with businesses embracing bandwidth-intensive remote working options like video-conferencing and individuals on lockdown consuming more internet-based services.
AWS represents over 10% of Amazon’s revenue, having drawn in nearly $10 billion during the last quarter — roughly 4 times more than Google’s Cloud division, though that includes income from G Suite. Microsoft doesn’t break out its Azure earnings specifically, but the company was the first of the major public cloud providers to open datacenters in Africa, launching two regions in Johannesburg and Cape Town last March , while Google has yet to confirm plans for the region.
Having local infrastructure in Africa could alleviate a little stress from AWS datacenters elsewhere in the world, though the main benefit for Amazon is that it’s now in a better position to appeal to African companies. Putting datacenters closer to customers improves data-transfer speeds and reduces latency.
Today’s news comes after Alibaba revealed plans to invest $28 billion in cloud infrastructure over the next three years, following a surge in usage of its various services during the COVID-19 outbreak. Indeed, many users complained that the company’s workplace chat app, DingTalk, was regularly lagging due to the sudden increase in remote workers.
Amazon now offers cloud infrastructure in 73 zones in 23 regions , and with today’s news the only continent without an AWS region is Antarctica.
"
|
15,736 | 2,021 |
"AI Weekly: What Andy Jassy's ascension to CEO means for Amazon's AI initiatives | VentureBeat"
|
"https://venturebeat.com/2021/02/05/ai-weekly-what-andy-jassys-ascension-to-ceo-means-for-amazons-ai-initiatives"
|
"AI Weekly: What Andy Jassy’s ascension to CEO means for Amazon’s AI initiatives
AWS CEO Andy Jassy speaks at the company's re:Invent customer conference in Las Vegas on November 29, 2017.
This week, Jeff Bezos announced that he’ll step down as CEO at Amazon and transition to an executive chair role during the third quarter of this year. Amazon Web Services (AWS) CEO Andy Jassy will take his place, heading up a company currently valued at around $1.6 trillion.
Jassy, who joined Amazon in 1997 and has led AWS since its inception in 2003, believes Amazon’s decision to double down on AI early differentiated it from the competition. In May 2020, Gartner ranked AWS the industry leader in terms of vision and ability to execute on AI developer services. Beyond AWS, product recommendations on Amazon are powered by AI, as well as Alexa, Prime Air, Amazon Go, and the pick paths used in distribution centers to find products and fulfill orders.
So how might Jassy’s elevation to CEO impact Amazon’s AI initiatives? Interviews in recent years suggest Jassy is enthusiastic about cloud services tailored to the needs of machine learning practitioners, particularly for large enterprise applications. Controversially, Jassy has also said customers, not Amazon itself, are responsible for curbing their usage of potentially problematic AI technologies like facial recognition.
In a conversation with Silicon Angle in December, Jassy said he expects the majority of applications to be infused with AI in the next five to 10 years. While he endorses the idea of catering to expert machine learning practitioners who know how to train, tune, and deploy AI models, he asserts that AWS, more than rivals like Google Cloud Platform and Microsoft Azure, has aimed to “democratize” data science by lowering the barriers to entry.
“There just aren’t that many expert machine learning practitioners. And so it never gets extensive in most enterprises if you don’t make it easier for everyday developers and data scientists to use machine learning,” Jassy told Silicon Angle. He stressed the importance of “top-layer” AI services that transcribe audio, translate text, and more via APIs, without requiring customers to develop custom models. But he said the most important thing Amazon has done to make AI more accessible is building fully managed services.
“Enterprises have so much data that they want to use predictive algorithms to get value added,” Jassy said during a keynote at the Goldman Sachs Technology and Internet Conference in San Francisco last February.
SageMaker is one example of these fully managed services. Launched in 2016, it’s designed to let developers build, test, and maintain AI models from a single dashboard. Amazon says SageMaker, which gained nine new capabilities in December — following the launch of SageMaker Studio , an integrated development environment for machine learning — now has tens of thousands of customers.
It’s a safe bet that investments in services akin and complementary to SageMaker will accelerate with Jassy at the helm. So too, most likely, will the buildout of backend tools Amazon uses to solve challenges like call analytics.
“As Clay Christensen, author of The Innovator’s Dilemma , said … people hire products and services to do a job. They don’t really care what you do under the covers, but they’re hiring a product to do a job,” Jassy told Silicon Angle. “[Some people] don’t actually want to hire machine learning [experts]. They want to have an easier way to get automatic call analytics on all of their calls … And what we’re finding is that increasingly we’re using machine learning as the source to get those jobs done, but without people necessarily knowing that that’s what we’re doing behind the scenes.” This work might be of a controversial nature. In an interview at Recode’s 2019 Code Conference , Jassy defended the company’s facial recognition service, Rekognition, while calling for the federal government to introduce national guidelines. (In September 2019, Recode reported that Amazon was writing its own facial recognition laws to pitch to lawmakers.) “Just because tech could be misused doesn’t mean we should ban it and condemn it,” he said, adding that Amazon would provide its facial recognition tech to governments, excepting those that violate the law or infringe on civil liberties.
Last year, Amazon declared a halt on the sale of facial recognition to police departments for 12 months but did not necessarily extend that restriction to federal law enforcement agencies. Prior to the moratorium, the company reportedly attempted to sell its facial recognition tech to U.S. Immigration and Customs Enforcement (ICE), and police in Orlando, Florida and other cities have trialed it.
A number of academics have called Jassy’s stance on facial recognition technology, which runs counter to that of many Amazon shareholders , problematic at best. Anima Anandkumar, the principal scientist for artificial intelligence at Amazon, told PBS Frontline that facial recognition isn’t “ battle-tested ” to work in the types of challenging conditions where law enforcement might use it (e.g., with low-light, grainy, or low-quality images). And dating back to 2018, AI researchers Joy Buolamwini, Timnit Gebru, and Deborah Raji have found that facial recognition software from companies like Amazon work best for white men and worst for women with dark skin. Amazon has publicly dismissed their coauthored work, the Gender Shades project.
Given this history, it seems unlikely that Jassy will extend the moratorium on facial recognition sales when it expires in July. He’s also unlikely to curtail the law enforcement relationships that Ring, Amazon’s smart home division, has fostered since its acquisition by Amazon in 2018. Ring has reportedly partnered with over 2,000 police and fire departments across the U.S. dating back to 2015, when Ring let the Los Angeles Police Department test how front-door footage might reduce property crimes.
Advocacy groups like Fight for the Future and the Electronic Frontier Foundation have accused Ring of using its cameras and Neighbors app (which delivers safety alerts) to build a private surveillance network via these partnerships. The Electronic Frontier Foundation in particular has singled Ring out for marketing strategies that foster fear and promote a sale-spurring “vicious cycle,” and for “[facilitating] reporting of so-called ‘suspicious’ behavior that really amounts to racial profiling.” “We don’t have a large number of police departments that are using our facial recognition technology, and as I said, we’ve never received any complaints of misuse. Let’s see if somehow they abuse the technology — they haven’t done that,” Jassy told PBS Frontline in a 2020 interview. “And to assume they’re going to do that and therefore you shouldn’t allow them to have access to the most sophisticated technology out there doesn’t feel like the right balance to me.” For AI coverage, send news tips to Khari Johnson and Kyle Wiggers — and be sure to subscribe to the AI Weekly newsletter and bookmark The Machine.
Thanks for reading,
Kyle Wiggers
AI Staff Writer
"
|
15,737 | 2,019 |
"Alphabet X lab spinoff Dandelion raises $16 million for home geothermal systems | VentureBeat"
|
"https://venturebeat.com/2019/02/12/alphabet-x-lab-spinoff-dandelion-raises-16-million-for-home-geothermal-systems"
|
"Alphabet X lab spinoff Dandelion raises $16 million for home geothermal systems
Google parent company Alphabet’s secretive X lab has birthed many a spinoff, the latest of which is energy company Malta.
X lab is also responsible for Project Wing , the drone delivery service designed to improve access to goods, and Loon, a balloon network that beams internet service to remote areas — not to mention driverless car company Waymo , life sciences company Verily , and cybersecurity intelligence offshoot Chronicle.
Dandelion is yet another graduate of X, where it was incubated for two years, and the New York-based startup is no less ambitious than its cohorts. It aims to cut customers’ electricity bills in half with a geothermal, environmentally friendly heating and cooling system — the aptly named Dandelion Home Geothermal System — that operates beneath buildings. Now, after installing systems in Upstate New York during a six-month pilot and selling 70 systems in existing homes in 2017, it’s scaling operations with a new round of funding.
Dandelion today announced that it has secured $16 million in financing from GV (formerly Google Ventures) and Comcast Ventures, with participation from Lennar Corporation and previous investors NEA, Collaborative Fund, Ground Up, and ZhenFund, among others. Former Googler and Dandelion CEO Kathy Hannun says the influx of cash — which brings the company’s total capital raised to $23 million — will fuel growth and R&D, along with allowing it to open new warehouses across New York State.
As part of the round, GV partner Shaun Maguire and Comcast Ventures managing director Sam Landman will join Dandelion’s board.
“In partnership with world-class investors, including GV, Comcast Ventures, and Lennar Corporation, the nation’s leading home builder, as well as with the continued support of our existing investors, we’re eager to continue to advance our mission of enabling the widespread adoption of geothermal,” Hannun said.
The series A comes after Dandelion’s March 2018 acquisition of Geo-Connections, a geothermal software-as-a-service company that develops the underlying tech used in “tens of thousands” of geothermal systems across the U.S. And it precedes the broad launch of Dandelion’s second product — Dandelion Radiate — which works with traditional radiators. Dandelion says “thousands” of people are on its waitlist and that it is New York’s largest residential geothermal installer by volume.
Like most geothermal solutions, the Home Geothermal System taps the ground’s energy with plastic pipes and pumps, the latter of which are installed in-home. The systems move heat from the ground into the house when it’s cold out, and vice versa during hot spells.
But unlike standard systems, which are typically custom-built, Dandelion Air is designed to work in abodes of all sizes and shapes. It’s also packed with a self-diagnostic processing and sensor package that autonomously checks to ensure units are correctly installed at runtime and monitors various performance metrics to identify issues that might arise.
Dandelion claims its system is 4 times more efficient than furnaces and almost twice as efficient as traditional air conditioning systems.
That’s not the only thing that sets it apart. Installation is typically the trickiest part of geothermal deployment — most companies use wide drills designed to dig water wells at depths of over 1,000 feet. Dandelion, by contrast, uses a custom rig that bores smaller holes a few inches in diameter at shallower depths, enabling installers to complete jobs in less than a day.
Still, the upfront costs are substantial. Excepting monthly financing, these costs run between $20,000 and $25,000 — and that’s not including the cost of installing vents in homes that lack them. (An included Nest Learning Thermostat softens the blow, but not by much.) For those willing and able to fork over the cash, Dandelion says the savings (with tax credits) can be between 20 percent and 70 percent a year, which works out to roughly $135 a month over 20 years ($32,400).
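The savings figure quoted above is easy to verify, and the same numbers give a rough payback horizon on the upfront cost (a quick sketch using only the article’s figures):

```python
# Check Dandelion's quoted savings: ~$135/month over 20 years.
monthly_savings = 135
years = 20
total = monthly_savings * 12 * years
print(f"Total savings over {years} years: ${total:,}")  # → $32,400

# Rough payback time against the quoted upfront cost range
for upfront in (20_000, 25_000):
    print(f"Upfront ${upfront:,} recouped in ~{upfront / monthly_savings / 12:.1f} years")
```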
There’s plenty of additional incentive. Greenhouse gas emissions from electricity consumption — to which air conditioners contribute substantially — are forecast to reach 2.28 billion tons in 2050, while heating fuels currently generate 11 percent of all emissions related to the heating of homes and businesses in the U.S., according to the Environmental Protection Agency.
“The home heating and cooling industry has been constrained by lack of innovation and high costs,” said Landman. “The team at Dandelion and their modern approach to implementing geothermal technology is transforming the industry and giving consumers a convenient, safe, and cost-effective way to heat and cool their homes while reducing carbon emissions.”
"
|
15,738 | 2,020 |
"How utilities are using AI to adapt to electricity demands | VentureBeat"
|
"https://venturebeat.com/2020/04/20/utilities-energy-usage-covid-19-ai-machine-learning"
|
"How utilities are using AI to adapt to electricity demands
The spread of the novel coronavirus that causes COVID-19 has prompted state and local governments around the U.S. to institute shelter-in-place orders and business closures. As millions suddenly find themselves confined to their homes, the shift has strained not only internet service providers , streaming platforms , and online retailers , but the utilities supplying power to the nation’s electrical grid, as well.
U.S. electricity use on March 27, 2020 was 3% lower than it was on March 27, 2019, a loss of about three years of sales growth. Peter Fox-Penner, director of the Boston University Institute for Sustainable Energy, asserted in a recent op-ed that utility revenues will suffer because providers are halting shutoffs and deferring rate increases. Moreover, according to research firm Wood Mackenzie, the rise in household electricity demand won’t offset reduced business electricity demand, mainly because residential demand makes up just 40% of the total demand across North America.
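Wood Mackenzie’s point can be put in numbers: with residential demand at roughly 40% of the North American total, even a sizable household increase cannot offset a large commercial decline. The percentage changes below are hypothetical, chosen only to illustrate the mechanics:

```python
# Residential demand is ~40% of the North American total (per the article),
# so the net change is a share-weighted sum of the two segments' changes.
residential_share, commercial_share = 0.40, 0.60
res_change, comm_change = +0.05, -0.10  # hypothetical: homes +5%, businesses -10%
net = residential_share * res_change + commercial_share * comm_change
print(f"Net demand change: {net:+.1%}")  # → -4.0%
```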
Some utilities are employing AI and machine learning to address the windfalls and fluctuations in energy usage resulting from COVID-19. Precise load forecasting could ensure that operations aren’t interrupted in the coming months, thereby preventing blackouts and brownouts. And they might also bolster the efficiency of utilities’ internal processes, leading to reduced prices and improved service long after the pandemic ends.
Innowatts
Innowatts, a startup developing an automated toolkit for energy monitoring and management, counts several major U.S. utility companies among its customers, including Portland General Electric, Gexa Energy, Avangrid, Arizona Public Service Electric, WGL, and Mega Energy. Its eUtility platform ingests data from over 34 million smart energy meters across 21 million customers in more than 13 regional energy markets, while its machine learning algorithms analyze the data to forecast short- and long-term loads, variances, weather sensitivity, and more.
Beyond these table-stakes predictions, Innowatts helps evaluate the effects of different rate configurations by mapping utilities’ rate structures against disaggregated cost models. It also produces cost curves for each customer that reveal the margin impacts on the wider business, and it validates the yield of products and cost of customer acquisition with models that learn the relationships between marketing efforts and customer behaviors (like real-time load).
Innowatts told VentureBeat that it observed “dramatic” shifts in energy usage between the first and fourth weeks of March. In the Northeast, “non-essential” retailers like salons, clothing shops, and dry cleaners were using only 35% as much energy toward the end of the month (after shelter-in-place orders were enacted) versus the beginning of the month, while restaurants (excepting pizza chains) were using only 28%. In Texas, conversely, storage facilities were using 142% as much energy in the fourth week compared with the first.
Innowatts says that throughout these usage surges and declines, its clients took advantage of AI-based load forecasting to learn from short-term shocks and make timely adjustments. Within three days of shelter-in-place orders, the company said, its forecasting models were able to learn new consumption patterns and produce accurate forecasts, accounting for real-time changes.
Innowatts CEO Sid Sachdeva believes that if utility companies had not leveraged machine learning models, demand forecasts in mid-March would have seen variances of 10-20%, significantly impacting operations.
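Innowatts’ production models are proprietary, but the behavior described above (a short-horizon forecaster that relearns demand within a few days of a shock) can be illustrated with a toy sliding-window forecaster. The load figures, window size, and function name below are invented for illustration, not details from Innowatts:

```python
# Toy sliding-window load forecaster: predicts the next day's load as
# the mean of the last `window` days, so a sustained drop in demand is
# fully reflected in forecasts within `window` days of the shift.
def forecast_next(loads, window=3):
    """Forecast the next value from the trailing window of observations."""
    recent = loads[-window:]
    return sum(recent) / len(recent)

# Hypothetical daily commercial load (MWh): stable, then a sharp
# shelter-in-place drop to roughly a third of baseline.
loads = [100.0, 101.0, 99.0, 100.0, 36.0, 35.0, 34.0]

# Right after the shock, the forecast still blends old and new behavior...
print(round(forecast_next(loads[:5]), 1))  # mean of 99, 100, 36
# ...but three days into the new regime it has fully adapted.
print(round(forecast_next(loads), 1))      # mean of 36, 35, 34
```

Real systems would add weather, calendar, and per-customer features, but the core adaptation mechanism (refitting on recent observations) is the same idea.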
“During these turbulent times, AI-based load forecasting gives energy providers the ability to … develop informed, data-driven strategies for future success,” Sachdeva told VentureBeat. “With utilities and energy retailers seeing a once-in-a-lifetime 30%-plus drop in commercial energy consumption, accurate forecasting has never been more important. Without AI tools, utilities would see their forecasts swing wildly, leading to inaccuracies of 20% or more, placing an enormous strain on their operations and ultimately driving up costs for businesses and consumers.”

Autogrid

Autogrid works with over 50 customers in 10 countries — including Energy Australia, Florida Power & Light, and Southern California Edison — to deliver AI-informed power usage insights. Its platform makes 10 million predictions every 10 minutes and optimizes over 50 megawatts of power, which is enough to supply the average suburb.
Flex, the company’s flagship product, predicts and controls tens of thousands of energy resources from millions of customers by ingesting, storing, and managing petabytes of data from trillions of endpoints. Using a combination of data science, machine learning, and network optimization algorithms, Flex models both physics and customer behavior, automatically anticipating and adjusting for supply and demand patterns.
Autogrid also offers a fully managed solution for integrating and utilizing end-customer installations of batteries and microgrids. Like Flex, it automatically aggregates, forecasts, and optimizes capacity from assets at sub-stations and transformers, reacting to distribution management needs while providing capacity to avoid capital investments in system upgrades.
Autogrid CEO Dr. Amit Narayan told VentureBeat that the COVID-19 crisis has heavily shifted daily power distribution in California, where it’s having a “significant” downward impact on hourly prices in the energy market. He says that Autogrid has also heard from customers about transformer failures in some regions due to overloaded circuits, which he expects will become a problem in heavily residential and saturated load areas during the summer months (when air conditioning usage goes up).
“In California, [as you’ll recall], more than a million residents faced wildfire prevention-related outages in PG&E territory in 2019,” Narayan said, referring to the controversial planned outages orchestrated by Pacific Gas & Electric last summer. “The demand continues to be high in 2020 in spite of the COVID-19 crisis, as residents prepare to brace for a similar situation this summer. If a 2019 repeat happens again, it will be even more devastating, given the health crisis and difficulty in buying groceries.”

AI making a difference

AI and machine learning aren’t a silver bullet for the power grid — even with predictive tools at their disposal, utilities are beholden to a tumultuous demand curve. But providers say they see evidence the tools are already helping to prevent the worst of the pandemic’s effects — chiefly by enabling them to better adjust to shifted daily and weekly power load profiles.
“The societal impact [of the pandemic] will continue to be felt — people may continue working remotely instead of going into the office, they may alter their commute times to avoid rush hour crowds, or may look to alternative modes of transportation,” Schneider Electric chief innovation officer Emmanuel Lagarrigue told VentureBeat. “All of this will impact the daily load curve, and that is where AI and automation can help us with maintenance, performance, and diagnostics within our homes, buildings, and in the grid.” VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact.
Discover our Briefings.
The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat.
All rights reserved.
"
|
15,739 | 2,019 |
"Google Cloud’s Contact Center AI hits general availability | VentureBeat"
|
"https://venturebeat.com/2019/11/14/google-clouds-contact-center-ai-hits-general-availability"
|
"Google Cloud’s Contact Center AI hits general availability
Google Cloud is making Contact Center AI generally available for use today. The cloud service is built with conversational AI engine Dialogflow to automate interactions with customers in call centers.
Contact Center AI includes Virtual Agent, which automatically responds to customer queries by voice or text and hands the conversation off to a person when it is unable to help the customer. Agent Assist uses natural language processing to support human customer service agents during those escalated interactions.
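The routing logic behind this kind of virtual agent (answer automatically when intent detection is confident, hand off to a human otherwise) can be sketched in plain Python. The intents, confidence scores, and threshold below are invented for illustration and are not Dialogflow’s actual API:

```python
# Minimal bot-to-human handoff: route on intent-match confidence.
ANSWERS = {
    "store_hours": "We're open 9am-6pm, Monday through Saturday.",
    "order_status": "Your order is on its way.",
}

def classify(utterance):
    """Stand-in for an NLU intent classifier: returns (intent, confidence)."""
    if "hours" in utterance.lower():
        return "store_hours", 0.92
    if "order" in utterance.lower():
        return "order_status", 0.88
    return None, 0.10  # nothing matched well

def respond(utterance, threshold=0.7):
    """Answer via the bot when confident, otherwise escalate to a human."""
    intent, confidence = classify(utterance)
    if intent is not None and confidence >= threshold:
        return ("bot", ANSWERS[intent])
    return ("human", "Transferring you to an agent...")

print(respond("What are your hours?"))                  # handled by the bot
print(respond("My package arrived damaged, help me"))   # escalated to a human
```

A production system replaces the keyword matcher with a trained intent model, but the confidence-threshold handoff pattern is the same.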
The news comes today as Google pushed its rich communication services ( RCS ) to Android Messages users in the United States , and days after Google’s experimental unit Area 120 CallJoy service for answering phone calls and customer questions for small businesses got an upgrade.
Each of these services harnesses Google’s strength in conversational AI, and together they form a fairly comprehensive voice strategy.
Google today announced that Cloud Run , a managed serverless platform built on Knative, has hit general availability. On Wednesday, Google launched the Network Intelligence Center , an AI-driven network management service.
The series of announcements precedes the start of the Google Cloud Next conference in London next week.
Google launched Contact Center AI at a Cloud Next conference in summer 2018.
In other Google conversational AI news, Google Search got the ability to help people learn how to pronounce words.
"
|
15,740 | 2,020 |
"Microsoft launches Cloud for Healthcare in general availability | VentureBeat"
|
"https://venturebeat.com/2020/09/22/microsoft-launches-cloud-for-healthcare-in-general-availability"
|
"Microsoft launches Cloud for Healthcare in general availability
At Ignite 2020, Microsoft today announced that Microsoft Cloud for Healthcare will soon be generally available. The managed service offering spans Microsoft Dynamics 365, Microsoft Azure, and Microsoft 365 and is designed to help health organizations manage operations. Eligible customers and partners will be able to sign up for service at the end of October.
Cloud for Healthcare is in many ways Microsoft’s answer to Google Cloud Healthcare API , albeit more holistic. But it might also be perceived as a response to the increasing demand for triaging technologies in light of the pandemic. Millions of patients wait at least two hours to see a health care provider, according to a study published by the U.S. Centers for Disease Control and Prevention (CDC). In response, tech giants like IBM and Facebook have partnered with governments and private industry to roll out chatbot-based solutions, as have a number of startups.
Leveraging Cloud for Healthcare, Microsoft says health systems can deploy virtual visits and remote health monitoring as a part of connected experiences. The platform facilitates data collection from medical devices (through Azure IoT) to allow care teams to monitor patients inside and outside clinical facilities, and it provides insights to help teams escalate care as needed and reduce readmissions.
Cloud for Healthcare also handles physician and referral management to expedite the creation of referrals and searches for providers, breaking out metrics like physician spend. On the patient side, Cloud for Healthcare hosts engagement portals where patients can perform tasks like online booking, reminders, and bill pay, while providers nurture leads, publicize events, and encourage preventative and care management programs via interactive journeys.
Microsoft Teams is available to customers of Cloud for Healthcare, which Microsoft notes supports HIPAA and GDPR compliance and is HITRUST-certified.
The Bookings app in Teams enables care providers to schedule, manage, and conduct provider-to-patient virtual visits within Teams, and the new Teams EHR connector — now in private preview — will allow clinicians to launch a visit or consult with another provider from their electronic health record (EHR) system. Microsoft says Epic will be the first EHR system to integrate with Teams in this way.
Cloud for Healthcare affords access to the Microsoft Healthcare Bot Service for creating self-assessment tools, which could reduce strain on hotlines. Microsoft says more than 1,600 instances of COVID-19 bots created with the Healthcare Bot Service went live in 23 countries between March and May. These include bots from the U.S. Centers for Disease Control and Prevention.
For analytics, Cloud for Healthcare supports Fast Healthcare Interoperability Resources (FHIR), the standard describing data formats, elements, and APIs for exchanging EHRs. An Azure FHIR service allows health organizations to ingest and persist data in FHIR, while converters for Clinical Document Architecture (CDA) and other formats enable the reconciliation of records from different systems. (CDA is a markup standard that defines the structure of certain medical records, such as discharge summaries and progress notes.) Meanwhile, Dynamics 365 and Power Platform apps help read data in different formats and add visualizations and analytics.
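FHIR resources are JSON (or XML) documents that declare a resourceType and follow field structures defined by the standard. A minimal, hand-written Patient resource of the kind such a service ingests might look like this (the name and identifier are made up):

```python
import json

# A minimal FHIR R4 Patient resource: every resource declares its
# resourceType, and fields like `name` follow the structures the
# standard defines (family/given split, lists for repeatable fields).
patient = {
    "resourceType": "Patient",
    "id": "example-001",  # illustrative identifier, not a real record
    "name": [{"family": "Rivera", "given": ["Ana"]}],
    "gender": "female",
    "birthDate": "1987-04-12",
}

print(json.dumps(patient, indent=2))
```

Because every system emits the same structures, records from different EHRs can be reconciled field by field, which is exactly what the CDA-to-FHIR converters described above enable.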
Cloud for Healthcare also integrates with existing EHR and platform integrations, implementation services, and health care software-as-a-service offerings. Microsoft says it’s working closely with providers — including Accenture, Adaptive Biotechnologies, Allscripts, DXC Technology, Innovaccer, KPMG, and Nuance — to co-develop custom solutions. For example, Nuance and Microsoft partnered to integrate Nuance’s Dragon Ambient Experience with Teams. The integration, which is in private preview, captures and contextualizes physician-patient conversations, automatically documenting the encounters from within Teams.
Among the early adopters of Cloud for Healthcare are Providence St. Joseph, Helsinki University and Uusimaa Hospital District, St. Luke’s University Health Network, and Northwell Health.
"
|
15,741 | 2,020 |
"Google launches Document AI suite of parsing and processing tools in preview | VentureBeat"
|
"https://venturebeat.com/2020/11/04/google-launches-document-ai-suite-of-parsing-and-processing-tools-in-preview"
|
"Google launches Document AI suite of parsing and processing tools in preview
Google this morning launched the Document AI (DocAI) platform, a console for document processing hosted in Google Cloud, in preview. The company says it’s aimed at automating and validating documents by extracting data from documents and making them available to business apps and users.
Companies spend an average of $20 to file and store a single document, by some estimates , and only 18% of companies consider themselves paperless.
An IDC report revealed that document-related challenges account for a 21.3% productivity loss, and U.S. companies waste a collective $8 billion annually managing paperwork.
Google’s DocAI platform ostensibly solves this by providing access to document parsers, tools, and solutions via an API. It supports the creation and customization of document processing workflows built with Google Cloud’s predefined taxonomy without the need to perform additional data mapping or training. DocAI offers general processors including a form parser, W9 parser, optical character recognition, document splitter, and custom workflows for domain-specific documents. These reside in a unified dashboard from where they can be tested by uploading a document directly in the console.
The parsers can classify information in documents like addresses, account numbers, and signatures, as well as extract data like supplier names, invoice dates, and payment terms. Google says it’s working to grow the DocAI platform’s core capabilities and support additional toolsets.
General parsers such as optical character recognition, the form parser, and the document splitter are available from the DocAI platform console. Access to specialized parsers like W9, 1040, W2, 1099-MISC, 1003, invoice, and receipts must be requested on a per-customer basis.
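At its core, a parser of this kind turns raw document text into structured key-value pairs. The toy stand-in below uses plain regexes rather than Google’s trained models or actual API, purely to illustrate that raw-text-to-structured-data step; the field names and sample invoice are invented:

```python
import re

# Toy invoice "parser": pulls a few labeled fields out of raw OCR text.
# Real services like DocAI use trained models; regexes stand in here
# only to show the shape of the output a parser produces.
FIELDS = {
    "supplier": re.compile(r"Supplier:\s*(.+)"),
    "invoice_date": re.compile(r"Date:\s*([\d-]+)"),
    "total": re.compile(r"Total:\s*\$([\d.]+)"),
}

def parse_invoice(text):
    """Return a dict of whichever known fields are found in the text."""
    out = {}
    for name, pattern in FIELDS.items():
        match = pattern.search(text)
        if match:
            out[name] = match.group(1).strip()
    return out

sample = """Supplier: Acme Paper Co.
Date: 2020-10-30
Total: $1249.50"""

print(parse_invoice(sample))
```

The structured dict is what flows into downstream business apps, which is the "available to business apps and users" step the platform is built around.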
The launch of Google’s DocAI platform comes after the release of Lending DocAI, a Google Cloud product for the mortgage industry that ostensibly provides “industry-leading” accuracy for documents relevant to lending and processing. Google also recently unveiled PPP Lending AI , an effort to help lenders expedite the processing of applications for the since-exhausted U.S. Small Business Administration’s (SBA) Paycheck Protection Program, and Procurement DocAI, which automates procurement data capture by turning docs like invoices and receipts into structured data.
Lending DocAI and Procurement DocAI are now a part of the DocAI platform.
“We believe that any company that has to manually extract data from complex documents at scale can greatly benefit from Google Cloud AI,” Google product manager Lewis Liu and product marketing manager Yang Liang wrote in a blog post. “Transforming documents into structured data increases the speed of decision making for companies, unlocking measurable business value and helping develop better experiences for customers.”
"
|
15,742 | 2,020 |
"Amazon launches HealthLake, a platform for storing and analyzing health care data | VentureBeat"
|
"https://venturebeat.com/2020/12/08/amazon-launches-healthlake-a-platform-for-storing-and-analyzing-health-care-data"
|
"Amazon launches HealthLake, a platform for storing and analyzing health care data
During its re:Invent 2020 virtual keynote today, Amazon launched Amazon HealthLake, a service that enables health care organizations to store, transform, and analyze up to petabytes of life science data in Amazon Web Services. Amazon says that the HIPAA-eligible HealthLake , which is available in preview starting today, can automatically understand and extract medical information including rules, procedures, and diagnoses in real time.
Health care data is often spread across various systems, such as electronic medical records and lab systems, and it’s challenging to organize because it’s often unstructured. Data in medical records like clinical notes, reports and forms like insurance claims, and image scans needs to be prepped and normalized before analyses can begin.
HealthLake aims to address this challenge by enabling customers to apply intelligence to hundreds of thousands of data points across different silos in dozens of formats. For example, HealthLake leverages natural language understanding and ontology mapping to identify whether a patient has been properly prescribed a drug, pulling information from blood glucose monitoring systems, physicians’ notes, insurance forms, lab reports, and more to inform its conclusions. Data can be loaded on an ongoing basis and queried and searched using standard methods, or imported into Amazon SageMaker to train models to forecast metrics such as the number of diabetes cases year over year.
During a livestreamed demonstration, AWS vice president for AI Matt Wood showed HealthLake being used to highlight a subset of patients with uncontrolled diabetes and adjust treatments to avoid severe complications. HealthLake queried data directly from a health care provider and generated a visualization in Amazon QuickSight that could be monitored in the context of other patients.
Amazon says that HealthLake supports interoperability standards including Fast Healthcare Interoperability Resources (FHIR), a standard format for exchanging health system data consistently. Support for additional standards will arrive further down the line.
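Since HealthLake speaks FHIR, a blood glucose reading of the kind mentioned above would be represented as an Observation resource. A minimal, hand-written example might look like the following (the patient reference and value are invented):

```python
# Minimal FHIR Observation for a blood glucose reading. 2339-0 is the
# LOINC code for glucose mass concentration in blood; the subject
# reference and the measured value here are illustrative.
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "2339-0",
            "display": "Glucose [Mass/volume] in Blood",
        }]
    },
    "subject": {"reference": "Patient/example-001"},
    "valueQuantity": {"value": 104, "unit": "mg/dL"},
}

print(observation["valueQuantity"]["value"], observation["valueQuantity"]["unit"])
```

Standardized codes (LOINC here) are what let a service aggregate readings from many providers into one queryable pool, e.g. for the diabetes-cohort analysis in Amazon's demo.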
As we wrote last year, Amazon views AI in health care as a frontier worth exploring — and perhaps its next major revenue driver. The AI in health care market is anticipated to reach $19.25 billion by 2026, driven in part by a demand for telemedical and remote monitoring services. The launch of HealthLake comes a year after Amazon debuted Transcribe Medical , a service that’s designed to transcribe medical speech for clinical staff in primary care settings. And in 2018, Amazon made three AWS offerings HIPAA eligible — Transcribe, Translate, and Comprehend — following on the heels of rival Google Cloud.
"
|
15,743 | 2,021 |
"Why Microsoft’s new AI acquisition is a big deal | VentureBeat"
|
"https://venturebeat.com/2021/04/17/why-microsofts-new-ai-acquisition-is-a-big-deal"
|
"Why Microsoft’s new AI acquisition is a big deal

View of a Microsoft logo on March 10, 2021, in New York.
Microsoft’s recent shopping spree reached a new climax this week with the announcement of its $19.7 billion acquisition of Nuance, a company that provides speech recognition and conversational AI services. Nuance is best known for its deep learning voice transcription service, which is very popular in the health care sector.
The two companies had already been working together closely before the acquisition.
Nuance had built several of its products on top of Microsoft’s Azure cloud. And Microsoft had been using Nuance’s Dragon service in its Cloud for Healthcare solution, which launched last year in the midst of the pandemic.
The acquisition is Microsoft’s biggest since the $26 billion purchase of LinkedIn. And it tells a lot about Microsoft’s AI strategy.
AI in health care Most of the focus in the announcement was on AI in health care , which makes sense because Nuance is a leading provider of AI services in the sector.
“Nuance provides the AI layer at the health care point of delivery and is a pioneer in the real-world application of enterprise AI,” Microsoft CEO Satya Nadella said. “AI is technology’s most important priority, and health care is its most urgent application.”

One thing I like about Nuance is its laser focus, which is in line with the current limits and capabilities of deep learning algorithms.
Deep learning might not be very good at general problem-solving or causal inference, but it can be extremely efficient at narrow tasks. Nuance has chosen one application (voice transcription) and has narrowed down its focus to one domain (clinical settings). This has enabled the company to train its machine learning models on tons of data in that specific field and make sure that its AI solutions have peak performance and reliability.
Nuance has a series of AI products tailored for clinical settings, including a virtual assistant for electronic health records, a multi-party conversation transcription service, and a deep learning language model that converts clinical conversations into structured notes for integration into health records.
Documentation is one of the main pain points for clinics and one of the lowest-hanging fruits for AI in health care.
Nuance’s AI technology is helping save time and improve the patient experience. According to the acquisition announcement, Nuance’s AI solutions are currently used by more than 55% of physicians and 75% of radiologists in the U.S., and in 77% of U.S. hospitals. The company has also seen 37% year-over-year growth in the revenue of its cloud service, though that is probably due to the shifts caused by the COVID-19 pandemic.
“The acquisition will double Microsoft’s total addressable market (TAM) in the health care provider space, bringing the company’s TAM in health care to nearly $500 billion,” according to Microsoft’s announcement.
Microsoft’s position in health care Nuance’s reach in the health care market suggests Microsoft will recoup its $19.7 billion investment in a relatively short term. But being able to address this market is not a simple feat.
Other big tech companies, such as Apple and Google, already have health care initiatives that are much older than Microsoft’s. But Microsoft is especially well-positioned to take advantage of this new acquisition because of its business model.
Google and Apple are consumer companies. Microsoft, on the other hand, gets most of its revenue from enterprise customers. Its Office suite and its collaboration tools were already being used in many hospitals even before it announced its health care solution. That’s why it was already in a good spot to penetrate the market.
And if you look over at the Cloud for Healthcare page , the company has done a great job of integrating its health solution into tools that many health care workers were already used to working with, such as Outlook, Teams, Office, and messaging apps. The real advantage is the infrastructure Microsoft has built, the integration of all these services with clinical applications, and terrific data engineering that makes it possible to deploy machine learning models and data analytics tools that span various data sources.
This is the perfect infrastructure on top of which Microsoft can build an AI factory , where it creates machine learning models that provide ways to improve existing products and build new ones. The acquisition will enable Microsoft to accelerate its growth by leveraging Nuance’s reach in the health care sector. Now every Nuance customer will also be a Microsoft customer.
Beyond health care Before the acquisition, Microsoft was already using Nuance’s Dragon AI technology in its health care solution, transcribing virtual visits, taking notes, and integrating information into patients’ health records. Now, with the acquisition of Nuance, Microsoft will also have full access to its technology and will be able to take its new AI transcription power beyond health care.
“Beyond health care, Nuance provides AI expertise and customer engagement solutions across Interactive Voice Response (IVR), virtual assistants, and digital and biometric solutions to companies around the world across all industries,” Microsoft says in its blog.
It will be interesting to see how Nuance’s technology will be integrated into other Microsoft enterprise products.
One thing that is also worth watching is how Microsoft will be able to combine Nuance’s AI with other technologies it’s experimenting with. For instance, Microsoft already has an exclusive license to OpenAI’s GPT-3 language model. Nuance’s transcription technology and GPT-3 might become a powerful combination for the enterprise.
Microsoft’s AI strategy Microsoft might not be able to predict which company will be successful in five years’ time, especially in a field as volatile as AI. But it’s banking on the one constant that is always needed in the field: compute power. Microsoft uses its huge Azure platform to develop ties with companies, often providing them with subsidized access to its cloud-based machine learning tools. It also makes many of its investments in Azure credits, ensuring companies it invests in will be locked into its platform. This puts Microsoft in a position to both help those companies grow and learn from them. And the investment pays off when the company’s technology and business model mature.
Earlier this year, I wrote about Microsoft’s investment in the self-driving car startup Cruise , which also made Microsoft Azure the preferred cloud of Cruise and its owner General Motors. I noted at the time that Microsoft’s success is in maintaining a safe distance from developing sectors. Instead of making one big acquisition, Microsoft casts a wide net by making smaller investments in several companies.
This gives it a good foothold into many innovative sectors. As these sectors mature, Microsoft is gradually entering partnerships with the more successful startups. And when the time is right, it will acquire the company that gives it the best leverage in the market.
We can see this exact cycle with Nuance as Microsoft evolved from being Nuance’s cloud provider to its partner to its owner. And this evolution tells us a lot about Microsoft’s AI strategy, which I think is very smart, given how fast things can change in the AI industry. The enterprise AI sector has come a long way toward creating applications that can solve real-world problems. But we still haven’t figured out many things. And as new technologies and companies continue to develop, Microsoft will be watching and picking winners.
Ben Dickson is a software engineer and the founder of TechTalks, a blog that explores the ways technology is solving and creating problems.
This story originally appeared on Bdtechtalks.com. Copyright 2021.
VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact.
Discover our Briefings.
The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat.
All rights reserved.
"
|
15,744 | 2,021 |
"SurveyMonkey aligns with ServiceNow on digital business transformation | VentureBeat"
|
"https://venturebeat.com/2021/01/13/surveymonkey-aligns-with-servicenow-on-digital-business-transformation"
|
"SurveyMonkey aligns with ServiceNow on digital business transformation
ServiceNow and SurveyMonkey today announced that they have teamed up to make it simpler for organizations to employ surveys and better gauge the impact digital business transformation initiatives are having on their organization.
As a consequence of the alliance, business leaders will be able to better understand how employees and end customers perceive any change a business makes — without having to launch a separate application. The alliance comes at a time when SAP is moving toward spinning out rival Qualtrics as an independent public company with a valuation of $14.4 billion. Medallia and Zoho are among other rivals that provide standalone tools for tracking customer and employee experiences using surveys.
Rather than requiring end users to access a separate application to monitor their customer and employee experience, it makes sense to surface those insights within the context of a workflow process already managed by ServiceNow, SurveyMonkey president Tom Hale said in an interview with VentureBeat. “There’s been an evolution,” he explained. “The best practice is now you do this as part of a system.” That battle for control over customer and employee experience monitoring is intensifying as organizations accelerate nascent digital business transformation projects. Earlier this week, Microsoft corporate VP for commercial management experiences Brad Anderson revealed he is leaving to join Qualtrics.
Ironically, ServiceNow is currently led by Bill McDermott, the former CEO of SAP who spearheaded the acquisition of Qualtrics in 2019. Hale noted that in both positions, McDermott has recognized the critical role customer and employee experience plays in digital business transformation.
SurveyMonkey has already established relationships with other leading enablers of digital business transformation, such as Salesforce. The data SurveyMonkey collects via hundreds of millions of surveys conducted each month will help continuously shape those processes, Hale noted, adding that organizations are otherwise blindly transforming processes without any kind of feedback loop from customers and employees.
Most of the usage SurveyMonkey sees today is still via its web interface. However, as digital business transformation initiatives continue to expand, Hale said SurveyMonkey surveys will increasingly be surfaced within third-party applications. And much of that data will soon also be analyzed by machine learning algorithms within both the SurveyMonkey platform and the AI models constructed by partners such as ServiceNow and Salesforce.
Even though it appears COVID-19 vaccines might be widely distributed by the middle of this year, most organizations are never going to return to business as usual. Customers and employees have become accustomed to a certain level of flexibility that has arisen primarily because many processes have already moved online.
Those online processes may not be especially sophisticated yet, but it’s clear companies will have to transform. Otherwise, customers will look for more flexible entities to transact with while employees shift allegiance to employers that don’t require them to commute to an office every day.
As a consequence, the level of investment being made in digital business transformation initiatives is expected to remain high through 2023. IDC is forecasting that digital transformation (DX) investments will grow at a 15.5% compound annual growth rate (CAGR) to reach $6.8 trillion over the next three years.
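Growth figures like that compound year over year. As a quick illustrative sketch of the arithmetic (the `compound` helper and the $100 starting value are hypothetical, for illustration only, and not IDC's methodology), a 15.5% CAGR multiplies a base value by roughly 1.54x over three years:

```python
# Illustrative only: compound a base value at a fixed annual growth rate.
def compound(value: float, rate: float, years: int) -> float:
    """Grow `value` at `rate` per year for `years` years."""
    for _ in range(years):
        value *= 1 + rate
    return value

# $100 growing at a 15.5% CAGR for three years is about $154.08,
# i.e. roughly a 1.54x multiple on the starting spend level.
print(round(compound(100.0, 0.155, 3), 2))
```

The same multiple applies at any scale, which is the sense in which a mid-teens growth rate adds up to a multi-trillion-dollar figure over a three-year horizon.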
Precisely how that spending will manifest itself remains to be seen. But companies will undoubtedly need tools to measure the actual impact of any digital business transformation initiative they undertake.
"
|
15,745 | 2,021 |
"ServiceNow adds new AI and low-code development features | VentureBeat"
|
"https://venturebeat.com/2021/03/11/servicenow-adds-new-ai-and-low-code-development-features"
|
"ServiceNow adds new AI and low-code development features
Enterprise cloud-based solutions provider ServiceNow today launched its Now Platform Quebec release, which the company says is designed to help enterprises innovate more quickly in a world changed by the pandemic. Quebec brings several new AI and machine learning-powered and low-code capabilities, including a predictive AIOps feature that anticipates issues and automates resolution and a virtual agent that provides guided setup and topic recommendations.
“In today’s challenging environment, organizations worldwide are pivoting fast, adopting new, distributed models of working and creating new workflow‑enabled ways of operating with more agile, resilient, digital enterprise value chains,” Chirantan Desai, chief product officer at ServiceNow, said in a blog post. “Customers are relying on ServiceNow’s Now Platform to deliver enterprise digital workflows, create new business models, enhance productivity and enable great customer and employee experiences in any operating environment. This newest version of the Now Platform further enhances the must‑have enterprise digital tools customers need today.”
Among the highlights in Quebec are ITOM Predictive AIOps, which builds on ServiceNow’s Loom acquisition in January 2020. ITOM Predictive AIOps aims to give users deeper insights into their digital operations to minimize and fix issues before they become real problems.
AIOps, short for AI for IT operations, is a category of products that enhance IT by leveraging AI to analyze data from tools and devices. Research and Markets anticipates it will grow by $14.3 billion to be a $20.1 billion market by 2027. That might be a conservative projection in light of the pandemic, which is forcing IT teams to increasingly conduct their work remotely. In lieu of access to infrastructure, AIOps solutions could help prevent major outages, the cost of which a study from Aberdeen Research pegged at $260,000 per hour.
Quebec also introduces AI Search, underpinned by technology acquired in ServiceNow’s purchase of Attivio. AI Search delivers intelligent search results and actionable information, complementing Quebec’s Engagement Messenger that extends self-service to third-party portals to enable AI search, knowledge management, and case interactions. Also new in Quebec is the aforementioned virtual agent, which delivers AI-powered conversational experiences for IT incident resolution.
ServiceNow also today unveiled Creator Workflows, a set of low-code development tools to transform manual processes into digital workflows. App Engine Studio offers a development environment where users can collaborate and build applications, while App Engine Templates gives teams access to prebuilt workflow building blocks.
“As businesses shift from emergency response to long‑term recovery and distributed work becomes the norm, organizations are accelerating digital transformation efforts and investing in new technologies that promote continuity and agility,” Philip Carter, group VP at IDC, said in a statement. “The ability to deliver end‑to‑end digital experiences for employees and customers alike will be a crucial competitive differentiator. There is significant customer traction, accelerated by the pandemic, for unified technology platforms that connect systems, silos and processes to enable these connected, digital‑first enterprise models.”
Now Platform Quebec is now available to ServiceNow customers. Customers using it include Nike, Adobe, Deutsche Telekom, Logitech, Medtronic, and St. Jude Children’s Research Hospital.
"
|
15,746 | 2,020 |
"CryptoKitties creator Dapper Labs raises $12 million for consumer-focused Flow blockchain | VentureBeat"
|
"https://venturebeat.com/2020/08/06/cryptokitties-creator-dapper-labs-raises-11-4-million-for-consumer-focused-flow-blockchain"
|
"CryptoKitties creator Dapper Labs raises $12 million for consumer-focused Flow blockchain
CryptoKitties creator Dapper Labs has raised $12 million in funding from NBA stars and others to build a consumer-focused Flow blockchain that supports digital collectibles. The company aims to enable users to own the digital items they collect with the assurance that the goods are genuine.
Dapper Labs will use the round to help scale the Flow blockchain to support previously announced collaborations with Dr. Seuss Enterprises , Warner Music, and UFC, among others. CEO Roham Gharegozlou said in an email to GamesBeat that Flow will be an easy-to-use blockchain that enables quick access for anyone looking to join the transparent and secure decentralized digital ledger technology.
Above: These Dr. Seuss digital collectibles are based on blockchain technology.
Dapper Labs also announced initial results from its first phase of beta testing for NBA Top Shot, its upcoming game developed for the Flow blockchain. NBA Top Shot has already generated over $1.2 million in revenue from the first 500 players while in closed beta. These players have spent thousands of hours opening packs, trading moments, and completing collections together.
Top Shot was designed in partnership with the NBA and NBPA. The experience captures the feeling of trading cards and sneakers, but it does so within a digital universe. New invitations to the beta are being released in waves; to apply, sign up here and then visit this Discord to request early access.
Investors in the round include Andre Iguodala of the Miami Heat basketball team, JaVale McGee (Los Angeles Lakers), Spencer Dinwiddie (Brooklyn Nets), Garrett Temple (Brooklyn Nets), and Aaron Gordon (Orlando Magic).
Other investors include Samsung, Andreessen Horowitz, Union Square Ventures, Coinbase Ventures, Distributed Global, Valor Capital Group, A.Capital, BlockTower Capital, Blockchange Ventures, EONXI Ventures, Reed Company, Greenfield One, North Island Ventures, Republic Labs, L1 Digital AG, and Pirata Capital.
Dapper Labs is based in Vancouver, Canada and has 85 employees. The company has raised $51.05 million to date.
Update, 7:52 a.m. Pacific: Dapper Labs says the amount raised was $12 million.
"
|
15,747 | 2,018 |
"Apple Music will come to Amazon Echo devices in mid-December | VentureBeat"
|
"https://venturebeat.com/2018/11/30/apple-music-will-come-to-amazon-echo-devices-in-mid-december"
|
"Apple Music will come to Amazon Echo devices in mid-December
Apple has been working to expand the availability of its Apple Music service.
Instead of keeping streaming music service Apple Music exclusive to its own devices and apps, Apple has let Google Android and Facebook Messenger users sign on in previous years. Now the service is coming to Amazon’s Echo devices, Amazon announced today, with full Alexa integration.
Apple Music for Echo will provide access to over 50 million songs, enabling Alexa to stream everything from individual tracks and playlists to radio stations, even including Beats 1. Users will just have to enable the Apple Music skill in the Alexa app and link their accounts to make the new feature work.
“Music is one of the most popular features on Alexa — since we launched Alexa four years ago, customers are listening to more music in their homes than ever before,” said Amazon Devices SVP Dave Limp. “We’re thrilled to bring Apple Music — one of the most popular music services in the U.S. — to Echo customers this holiday.”
With over 50 million subscribers, Apple Music is believed to have surpassed Spotify in U.S. installed base during the summer, but both services continue to grow in popularity. Regardless of whether it’s currently number one or two in the country, however, Apple Music support is a win-win for large numbers of Apple and Amazon users.
Amazon’s frosty relationship with Apple has thawed somewhat in recent months, as the sometime rivals reached an agreement to bring new Apple devices back to Amazon’s retail storefront, pushing small, unauthorized vendors out of the store. While Amazon opted not to sell Apple products that compete directly with its own audio and video offerings, it is offering iPads, iPhones, and Macs, as well as accessories.
The exact date for the Echo Apple Music rollout isn’t clear, but Amazon says Apple Music will start to become available on Echo devices during “the week of December 17.” That’s just in time for the end of the 2018 holiday season and a good opportunity for Apple to keep some new iPhone and iPad users from switching to Amazon’s Music Unlimited service, which has become competitive on both price and music catalog size as it chases Apple Music subscribers.
"
|
15,748 | 2,020 |
"Super Mario 3D All-Stars becomes a fast sales hit for Nintendo | VentureBeat"
|
"https://venturebeat.com/2020/10/16/super-mario-3d-all-stars-becomes-a-fast-sales-hit-for-nintendo"
|
"Super Mario 3D All-Stars becomes a fast sales hit for Nintendo
We can complain about its limited availability or half-hearted porting job all we want, but Super Mario 3D All-Stars has become a big sales success for Nintendo.
The compilation was the No. 2 best-selling game in the U.S. in September, according to The NPD Group. It is also already the No. 10 best-selling game of the year so far. Super Mario 3D All-Stars launched for Switch on September 18. It includes Super Mario 64, Super Mario Sunshine, and Super Mario Galaxy.
“Super Mario 3D All-Stars launch month physical dollar sales rank as the 6th biggest for a Nintendo-published title in U.S. history,” NPD video game industry analyst Mat Piscatella noted. “Super Mario 3D All-Stars trails only Super Smash Bros. Ultimate, Super Smash Bros. Brawl, Animal Crossing: New Horizons, The Legend of Zelda: Breath of the Wild, and Pokémon Stadium in physical launch month dollar sales.”
For whatever reason, 3D All-Stars will only be available for purchase until March 31. That forced scarcity could be what is driving up these launch sales numbers.
Or people could just be happy to have three classic 3D platformers on their Switch. Now that Nintendo knows these games can sell, maybe we’ll get Super Mario Galaxy 2 on Switch.
"
|
15,749 | 2,021 |
"GamesBeat Decides: The best (and worst) 'core' Mario games | VentureBeat"
|
"https://venturebeat.com/2021/01/24/gamesbeat-decides-the-best-and-worst-core-mario-games"
|
"GamesBeat Decides: The best (and worst) ‘core’ Mario games
Many Mario games are some of the best pieces of digital entertainment ever created. The series seemed like an obvious choice for a new tier list-making video series … so here we are.
My cohort Jeff Grubb and I rank every game in the core Mario series. What does “core” mean? It’s whatever we say it means.
There is some logic to our process, which we explain in the video above. And, yes, few Mario games are “bad.” But there are a few entries in the franchise that provoke little more than shoulder shrugs.
If you want to make your own Mario tier list, you can follow this link.
"
|
15,750 | 2,021 |
"Mario Golf: Super Rush is coming to Switch | VentureBeat"
|
"https://venturebeat.com/2021/02/17/mario-golf-super-rush-is-coming-to-switch"
|
"Mario Golf: Super Rush is coming to Switch
Nintendo announced today during its Nintendo Direct presentation that a new Mario Golf, subtitled Super Rush, is coming to Switch. It will launch on June 25.
Mario Golf is one of the most popular Mario sports franchises that Nintendo makes. The Switch already got a Mario Tennis title with Mario Tennis Aces in 2018.
The last Mario Golf, World Tour, came out for the 3DS in 2014. This new game on Switch lets you use normal button controls or motion controls with a single Joy-Con controller.
Super Rush also has an RPG-based story mode where you play as a Mii. The much-loved Mario Golf: Advance Tour for the Game Boy Advance also had an RPG-based story mode.
And it has a speed mode in which everyone in your group tees off at the same time and then races down the fairway.
"
|
15,751 | 2,019 |
"Analogue Pocket is a gorgeous handheld for playing for retro games | VentureBeat"
|
"https://venturebeat.com/2019/10/16/analogue-pocket-handheld"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Analogue Pocket is a gorgeous handheld that plays Game Boy games and more Share on Facebook Share on X Share on LinkedIn The Analogue Pocket is ready to take over handheld retro gaming .
Ever since Analogue released its Nt Mini console that re-creates Nintendo Entertainment System hardware, I wondered when it would do the same for Game Boy. Well, here comes the Analogue Pocket. This is a sleek device that looks like a modern reinvention of a Game Boy Pocket. The difference is that it has four face buttons, two shoulder buttons, and a hyper high-resolution screen. Oh, and it can play Game Boy, Game Boy Color, Game Boy Advance, Game Gear, Neo Geo Pocket Color, and Atari Lynx games.
The Analogue Pocket is coming in 2020 for $200.
And while that might seem expensive, Analogue is piling on features to ensure you get your money’s worth. This is likely the best way to experience the early generations of handheld gaming.
“I’ve wanted to make Pocket for 10 years,” Analogue boss Christopher Taber told GamesBeat. “Nearly all of Analogue’s history leads up to this product. Pocket is Analogue’s Illmatic. It’s legendary.” All your cartridges will work — although you will need an adapter for anything that isn’t in the Game Boy family. Those games will also output to a higher resolution, which enables them to take advantage of the Pocket’s 3.5-inch 1,600-by-1,440 display. That screen has 615 dots per inch, which is higher than the iPhone 11 Pro Max’s display.
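Analogue's pixel-density claim can be sanity-checked from those panel specs. A quick back-of-envelope in Python (assuming the 1,600-by-1,440 native resolution Analogue quotes):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density: pixels along the diagonal divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# 1,600 x 1,440 pixels on a 3.5-inch panel
print(round(ppi(1600, 1440, 3.5)))  # 615
```

The result lands on 615 ppi, matching Taber's figure.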
“Pocket’s display is 10 times the resolution of an original Game Boy,” said Taber. “It has 615 ppi. It outclasses any screen in a portable game system by a long shot. Something like the Switch screen doesn’t even come close. At 3.5 inches and in its aspect ratio, it’s about the same size as a 3DS XL screen. It’s big.” And as with previous Analogue consoles, the Pocket uses field-programmable gate-array chips. This enables Taber’s engineers, particularly the brilliant Kevin “Kevtris” Horton, to build cores that exactly mimic the behavior of the original hardware. This should guarantee that the Pocket is among the most accurate ways to play these games — especially in a handheld.
Analogue Pocket is the retro-gaming device everyone was waiting for
The Analogue Pocket is not just about plugging in your old games. Analogue is including a synthesizer and sequencer program called Nanoloop for creating music. That could turn it into a popular tool for electronic music artists.
“Nanoloop is a legendary piece of music making software designed for the GBA,” said Taber. “It costs €50 for the cartridge to use it on an original GBA, so we are thrilled to have it included on Pocket — it’s a powerful tool and a huge value add.” The biggest feature, however, is the inclusion of a second FPGA chip that Analogue is opening up to developers. This will empower the classic-gaming community to build and port their own cores to run other games on the Pocket similar to how the MiSTER FPGA device works. That is huge, and it’s awesome that it is coming in what looks like an awesome handheld form factor with an incredible screen.
You can even purchase a separate dock to output the Pocket to a television via HDMI. The dock doesn’t have a price yet.
“Pocket is the conclusion to all of retro portable gaming history,” said Taber. “It completes an entire era of portable gaming all in one product. Able to be played portably, on your HD with Dock, and even on CRTs and PVMs with Pocket + Dock + DAC.”
Analogue Pocket is a leap forward for portable retro gaming
I haven’t tried the Analogue Pocket yet, but let’s assume it works as well as the company’s excellent Super Nt and Mega Sg devices.
In that case, the Pocket should flip the entire retro portable gaming scene on its head.
Right now, you can go on Amazon and get a handheld that emulates old games for like $40. And these devices are fine. They have decent chips based on Raspberry Pi and basic LCD screens. But those products always use software emulation, and that means a lot of games are inaccurate. For example, I hate playing Rhythm Heaven on them because the audio is always off, which ruins that game.
But the Pocket is the complete opposite. It emulates the hardware, and Analogue dedicates itself to accuracy. And the specs make it sound like it will have the best screen on any handheld ever. That will be a huge change from the cheap displays you find in a lot of portable emulator consoles.
Finally, the potential to support cores for other devices makes this a must-have device for retro-gaming enthusiasts. The MiSTER is a great way to do that now, but that costs in the neighborhood of $175, and that just gets you a naked kit of electronics. It’s not a polished product like the Analogue Pocket.
So yeah, count me in.
"
|
15,752 | 2,019 |
"OneLogin raises $100 million to help enterprises manage access and identity | VentureBeat"
|
"https://venturebeat.com/2019/01/10/onelogin-raises-100-million-to-help-enterprises-manage-access-and-identity"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages OneLogin raises $100 million to help enterprises manage access and identity Share on Facebook Share on X Share on LinkedIn The reception area at OneLogin's San Francisco office.
Not every startup manages to reach — or exceed — $100 million in financing. Those that do join an exclusive group of fundraisers and can enjoy a massive vote of confidence from the venture capitalists and angels who help put them there.
As of this morning, OneLogin is the latest to join the club. The San Francisco cloud-based identity and access management provider today announced that it has secured $100 million in a Series D round led by Greenspring Associates, with contributions from existing investors Charles River Ventures and Scale Venture Partners. It’s very much justified: OneLogin now counts more than 2,500 enterprises among its clients, including California’s Bay Area Rapid Transit, Airbus, British Red Cross, Change.org, Fujitsu, Indeed, NASA, Pandora, SoftBank, and Broward College. And it has more than tripled its annual recurring revenue (ARR) over the past three years.
OneLogin raised $22.5 million in June 2018 as part of an extension of its Series C round, following $10 million and $25 million raises in May 2017 and December 2014, respectively. The Series D brings its total capital raised to more than $170 million.
OneLogin says that the influx of cash will be put to “accelerat[ing] adoption” of new product offerings, such as multi-factor authentication, better serving its enterprise customers, and extending access management to networks and devices using cloud infrastructure. It also says the funds will be used to increase OneLogin’s North American and European footprints and to build out the startup’s 250-person team “across levels and disciplines.”
OneLogin, for the uninitiated, provides a suite of tools that help organizations manage distinct app environments, networks, and devices in a unified fashion. Its cloud-based solution — appropriately dubbed Unified Access Platform — offers features such as single sign-on, compliance reporting, and a centralized cloud directory that collates directories from G Suite, Workday, and other providers. It taps machine learning to detect high-risk login attempts and trigger additional authentication factor requests, in part by taking into account the networks users are connected to and the devices they’re using. And its mobile identity management product enables one-click access to web enterprise apps through a smartphone or tablet device.
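The risk-based authentication described above can be illustrated with a toy scoring rule. This is a simplified sketch, not OneLogin's actual model (which, per the article, uses machine learning); the factor weights and threshold are assumptions:

```python
def login_risk_score(known_device, trusted_network, failed_attempts):
    """Toy risk score: higher means riskier (illustrative weights only)."""
    score = 0
    if not known_device:
        score += 40
    if not trusted_network:
        score += 30
    score += min(failed_attempts, 3) * 10  # cap the penalty for retries
    return score

def requires_mfa(score, threshold=50):
    """Trigger an additional authentication factor above the threshold."""
    return score >= threshold

print(requires_mfa(login_risk_score(True, True, 0)))    # False: low risk
print(requires_mfa(login_risk_score(False, False, 2)))  # True: step up auth
```

A familiar device on a trusted network sails through, while an unknown device on an untrusted network gets stepped up to an additional factor.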
Last year saw the launch of OneLogin Access, a new iteration of OneLogin’s Web Access Management product that offers a single tool for managing identities across on-premises applications and public cloud environments. It’s a capability that sets OneLogin apart from competitors that don’t offer comparable on-premises integrations.
OneLogin has been largely frugal with its money so far, save a few strategic acquisitions to bolster its portfolio. It bought CafeSoft, a web access management startup, in 2015 and in June 2016 acquired Portadi, a San Jose startup that built a framework for creating custom connectors to third-party enterprise apps. A few months later, it picked up Square Secure Workspace, which engineered a lightweight virtual mobile container solution that isolates employees’ work content from their personal content.
There’s been a bump or two along the way to OneLogin’s Series D, to be sure. In May 2017, an attacker managed to break into one of its Amazon Web Services accounts, which necessitated swapping out every security certificate connected to services its customers used with OneLogin’s platform. Unsurprisingly, second-quarter revenue took a hit, and OneLogin lost employees and customers.
OneLogin CEO Brad Brooks took the incident as a challenge to improve the company’s security practices. There was another upside, he believes: Emerging from a breach relatively unscathed made OneLogin a more appealing vendor. Indeed, Airbus — one of OneLogin’s marquee clients — announced after the hack occurred that it would adopt OneLogin.
“There are some customers that did leave us. They said, ‘You know what, [we] can’t handle it’,” he told VentureBeat in an earlier interview.
“Most of them stayed with us … [That breach] has made us who we are. It didn’t kill us, but it certainly made us stronger.” There are more hurdles on the horizon. OneLogin faces stiff competition from the likes of Okta, Ping, and Centrify, not to mention behemoths such as Microsoft, Oracle, and IBM. In the last fiscal quarter, Okta’s subscription revenue grew 59 percent year-over-year to reach $76.8 million.
But OneLogin’s investors aren’t worried.
“I’m happy to say I still feel I get to be in one of the top two — or maybe three, if you count Microsoft — players in the space,” Rory O’Driscoll, a partner with Scale Venture Partners, told VentureBeat in an earlier interview.
OneLogin recently opened offices at the Georgia tech accelerator Atlanta Tech Park and a development center in Seattle. Its goal, Brooks told VentureBeat earlier this year, is to hit $100 million in ARR within the next two and a half years.
VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact.
"
|
15,753 | 2,021 |
"OwnBackup raises $167.5 million to bring cloud data backups to Salesforce and beyond | VentureBeat"
|
"https://venturebeat.com/2021/01/28/ownbackup-raises-167-5-million-to-bring-cloud-data-backups-to-salesforce-and-beyond"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages OwnBackup raises $167.5 million to bring cloud data backups to Salesforce and beyond Share on Facebook Share on X Share on LinkedIn OwnBackup homepage Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
OwnBackup , a platform that offers cloud data backup and recovery services to Salesforce customers, has raised $167.5 million in a series D round of funding co-led by Insight Partners, Salesforce Ventures, and Sapphire Ventures. The company is now valued at nearly $1.4 billion.
Founded out of New Jersey in 2015, OwnBackup promises administrators and developers peace of mind with a platform that provides backup, recovery, compare, and archive functionality for complex datasets. Although OwnBackup is better known for its integration with the Salesforce platform, it has sought to expand its coverage to other ecosystems, including Sage, nCino, and Veeva.
As more companies embrace cloud-based services, a trend that has been expedited by the pandemic, the need to provide end-to-end data protection — including creating backups and recovery — has become urgent. Indeed, there has been a lot of activity in the space of late, with Insight Partners — an OwnBackup investor — acquiring Veeam for $5 billion last year, while Cohesity secured $250 million at a $2.5 billion valuation.
OwnBackup had raised around $100 million before now, including a $50 million tranche less than a year ago, and with its latest cash injection the company plans to continue its global expansion and enable companies of all sizes to “secure their most mission-critical SaaS data.” Innovation Endeavors, Vertex Ventures, and Oryzn Capital also participated in the round.
"
|
15,754 | 2,021 |
"Scratchpad raises $13 million to power up Salesforce with a productivity workspace for sales teams | VentureBeat"
|
"https://venturebeat.com/2021/02/03/scratchpad-raises-13-million-to-power-up-salesforce-with-a-productivity-workspace-for-sales-teams"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Scratchpad raises $13 million to power up Salesforce with a productivity workspace for sales teams Share on Facebook Share on X Share on LinkedIn Scratchpad cofounders Pouyan Salehi (CEO) and Cyrus Karbassiyoon CTO) Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
If you work in sales, there’s a good chance Salesforce will be a critical tool — the customer relationship management (CRM) platform claims nearly a fifth of the market in terms of revenue, more than triple its nearest rival.
But pervasiveness doesn’t necessarily mean popularity, with common Salesforce complaints ranging from “clunky UI” and “too much data entry” to “it’s just not user friendly.” But one startup is setting out to make Salesforce a more enjoyable and productive experience by building a modern workspace directly on top of the CRM.
Founded out of San Francisco in 2019, Scratchpad has created a suite of productivity tools spanning notes, spreadsheets, tasks, Kanban boards, search, collaboration, and more. The company bundles them under a friendlier interface through which they can interact directly with all their sales data and workflows. Scratchpad is designed to free up sales personnel to do what they do best and today announced it has raised $13 million in a series A round of funding led by Craft Ventures, with participation from Accel. Craft Ventures’ David Sacks will now join Scratchpad’s board.
“Most revenue teams have highly trained and highly educated salespeople spending more than half of their time doing administrative work instead of selling,” Scratchpad CEO and cofounder Pouyan Salehi told VentureBeat. “Scratchpad reduces and nearly eliminates tedious admin time, increasing sales performance across the organization.”
Scratch that
The Scratchpad platform was developed from “thousands of conversations with sales professionals and designed specifically for their needs,” Salehi said. This includes Scratchpad Notes, which sales reps can use during or after calls to record insights or collaborate with others — and which connects directly into Salesforce. The platform also offers different views for managing pipelines, such as a grid view, which is basically inline editing for Salesforce records, circumventing the need for spreadsheets.
Above: Scratchpad: Grid view
Users can also create Kanban boards to provide a more detailed overview of deals at every stage.
Above: Scratchpad: Kanban board
Scratchpad also offers a search interface that enables users to search for accounts, contacts, leads, notes, and more — this includes content stored in Gmail, Google Calendar, Slack (which Salesforce is on the cusp of buying for $27.7 billion), or the web.
“If you are searching for a prospect on LinkedIn using Google Chrome, you can simply use Scratchpad search to see if the prospect is already in your Salesforce instance and if that person’s details need to be updated,” Salehi explained. “Or if you’re looking at your calendar for an upcoming meeting with a customer, you can access all of your relevant Salesforce information related to the customer directly from the calendar event without having to switch tabs to Salesforce.” Elsewhere, Scratchpad is designed with collaboration in mind, enabling users to share sales notes with other members of their team — so if a sales executive finalizes a deal, all notes can be easily passed to the customer service team.
Scratchpad operates a software-as-a-service (SaaS) business model, with tiers ranging from free to the $39 per-user team plan and up to the business plan, which comes with customized pricing. In terms of deployment, users can either add an extension to Chrome or go through a dedicated web app in any other web browser. The user then logs into Scratchpad using their Salesforce credentials.
Ecosystem
Salesforce is already a fairly extensible platform, with AppExchange offering a ton of integrations with third-party enterprise applications. In fact, Salesforce CEO Marc Benioff coined and trademarked the term “App Store” and registered the domain before gifting it all to Steve Jobs ahead of Apple’s venture into the world of smartphones. Salesforce’s success over the past couple of decades is in large part due to its partnership ecosystem, with countless third parties building highly successful businesses off its back.
So isn’t what Scratchpad is now offering already possible through other integrations? Not so, says Salehi, who explains the issue isn’t so much extensibility as it is usability.
“Usability is what matters most for revenue teams,” he said. “It is why you see implementations of Salesforce with all sorts of customizations and extensions, and yet adoption of those customizations remains very poor. In fact, we see a negative correlation between extensibility and end-user adoption. Many of these extensions are primarily at the database layer of Salesforce. Very few extend to the user interface, so they are not optimized for end users.” While workflows vary greatly between Salesforce customers, many users will keep notes or to-do lists in separate documents, such as Word, Excel, or a Google Doc, and then copy/paste the relevant text into Salesforce. Alternatively, they may just “stick with the status quo,” as Salehi puts it, and update records directly in Salesforce.
“The unfortunate reality is Salesforce is a phenomenal database but is not user friendly for salespeople to actually use,” he said. Scratchpad is ultimately looking to reduce the hours wasted on admin.
In its short life, Scratchpad has managed to amass a pretty impressive roster of customers, including Adobe, Autodesk, Box, Snowflake, Splunk, and Twilio. And while it was built purely for Salesforce, the company is open to expanding its horizons in the future.
“We do have plans to bring Scratchpad to other CRMs, but today the pain is so deep and pervasive for sales teams using Salesforce, we started there,” Salehi said.
"
|
15,755 | 2,019 |
"Productiv raises $8 million to help companies track SaaS app engagement | VentureBeat"
|
"https://venturebeat.com/2019/04/01/productiv-ai-raises-8-million-to-help-companies-track-saas-app-engagement"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Productiv raises $8 million to help companies track SaaS app engagement Share on Facebook Share on X Share on LinkedIn Productiv's cloud dashboard surfaces organization-wide app metrics.
Enterprise software-as-a-service (SaaS) adoption has never been higher — companies use 16 SaaS apps on average and 73 percent say nearly all of their apps will be SaaS by 2020, driving the global industry to an estimated $185.8 billion in the next year. But coinciding with this climb is a decline in app usage transparency. A recent survey of IT leaders conducted by Numerify found that 45 percent don’t have a complete picture of key apps and business health services, with 38 percent admitting they couldn’t analyze IT data on the fly and 57 percent saying they lacked an overview of IT performance across projects and employees.
The market’s relative opaqueness motivated Jody Shapiro, formerly head of product management for Google Analytics, to investigate a metrics-driven solution. Unable to find one, he developed his own in Productiv, which emerged from stealth this week with $8 million in series A funding led by Accel partner Steve Loughlin, former CEO of RelateIQ (now SalesforceIQ).
“CIOs at Fortune 1000 companies tell us that they want to drive new business value from their software investments. Many of those companies spend tens or hundreds of millions annually on SaaS alone,” Shapiro told VentureBeat via email. “Those CIOs tell us that application engagement insights give an unprecedented view into SaaS application usage, and it’s helping them understand how those employees collaborate and drive their teams’ productivity.”
Above: Identifying most-used SaaS apps with Productiv.
To that end, Productiv’s cloud-based dashboard integrates with single sign-on tools to track login activity and extract purchase and license data from contracts, finance, and expense reporting systems, offering an organization-wide view of agreements and expired software. Furthermore, it surfaces real-time usage and engagement statistics that can highlight redundant apps. Configurable rules and licenses enable admins to reclaim licenses automatically, and granular engagement logs — including charts that plot the number of engaged users, teams, and locations over time — make it easier to compare stats to industry benchmarks and to determine best practices that might boost productivity.
The endgame is to empower companies to make profit-boosting rightsizing decisions from app analytics, Shapiro says. Rather than just seeing that a division has, say, 300 Dropbox licenses and that 40 percent of team members uploaded files to Dropbox folders this past fiscal quarter, CIOs can drill down into the productivity impact of those licenses and estimate the potential cost savings of choosing not to renew them.
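The drill-down Shapiro describes is ultimately simple arithmetic. A minimal sketch using the article's 300-seat, 40-percent-engaged example (the $180 annual per-seat cost is a made-up figure for illustration):

```python
def rightsizing_savings(licenses, engaged_share, annual_cost_per_seat):
    """Seats with no engagement, and the annual spend tied up in them."""
    unused = round(licenses * (1 - engaged_share))
    return unused, unused * annual_cost_per_seat

# 300 Dropbox seats, 40% engaged last quarter, assumed $180/seat/year
unused, savings = rightsizing_savings(300, 0.40, 180)
print(unused, savings)  # 180 32400
```

Under those assumptions, 180 idle seats represent $32,400 a year in recoverable spend.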
“[One] of our customers had thousands of employees using three different SaaS storage applications,” the Productiv team explains. “Different teams and locations had separately purchased Box, Dropbox, and SharePoint over the years. The company was paying for far more licenses than they needed … [and] Productiv allowed [it] to understand which storage app to consolidate and standardize on, which … improved everyone’s ability to share and collaborate, reducing the time they spend working between applications, and increasing productivity.” Palo Alto, California-based Productiv is launching the service today in early access.
"
|
15,756 | 2,019 |
"Productiv raises $20 million to track SaaS engagement | VentureBeat"
|
"https://venturebeat.com/2019/11/06/productiv-raises-20-million-to-track-saas-engagement"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Productiv raises $20 million to track SaaS engagement Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Enterprise software-as-a-service (SaaS) adoption has never been higher. Companies use 16 SaaS apps on average, and 73% say nearly all of their apps will be SaaS by 2020, driving the global industry to an estimated $157 billion in the next year. But coinciding with this climb is a decline in app usage transparency. A recent survey of IT leaders conducted by Numerify found that 45% don’t have a complete picture of key apps and business health services, with 57% saying they lacked an overview of IT performance across projects and employees.
The market’s relative opaqueness motivated Jody Shapiro, formerly head of product management for Google Analytics, to investigate a metrics-driven solution. Unable to find one, he developed his own in Productiv , which today announced the close of $20 million in series B financing led by Norwest Venture Partners, with additional participation from strategic investor Okta Ventures and existing investor Accel. It brings the company’s total raised to nearly $30 million, following an $8 million series A.
Since launching in April 2019, Productiv says it’s signed up clients like Tricentis, Entelo, Cheetah Digital, Equinix, Fox, HashiCorp, Uber, LiveRamp, and Blue Diamond Growers. Additionally, it now counts among its growing employee roster engineers from Slack, eBay, Facebook, and Netskope.
“SaaS has democratized enterprise application purchasing and made everyone a buyer, with multiple teams using multiple applications simultaneously. Redundancy is high and productivity is low, with employees sometimes checking five different tools to access one document. All of this creates unnecessary cost and friction among teams,” said Shapiro. “Productiv’s application engagement analytics address this wide-spread enterprise need, and the combination of today’s funding with our customer traction in the last six months is strong validation of our mission to provide enterprises with the insights they need to drive maximum value from their SaaS applications.” To this end, Productiv’s cloud-based dashboard integrates with single sign-on tools to track login activity and extract purchase and license data from contracts, finance, and expense reporting systems. It surfaces real-time usage and over 50 different engagement dimensions that can highlight redundant apps, offering an organization-wide view of agreements and expired (or soon-to-expire) software. Configurable rules enable admins to reclaim licenses automatically, and granular logs — including charts that plot the number of engaged users, teams, and locations over time — make it easier to compare stats to industry benchmarks and to determine best practices that might boost productivity.
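To make the license-reclaim idea concrete, here is a minimal, hypothetical sketch of the kind of utilization math such a dashboard might surface. The app names, numbers, and threshold are illustrative only, not Productiv’s actual data model or API:

```python
# Hypothetical license-utilization check: flag SaaS apps whose purchased
# licenses see little engagement, as candidates for reclaiming or
# consolidation. All figures below are made up for illustration.

RECLAIM_THRESHOLD = 0.6  # flag apps where under 60% of licenses are in use

apps = [
    {"name": "Box", "licenses": 500, "active_users": 180},
    {"name": "Dropbox", "licenses": 300, "active_users": 270},
    {"name": "SharePoint", "licenses": 400, "active_users": 150},
]

def utilization(app):
    """Fraction of purchased licenses with an engaged user."""
    return app["active_users"] / app["licenses"]

# Apps whose utilization falls below the threshold are reclaim candidates.
reclaim_candidates = [
    (app["name"], utilization(app))
    for app in apps
    if utilization(app) < RECLAIM_THRESHOLD
]

print(reclaim_candidates)  # low-engagement apps to consolidate or downsize
```

A real system would feed `active_users` from single sign-on login activity and `licenses` from contract and expense data, as described above.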
The endgame is to empower teams to make profit-boosting rightsizing decisions from app analytics, says Shobhana Ahluwalia, head of IT at Productiv customer Uber. Rather than determining whether a division has, say, dozens or hundreds of Dropbox licenses and how many team members used those licenses in the past fiscal quarter, CIOs can drill down into the productivity impact and estimate the potential cost savings of choosing to cancel, upgrade, or downgrade service.
“Innovation is at the heart of Uber’s culture, and SaaS applications accelerate innovation by providing our employees a seamless collaboration experience no matter where in the world they are located,” said Ahluwalia. “Feature-level visibility into SaaS application engagement gives organizations a complete picture of how employees use applications to do their jobs, enabling them to focus adoption efforts on the applications that drive maximum value.”
"
|
15,757 | 2,015 |
"Pinterest open-sources Terrapin, a tool for serving data from Hadoop | VentureBeat"
|
"https://venturebeat.com/2015/09/14/pinterest-open-sources-terrapin-a-tool-for-serving-data-from-hadoop"
|
"Pinterest open-sources Terrapin, a tool for serving data from Hadoop
Pinterest is giving a presentation on Terrapin at Facebook's 2015 @Scale conference in San Jose on Sept. 14.
Pinterest today announced the availability of Terrapin, a new piece of open-source software that’s designed to more efficiently push data out of the Hadoop open-source big data software and make it available for other systems to use.
Engineers at Pinterest designed Terrapin as a replacement for the open-source HBase NoSQL database for this particular process, because HBase had proven slow and didn’t perform well beyond 100GB of data. The company looked at open-source key-value store ElephantDB as a possible alternative, but that wasn’t perfect, either.
“Terrapin provides low latency random key-value access over such large data sets, which are immutable and (re)generated in entirety,” Varun Sharma, an engineer on Pinterest’s core infrastructure team, wrote in a blog post on the news. “Terrapin can ingest data from S3, HDFS or directly from a MapReduce job, and is elastic, fault tolerant and performant enough to be used for various online applications at Pinterest, such as Pinnability and discovery data.” Pinterest has been using Terrapin in production for more than a year, and it’s now holding around 180TB of data, Sharma wrote. Now other companies will be able to try it out. It’s live on GitHub now.
Above: Terrapin includes a controller and a server. It works with existing tools like Apache Helix and Apache Thrift.
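The serving pattern Terrapin targets — immutable key-value data sets that are regenerated in entirety and swapped in whole — can be illustrated with a toy in-memory analogue. This is a sketch of the pattern only, not Terrapin’s actual API:

```python
# Toy analogue of serving immutable key-value data: each (re)generated
# data set replaces the previous snapshot atomically, and reads are
# low-latency random lookups against the current snapshot.

class ImmutableKVStore:
    def __init__(self):
        self._snapshot = {}

    def load_fileset(self, records):
        """Ingest a freshly (re)generated data set and swap it in whole.

        In Terrapin the input would come from S3, HDFS, or a MapReduce
        job; here it is just an iterable of (key, value) pairs.
        """
        self._snapshot = dict(records)  # old version is discarded entirely

    def get(self, key):
        """Random key-value access over the current snapshot."""
        return self._snapshot.get(key)

store = ImmutableKVStore()
store.load_fileset([("pin:1", b"signals-v1")])
store.load_fileset([("pin:1", b"signals-v2"), ("pin:2", b"more")])
print(store.get("pin:1"))  # always reflects the latest full regeneration
```

The real system distributes such snapshots across servers (the controller/server split shown above) rather than holding them in one process.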
Many major web companies make internal tools they’ve developed available to the public. Just a few hours ago, for instance, Facebook released the React Native framework for Android under an open-source license. Facebook, Twitter, LinkedIn, and Airbnb have been quicker to open-source their tools than Pinterest. Typically, Pinterest opts to explain how it has built its software in extensive blog posts instead — so today’s news is notable.
Previous open-source releases from Pinterest include Pinball, PINCache, and Secor.
San Francisco-based Pinterest, which had a valuation of $11 billion in March , had 47 million monthly unique visitors in the U.S. in July, according to Statista.
Sharma’s blog post provides much more detail on Terrapin.
"
|
15,758 | 2,016 |
"Pinterest open-sources its Teletraan tool for deploying code | VentureBeat"
|
"https://venturebeat.com/2016/02/12/pinterest-open-sources-its-teletraan-tool-for-deploying-code"
|
"Pinterest open-sources its Teletraan tool for deploying code
As promised last year when the company introduced it, Pinterest today announced that it has released its Teletraan tool for deploying source code on GitHub under an open source Apache license.
“Teletraan is designed to do one thing, deploy code,” Pinterest software engineer Baogang Song wrote in a blog post.
“Not only does it support critical features such as zero downtime deploy, rollback, staging and continuous deploy, but it also has convenient features, such as displaying commit details, comparing different deploys, notifying deploy state changes through either email or chat room, displaying OpenTSDB metrics and more.”

Above: Teletraan.
Teletraan works on Mac OS and Linux machines — no Windows support is available at this point, according to documentation for the tool. And while Teletraan can deploy code onto virtual machines, such as those available from public cloud Amazon Web Services, it’s not yet possible for it to deploy code into containers. But that capability is on the roadmap, Song wrote.
Like other web companies, Pinterest has been open-sourcing more and more of its software. For example, last year the company released tools for the Elixir programming language.
When Pinterest introduced Teletraan in a blog post in May 2015, the company mentioned that it would open-source the software by the end of the year. It’s showing up behind schedule by a few weeks — but hey, at least it’s available now for people to check out and build on.
And this isn’t something from Pinterest’s virtual junkyard. Teletraan handles a whopping 500 deploys of code every day for the company, which had more than 100 million monthly active users as of September.
"
|
15,759 | 2,021 |
"Oracle updates cloud data warehouse to boost access for data analysts | VentureBeat"
|
"https://venturebeat.com/2021/03/17/oracle-updates-cloud-data-warehouse-to-boost-access-for-data-analysts"
|
"Oracle updates cloud data warehouse to boost access for data analysts
Oracle today updated its Autonomous Data Warehouse to enable data analysts to load, transform, and generate insights from data with no intervention on the part of an internal IT team required.
The latest update to the Oracle data warehouse cloud service also enables data analysts to automatically create business models and discover patterns, along with providing a set of tools for preparing data and building machine learning models guided by AutoML, a set of open methods and processes for building AI models.
Other capabilities that have been added include support for the Python programming language, cognitive text analytics, graphs that can be invoked using a set of visualization tools, and an ability to deploy and manage native in-database models and ONNX-format classification and regression models outside of the core database.
The goal is to make it simpler for both professional and citizen data analysts to access data whenever they want using a fully autonomous platform, said George Lumpkin, VP of product management for Oracle. “We’re trying to provide what a cloud data warehouse should be,” he said.
As the provider of a data warehouse that is widely employed in the enterprise, Oracle is attempting to fend off increased competition from cloud service providers such as Amazon Web Services (AWS), Microsoft, Google, and Snowflake. Oracle alternatively makes available data warehouse platforms from a single vendor that can be deployed in both its cloud and on-premises IT environments at a time when the bulk of enterprise data continues to reside in local datacenters.
Rather than requiring organizations to make a wholesale shift to the cloud, Oracle enables them to make that transition at their own pace, Lumpkin said.
In contrast to rival cloud data warehouses, Oracle has built its approach around a managed service that eliminates the need to dedicate IT professionals to managing, securing, and maintaining a cloud platform, Lumpkin added.
Oracle also provides access to a low-code Oracle APEX (Application Express) tool that makes it possible for both “citizen integrators” and professional developers to build applications that can be deployed via REST application programming interfaces (APIs), Lumpkin added.
It’s too early to say to what degree business units within organizations might be willing to enable data analysts and scientists to access, manage, and analyze data without any oversight from a central IT function. However, as IT becomes increasingly automated, it’s apparent that many of the manual data management tasks that used to require an IT professional are falling by the wayside. The time when end users had to wait days for an IT professional to set up an SQL query to generate a report is all but over as self-service tools become more widely available.
In effect, Oracle is making a case for transferring data management tasks to its platform. Less clear is to what degree that might lower the total cost of IT for companies. It’s worth remembering that as data becomes more accessible, usage will increase, so organizations may wind up spending more on analytics. The difference is that they will hopefully be able to derive more business value from data that has become easier to interrogate in near real time.
In the meantime, the odds that most organizations will migrate all their relevant data to the cloud anytime soon are low. In fact, most organizations will be managing multiple data warehouses for years to come. The challenge will be determining what type of data needs to reside where based on use cases that are becoming more varied with each passing day.
"
|
15,760 | 2,019 |
"GitGuardian raises $12 million to find sensitive data hidden in online code | VentureBeat"
|
"https://venturebeat.com/2019/12/04/gitguardian-raises-12-million-to-find-sensitive-data-hidden-in-online-code"
|
"GitGuardian raises $12 million to find sensitive data hidden in online code
GitGuardian founders Jérémy Thomas and Eric Fourrier
GitGuardian , a cybersecurity platform that helps companies detect sensitive data hidden in public and private code repositories, has raised $12 million in a series A round of funding led by London-based Balderton Capital, with participation from GitHub cofounder Scott Chacon and Docker cofounder Solomon Hykes.
Founded out of Paris in 2017, GitGuardian scans all GitHub public activity in real time to identify private data, such as database login credentials, API keys, cryptographic keys, and more. The company works with over 200 API providers, spanning payment systems, cloud services, messaging apps, crypto wallets, and more to ensure that any private information that does leak into the public domain is swiftly identified and the company is notified. The French startup said it has sent out more than 400,000 alerts since its inception.
Secret sauce

The type of private data GitGuardian is looking to protect is what is known in the industry as “secrets” and includes anything that can be used by unauthorized third parties to access a system (e.g. a cloud or database) — including passwords and API tokens.
Behind the scenes, GitGuardian links GitHub-registered developers with their companies and scans content covering 2.5 million code commits each day in an effort to find usernames and passwords, database connection string keys, SSL certificates, and more. The company said it uses “sophisticated pattern matching” and machine learning techniques, with its algorithm constantly learning through a developer “feedback loop.” In effect, GitGuardian’s clients help improve the technology by telling it whether an alert was valid or not.
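The pattern-matching half of that pipeline can be sketched with a couple of regular expressions. This is a deliberately simplified illustration of how a secret scanner flags suspicious strings in commit text; the patterns and function names are my own, not GitGuardian’s, and the AWS key below is Amazon’s published example key:

```python
import re

# Minimal regex-based secret detection of the kind a scanner layers
# machine learning on top of. Patterns here are simplified illustrations.
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "generic_password": re.compile(r"password\s*=\s*['\"][^'\"]+['\"]", re.I),
}

def scan_commit(diff_text):
    """Return (pattern_name, matched_text) pairs found in commit text."""
    findings = []
    for name, pattern in SECRET_PATTERNS.items():
        for match in pattern.finditer(diff_text):
            findings.append((name, match.group()))
    return findings

commit = 'aws_key = "AKIAIOSFODNN7EXAMPLE"\npassword = "hunter2"\n'
print(scan_commit(commit))
```

In production, raw regex hits like these are noisy, which is why the feedback loop described above — developers confirming or dismissing alerts — matters for reducing false positives.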
Although monitoring public GitHub repositories is a major facet of GitGuardian’s offering, it also works to identify sensitive information that is inadvertently disseminated through internal systems, including private code repositories and message apps. Even companies that are careful to keep their code under lock and key can come unstuck if too many people inside an organization have access to it — the more people with access to “secrets,” the more avenues there are for data to become compromised. This is what is commonly referred to as “secret sprawl.” “Secrets that are made too widely accessible in an organization [are] a huge issue for security professionals,” GitGuardian cofounder and CEO Jérémy Thomas told VentureBeat. “In the case of source code, if there are secrets in it, it takes only one developer account to be compromised for all the secrets they had access to to be compromised as well.”

Above: GitGuardian dashboard

Breaches

Back in 2017, Uber announced a major data breach that exposed the personal data of millions of riders and drivers. The company later confessed it wasn’t using multifactor authentication on its GitHub account — meaning anyone who encountered the login credentials could access its private repositories unhindered — and it was through the GitHub repository that the intruders managed to find access keys for Uber’s AWS data store, where its user data was kept.
In a Federal Trade Commission (FTC) filing from 2018, Uber revealed how the intruders managed to gain access to the private GitHub repository in the first place. Uber had granted its engineers access to the private repositories via their own personal GitHub accounts, which had weak security. The filing noted: Uber granted its engineers access to Uber’s GitHub repositories through engineers’ individual GitHub accounts, which engineers generally accessed through personal email addresses. Uber did not have a policy prohibiting engineers from reusing credentials, and did not require engineers to enable multi-factor authentication when accessing Uber’s GitHub repositories. The intruders who committed the 2016 breach said that they accessed Uber’s GitHub page using passwords that were previously exposed in other large data breaches, whereupon they discovered the AWS access key they used to access and download files from Uber’s Amazon S3 Datastore.
As a result, the intruders accessed 16 files that contained unencrypted personal data, including nearly 26 million names and email addresses, 22 million names and mobile phone numbers, and 607,000 names and driver’s license numbers.
Poor password hygiene aside, Uber’s AWS access key should probably not have been anywhere near a GitHub repository — private or otherwise — in the first place. This kind of breach highlights what’s at stake for companies. Compromising customer data and losing trust is a major issue, but poor security can also lead to regulatory and legal tussles.
“Hardcoding secrets in source code or other private site[s] that are not specifically meant for secret storage breaks various compliance rules and industry standards and best practices,” Thomas noted.
Uber, which initially covered up its gargantuan leak, was widely viewed to have violated numerous data security and breach reporting laws, and it eventually settled the case by paying a $148 million fine. This is the type of scenario GitGuardian said it can help avert, as it claims it can detect and send an alert to the developer and security team within four seconds of a secret leaking into code repositories.
“Currently, every company with software development activities is concerned about secrets spreading within the organisation, and in the worst case, to the public space,” Thomas said. “As a company with so much sensitive information at hand, we have built a culture of unconditional secrecy at our core.” GitGuardian said it has already helped more than 100 of the Fortune 500 companies, government organizations, and thousands of individual developers. And with another $12 million in the bank, it plans to expand its customer base in the U.S., where 75% of its current clients are based.
Some 40 million developers use GitHub, and with more than 100 million repositories , the Microsoft-owned code collaboration platform is fertile ground for any company looking to train algorithms. A few months back, Swiss startup DeepCode raised $4 million for a system that learns from GitHub project data to give developers automated code reviews. GitGuardian is adopting a similar philosophy in terms of how it’s using GitHub to train algorithms at scale so companies can further automate their cybersecurity setup.
“Rather than encumber technology organisations with limiting compliance procedures, GitGuardian allows the modern enterprise to develop code quickly and how it wants to, but with automated visibility and protection over how data, credentials, and other sensitive information is used, moved, and shared,” said Balderton Capital partner Suranga Chandratillake.
"
|
15,761 | 2,020 |
"GitHub launches code scanning to unearth vulnerabilities early | VentureBeat"
|
"https://venturebeat.com/2020/09/30/github-launches-code-scanning-to-unearth-vulnerabilities-early"
|
"GitHub launches code scanning to unearth vulnerabilities early
A GitHub logo seen displayed on a smartphone.
GitHub is officially launching a new code-scanning tool today, designed to help developers identify vulnerabilities in their code before it’s deployed to the public.
The new feature is the result of an acquisition last year when GitHub snapped up San Francisco-based code analysis platform Semmle ; the Microsoft-owned code-hosting platform revealed at the time that it would make Semmle’s CodeQL analysis engine available natively across all open source and enterprise repositories.
After several months in beta , code scanning is now rolling out to all developers.
Breaches

It’s estimated that some 60% of security breaches involve unpatched vulnerabilities. Moreover, 99% of all software projects are believed to contain at least one open source component, meaning that dodgy code can have a significant knock-on impact for many companies.
Typically, fixing vulnerabilities requires a researcher to first find the vulnerability and disclose it to the repository maintainer, who fixes the issue and alerts the community, who then update their own projects to the fixed version. In a perfect world, this process would take minutes to complete, but in reality it takes much longer than that — it first requires someone to find the vulnerability, either by manually inspecting code or through pentesting , which can take months. And then comes the process of finding and notifying the maintainer and waiting for them to roll out a fix.
GitHub’s new code-scanning functionality is a static application security testing (SAST) tool that works by transforming code into a queryable format, then looking for vulnerability patterns. It automatically identifies vulnerabilities and errors in code changes in real time, flagging them to the developer before the code goes anywhere near production.
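The “queryable format plus vulnerability patterns” idea can be demonstrated in miniature with Python’s standard-library `ast` module. This toy pass flags calls to `eval()`, a common code-injection risk; real engines such as CodeQL operate on a far richer relational representation of the code, so this is an analogy, not how CodeQL itself works:

```python
import ast

# Toy static-analysis pass: parse source into a queryable form (the AST)
# and search it for a vulnerability pattern -- here, any call to eval().
# The source is only parsed, never executed.

def find_eval_calls(source):
    """Return line numbers where eval() is called in the given source."""
    tree = ast.parse(source)
    return [
        node.lineno
        for node in ast.walk(tree)
        if isinstance(node, ast.Call)
        and isinstance(node.func, ast.Name)
        and node.func.id == "eval"
    ]

snippet = "x = input()\nresult = eval(x)\nprint(result)\n"
print(find_eval_calls(snippet))  # flags the eval() call on line 2
```

Running such a pass on every code change, as GitHub’s scanner does, is what lets problems surface before a pull request merges.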
Above: GitHub: Vulnerability found

Fixes

Data suggests that only 15% of vulnerabilities are fixed one week after discovery, a figure that rises to nearly 30% within a month and 45% after three months. According to GitHub, during its beta phase it scanned more than 12,000 repositories more than 1 million times, unearthing 20,000 security issues in the process. Crucially, the company said that developers and maintainers fixed 72% of these code errors within 30 days.
There are other third-party tools out there already designed to help developers find faults in their code. London-based Snyk, which recently raised $200 million at a $2.6 billion valuation, targets developers with an AI-powered platform that helps them identify and fix flaws in their open source code.
This helps to highlight how automation is playing an increasingly big role in not only scaling security operations, but also plugging the cybersecurity skills gap — GitHub’s new code-scanning smarts go some way toward freeing up security researchers to focus on other mission-critical work. Many vulnerabilities share common attributes at their roots, and GitHub now promises to find all variations of these errors automatically, enabling security researchers to hunt for entirely new classes of vulnerabilities. Moreover, it does so as a native toolset baked directly into GitHub.
GitHub’s code scanning hits general availability today, and it is free to use for all public repositories. Private repositories can gain access to the feature through a GitHub Enterprise subscription.
"
|
15,762 | 2,021 |
"Doppler expands secrets management to the enterprise with $6.5M in funding | VentureBeat"
|
"https://venturebeat.com/2021/03/04/doppler-expands-secrets-management-to-the-enterprise-with-6-5m-in-funding"
|
"Doppler expands secrets management to the enterprise with $6.5M in funding

Doppler's command line interface (CLI)
Doppler , a universal secrets management platform for developers, has raised $6.5 million in a round of funding led by Alphabet’s venture capital arm GV.
Alongside the funding, Doppler introduced a tool called Doppler Share , which is designed to enable teams to share one-off secrets such as lockbox codes or Wi-Fi passcodes through an expiring link. Perhaps more interestingly, Doppler is also debuting a handful of enterprise features as it prepares to target bigger businesses.
“Secrets” is industry parlance for digital authentication credentials such as passwords, keys, and API tokens that protect access to applications, services, and other private areas of a company’s IT infrastructure. “Secrets management” refers to the tools and technologies companies such as Doppler offer to help businesses manage these sensitive areas.
“If you’re a database admin, would you leave a post-it note on your desk with the username and password to a database scribbled on it, even if you thought only your fellow admins would see it?” Doppler founder and CEO Brian Vallelunga asked. “I’m pretty sure you’d opt for a safe and secure place — one that only authorized users could access. This is exactly what Doppler provides for modern applications.”

Doppler’s dashboard effectively groups secrets together by application, with the main Doppler command-line interface (CLI) serving up programmatic access to secrets when the application starts up. “No sensitive information ever touches the server,” Vallelunga added.
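Vallelunga's point maps onto a common pattern: a launcher such as `doppler run -- python app.py` injects secrets into the child process's environment, and the application reads them at startup instead of hardcoding them. A minimal sketch of the consuming side, assuming a hypothetical `DATABASE_URL` secret (the variable name is my example, not something Doppler mandates):

```python
import os
import sys

def load_database_url() -> str:
    """Read a secret injected into the environment by a secrets-manager
    launcher such as `doppler run -- python app.py`. DATABASE_URL is a
    hypothetical example name."""
    url = os.environ.get("DATABASE_URL")
    if url is None:
        # Fail fast: a missing secret should stop startup rather than
        # surface later as a confusing connection error.
        sys.exit("DATABASE_URL is not set; launch via your secrets manager")
    return url
```

Because the secret only ever exists in the process environment, nothing sensitive lands in source control or on disk.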
Above: Doppler dashboard

Enterprising

Founded out of San Francisco in 2018, Doppler had previously raised $2.3 million via a seed round of funding from notable investors, including Sequoia Capital, which also participated in this latest round, and Peter Thiel.
It’s still early days for Doppler in terms of working with enterprises, but the company has been developing a bunch of new features to help it do just that. “We are in talks with a number of enterprises,” Vallelunga explained. “Very recently, we started rolling out our enterprise-focused features.”

Among the company’s new “enterprise-grade” features are token management, secrets referencing, cloud provider integrations, and SCIM (system for cross-domain identity management) support.
There are several secrets management tools already on the market, such as Amazon’s AWS Secrets Manager and Vault from HashiCorp, a startup now valued at more than $5 billion.
However, Doppler wants to cater to all developer environments equally, from local development to production, spanning every stack and project.
“It’s designed for developers while meeting the needs of security and DevOps (developer operations), with integrations for every major platform, including other secrets managers,” Vallelunga said. “Customers find us when they realize they’ve been managing secrets insecurely, get burned by secrets sprawl, hit limitations with their own home-grown solutions, or are looking for a more frictionless alternative to their current secrets manager.”

Moreover, Doppler is going all-in on integrations, with users able to set up direct integrations with most of the major cloud platforms, such as Heroku, Vercel, Netlify, and GitHub Actions, not to mention other secrets managers, such as AWS Secrets Manager and Azure Key Vault.
“This is where Doppler excels,” Vallelunga said. “We absorb as much of the complexity around supplying app config and secrets as possible. We want our customers to focus on shipping new features and improvements.” This is the crux of Doppler’s promise to developers: Concentrate more on what you do best (coding) and less on mundane secondary tasks, such as managing secrets.
"
|
15,763 | 2,021 |
"GitGuardian data reveals 20% rise in 'secrets' hidden in public GitHub repos | VentureBeat"
|
"https://venturebeat.com/2021/03/09/gitguardian-data-reveals-20-rise-in-secrets-hidden-in-public-github-repos"
|
"GitGuardian data reveals 20% rise in ‘secrets’ hidden in public GitHub repos

Programmer working at 2 screens.
GitGuardian , a cybersecurity platform that helps companies find sensitive data hidden in code, has revealed that it found more than 2 million “secrets” in public GitHub repositories in 2020, a 20% increase on the previous year.
Founded out of Paris in 2017, GitGuardian serves to “prevent hackers from using GitHub as a backdoor to your business,” as the company puts it, by scanning public GitHub repositories in real time to identify any private data bad actors could use to access their systems (e.g. a cloud or database), such as API or cryptographic keys, login credentials, and more.
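Detection at this scale typically leans on provider-specific key formats. The sketch below uses two widely documented shapes (AWS access key IDs start with `AKIA`; classic GitHub personal access tokens start with `ghp_`), but it is only an illustration; production scanners combine hundreds of detectors with entropy analysis and validation to cut false positives.

```python
import re

# Two well-known credential formats; real scanners pair patterns like
# these with entropy checks and post-validation against the provider.
DETECTORS = {
    "aws-access-key-id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "github-token": re.compile(r"\bghp_[A-Za-z0-9]{36}\b"),
}

def find_secrets(commit_text: str):
    """Scan the text of a commit and return the detector names that fired."""
    return sorted(name for name, rx in DETECTORS.items() if rx.search(commit_text))

leak = "aws_key = 'AKIAIOSFODNN7EXAMPLE'"  # AWS's documented example key ID
print(find_secrets(leak))  # ['aws-access-key-id']
```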
GitGuardian’s inaugural State of Secrets Sprawl on GitHub report is based on its constant monitoring of every commit pushed to a public GitHub repository. Comparing data from last year against the corresponding period in 2019 showed that the number of secrets detected had grown by a fifth.
The “secrets sprawl” GitGuardian’s report refers to is essentially authentication credentials stored in so many different places that they become hard to track.
“We think the growth is due to two factors — the increase of GitHub usage and the move toward cloud architectures and componentization,” GitGuardian CEO Jeremy Thomas told VentureBeat. “These two trends generate more digital authentication credentials.”

Evidence suggests there is some truth to these assertions. With regards to GitHub usage, GitHub’s own data indicated that people collaborated more last year as open source project contributions jumped by over 40% in the months following lockdown. And a Red Hat study released last week found that enterprises upped their open source game in 2020.
More and more companies are shifting from monolithic on-premises software to the cloud and a microservices-based software architecture. But while applications built on smaller, function-based components that connect via APIs may be easier to develop and maintain, the culmination of all this digital transformation is that developers have a growing amount of sensitive data to manage.
GitGuardian, which raised a $12 million tranche of funding in 2019 from backers such as GitHub cofounder Scott Chacon, is one of a number of players operating in the secrets detection and management space. A few weeks back, Israeli startup Spectral exited stealth with $6.2 million to find costly security mistakes buried in code, while last week Doppler expanded its cloud-hosted secrets manager to the enterprise with $6.5 million in funding.
Human condition

Platforms like GitGuardian are looking to fix problems stemming from human error, which is likely to increase as a company hires more developers. Error rates are also compounded by shortened release cycles.
But high-profile data breaches have put companies under increasing pressure to shore up their defenses. A few years back, Uber revealed a major breach that exposed the personal data of millions of users. Several security shortcomings were in play, but the root cause was that the hackers found an AWS access key in a GitHub repository belonging to an Uber developer. The hackers then used that key to access files from Uber’s Amazon S3 Datastore. This incident illustrates how important it is to safeguard secrets.
GitGuardian’s report shows that 85% of the 2 million secrets it found were in developers’ personal repositories, which fall outside of corporate control. “What’s surprising is that a worrying number of these secrets leaked on developers’ personal public repositories are corporate secrets, not personal secrets,” Thomas added.
This means a company’s internal systems could be vulnerable due to sensitive data hidden in current or former developers’ repositories. But it also shows how the problem can impact companies, regardless of whether they work on open source projects or not, as they have little visibility or control over how their developers use GitHub.
“Organizations can’t control what developers do with their personal GitHub projects,” Thomas explained. “GitHub is a fantastic platform for developers to collaborate together, learn new skills, and showcase their work. Developers typically have one GitHub account that they use both for personal and professional purposes, sometimes mixing the repositories. Developers use GitHub as their LinkedIn — that’s why they need one account that is really tied to them and contains their work.”

Digging further down into GitGuardian’s report shows that 27.6% of secrets found were access keys to Google accounts. Other common system secrets found offered access to development tools (15.9%), data storage (15.4%), messaging tools (11.1%), and cloud providers (8.4%). In terms of the top file extensions that contained secrets, Python accounted for 27.7%, followed by JavaScript (18.7%), environment variables files (9.6%), and JSON (7.5%).
GitGuardian’s best practice suggestions to avoid such scenarios include restricting API access and permissions, encouraging developers to not share secrets unencrypted in messaging systems such as Slack, and never storing unencrypted secrets in .git repositories.
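That last suggestion can be enforced client-side with a Git pre-commit hook that scans the staged diff and aborts the commit when an added line looks like a secret. This is a minimal illustrative sketch, not GitGuardian's product; the two patterns are examples only.

```python
import re
import subprocess
import sys

# Example patterns only: AWS access key IDs and classic GitHub tokens.
SECRET_PATTERN = re.compile(r"\bAKIA[0-9A-Z]{16}\b|\bghp_[A-Za-z0-9]{36}\b")

def lines_with_secrets(diff_text: str):
    """Return the added lines (prefixed '+') that look like secrets."""
    return [
        line for line in diff_text.splitlines()
        if line.startswith("+") and SECRET_PATTERN.search(line)
    ]

def main() -> int:
    # Inspect only what is about to be committed.
    diff = subprocess.run(
        ["git", "diff", "--cached"], capture_output=True, text=True, check=True
    ).stdout
    hits = lines_with_secrets(diff)
    for line in hits:
        print(f"possible secret: {line}", file=sys.stderr)
    return 1 if hits else 0  # a non-zero exit code aborts the commit
```

Saved as an executable `.git/hooks/pre-commit` (with a `#!/usr/bin/env python3` shebang and a trailing `sys.exit(main())`), the hook blocks the commit before the secret ever reaches a remote.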
"
|
15,764 | 2,021 |
"GitHub boosts developer productivity with new mobile notification controls | VentureBeat"
|
"https://venturebeat.com/2021/03/30/github-boosts-developer-productivity-with-new-mobile-notification-controls"
|
"GitHub boosts developer productivity with new mobile notification controls

GitHub: Granular push notifications
GitHub is rolling out a handful of new updates to its mobile and desktop apps, including “enhanced” push notifications with more granular controls and the ability to pause them altogether.
The Microsoft-owned code-hosting platform said the update is part of its push to support the burgeoning hybrid and remote workforce, which relies on asynchronous communications. Nicole Forsgren, VP of research and strategy at GitHub, recently wrote about developer productivity in a co-authored article published in ACM Queue.
The paper notes that ensuring efficient software development and the well-being of developers has “never been more important,” with the rapid shift to remote work creating a potential disconnect between developers and their usual workspaces and teams.
“This forced disruption and the future transition to hybrid remote/colocated work expedites the need to understand developer productivity and well-being, with wide agreement that doing so in an efficient and fair way is critical,” the coauthors wrote.
Going mobile

GitHub launched its mobile app for Android and iOS a year ago, but at the time it only supported push notifications for messages that include a direct mention of the developer — and with good reason.
“Push notifications [were] one of the very first features we added via a cross-team hack week with the GitHub notifications team,” Ryan Nystrom, senior director of engineering at GitHub, told VentureBeat. “From that work, we created early versions of pushes for any type of activity, but we knew that without controls this could overwhelm users.

“Notification fatigue is real, so we decided to start at a very high signal with lower volume through the initial direct mentions notifications.”

In other words, developers could end up drowning under a deluge of alerts, particularly when they’re supposed to be offline. And so over the past year, GitHub has been taking on feedback from developers to figure out what additional notifications and controls could help them manage their time and productivity. With this latest update, developers can toggle push notifications on and off not only for when they’ve been directly mentioned, but when they’ve been asked to review a pull request, assigned a task, or asked to approve a deployment for a protected branch.
Above: GitHub: Push notification settings

This is important because a manager or senior developer might need to approve key stages in a project when they’re on the move or otherwise not at their desktop.
“One of the core principles of the mobile app is that we’re helping unblock people,” Nystrom said. “Deploy approvals are a new flow for GitHub — for developers using GitHub mobile, we knew immediately it’d be valuable to get notified when your review is requested so you can unblock a deploy without the need to be at your computer.”

Above: GitHub: New push notification controls

Related to this, GitHub for mobile also now lets developers set custom working hours, meaning users can specify when push notifications will be sent to their phone.
Above: GitHub: Custom working hours

This fits a push across the technology spectrum to foster a healthier work-life balance — Google, for example, rolled out “focus mode” in 2019 to help users minimize and control alerts on their mobile devices.
Elsewhere, the GitHub mobile app now lets developers view releases natively inside the app, rather than linking the user through to a web view. “This was also one of our most-requested features,” Nystrom added.
Above: GitHub for mobile now shows release notes natively

Along similar lines, GitHub users can also now customize their repository “watch” settings from mobile. Much as it works on the browser version, they can now opt in to a very specific subset of actions they’d like notifications for in their inbox, such as issues, pull requests, releases, and discussions.
Over in the desktop realm, GitHub launched version 2.7 of its desktop app that makes it easier for developers to copy individual or multiple commits between branches (known as “cherry-picking”) using a drag-and-drop tool.
Above: GitHub desktop app: “Cherry-picking” by drag-and-drop

According to GitHub staff engineering manager Billy Griffin, developers would previously have to go to the command line and look up the Git cherry-pick documents to remember the correct syntax to copy the commits, but the drag-and-drop option makes this process more visual and intuitive.
"
|
15,765 | 2,018 |
"Digital detox: The rise of the stripped-down secondary phone | VentureBeat"
|
"https://venturebeat.com/2018/10/19/digital-detox-the-rise-of-the-stripped-down-secondary-phone"
|
"Digital detox: The rise of the stripped-down secondary phone

Docomo: Card Keitai KY-01L
This was something of an odd week in the mobile phone world.
On Tuesday, Chinese smartphone giant Huawei unveiled what is arguably one of the most feature-rich flagship Android devices yet.
The day before, former PDA darling Palm emerged from oblivion with a new device — after Chinese electronics company TCL, which also operates the BlackBerry brand in mobile phones, acquired the Palm brand from HP back in 2014.
Above: The Palm is a phone for your phone.
The Palm is a Verizon-only device that is pitched as a secondary phone you can seamlessly use with your regular phone number, but in situations that may not require the full functionality of a large-screened premium phone.
The Palm is small — about the size of a credit card — and although it is capable of running the usual array of Android apps, it’s not designed with that in mind. Its marketing spiel says: Never miss out on the world around you. Life Mode is a unique Palm feature that lets you stay connected while minimizing digital distractions. It puts you in control of what gets your attention so you can stay connected and present in the moment.
“A phone for your phone,” is another way of putting it, which highlights how bonkers the idea truly is.
Yesterday, another companion phone emerged on the market , this time from Japanese telecom giant Docomo. The Card Keitai KY-01L is legitimately designed with digital detoxers in mind and features an e-paper touchscreen and functionality limited to calls, text, and basic internet. It doesn’t work with mobile apps and has no camera.
Above: Card Keitai KY-01L

A quick glance across the mobile phone spectrum identifies a marked rise in such devices.
Last month, Swiss firm Punkt Tronics unveiled the MP02 , the first non-BlackBerry phone to use BlackBerry’s security smarts.
Above: Punkt: the MP02

The pitch? The MP02 is designed for people who want to unplug from the world of apps and constant notifications and simply use a well-designed phone for ultra-secure calls and texts.
Earlier this year, New York-based Light successfully crowdfunded its second phone, the Light 2 , which again is all about tuning out the digital world.
Above: The Light 2

So what, exactly, is pushing this trend? A cynic might say it’s a symptom of an oversaturated market, with manufacturers seeking additional inroads to your pocket — “You’ve got a smartphone? Cool — you need this more basic phone too!” But digging deeper, it seems the appearance of these secondary “stripped down” phones is being driven by growing concerns that we spend too much time immersed in digital worlds.
Mental health

Smartphones, and the “always on” connectivity they enable via myriad services, such as Facebook, Twitter, and Instagram, are increasingly being blamed for a decline in mental health. More specifically, smartphone use has been directly associated with a rise in depression and teen suicides.
Such reports are not lost on technology companies. The two big players in the smartphone operating system world, Google and Apple, have recently integrated digital well-being features into Android and iOS, respectively, encouraging users to analyze the time spent on their phones and consider cutting back.
Fearing a technology backlash, both Facebook and its Instagram subsidiary have added tracking tools to help you limit your time on social media.
But these features don’t really get at the root of the problem, and they are easy to ignore even when they are available. What some people really want is a device with inherent functional limitations.
None of the “minimalist” devices I mention above stand much chance of creating more than a ripple. The Palm, for example, doesn’t really work as a concept, because it still offers almost complete connectivity to all the usual apps. Sure, it’s more portable and discreet, but you will probably end up just getting frustrated by the small screen and fiddly features.
Docomo, Punkt, and Light are certainly on the right track — they are building funky little phones that will appeal to some, but the problem is they are just too basic.
Finding the perfect secondary phone

Last year, I invested in a Nokia 3310, a sort of modern-day version of the original Nokia 3310, but with a slicker look and 3G connectivity.
The plan wasn’t to replace my main phone, but to use it as a secondary device and perhaps wean myself off constant connectivity. I was looking for something I could use to keep in contact, but that wouldn’t bombard me with messages and notifications or tease me with the latest highlights reel from the weekend’s soccer.
The Nokia 3310 seemed perfect.
Above: HMD Global: Nokia 3310 survey

I have used it on occasion, but it’s ultimately no good. You see, it doesn’t have WhatsApp — the Facebook-owned service is pretty much the focal point of many social interactions today, making it impossible to live without for long.
Also, the Nokia 3310 doesn’t have mapping software or a decent camera.
And herein lies the problem facing any company seeking a sizable market for secondary mobile phones.
Amidst all the hullabaloo warning that “smartphones are bad for your health,” it’s easy to forget just how integral they are to our everyday lives. Nobody owns dedicated mp3 players these days, nobody lugs an A-to-Z city guide or bus timetable around with them, and nobody carries a dedicated camera. And, crucially, nobody really uses SMS all that much anymore.
Getting people to give up their main premium phone is a tall order, but if someone can design a minimalist secondary phone that offers some of the basic assets of a smartphone, that device stands a chance of finding a decent market share.
So what does the perfect secondary phone look like? The answer is naturally subjective, but I can lay out some general guidelines.
Concept companion phone

Given that the main app-makers generally only cater to Android and iOS, the concept phone I have in mind would probably have to be Android. It shouldn’t be able to run an app store, such as Google Play, so it would probably have to be a forked version of Android that allows more customizations (though there may be more flexibility with this in Europe).
At an absolute minimum, the ideal phone should come preloaded with WhatsApp, or other messaging apps depending on the market. Ideally, it would also run a maps and navigation app, such as Here or Google Maps, if licensing permits. There is probably scope to open things up to a ride-hailing service such as Uber, while Spotify would also be welcomed by many.
It should also have a decent camera and a built-in music player for those, like me, who still have vast mp3 collections.
Arguments could be made for other apps, but there is a danger of losing sight of the original goal: to tune out while being able to access the basic utilities we have come to rely on from our smartphones. I do rely on Gmail, Google Search, Slack, Citymapper, Netflix, Words With Friends (LOLZ), and others, but I can certainly live without them for a weekend.
While Light’s second phone currently offers very basic functionality, the company has previously said it is considering introducing maps, ride-hailing, music playlists, weather, and other utilities — though it has stated emphatically that it will never support email, social media, or news.
Above: Light Phone 2: Got, might get, not getting

It’s also worth noting here that WhatsApp recently landed on the KaiOS-powered JioPhone, an advanced 4G feature phone. But it is only available in India, and it probably includes too many other internet-related distractions to really fit the bill.
For this concept to really work, it would ideally offer a way to use your main phone number on the secondary device, as Palm enables. Requiring a consumer to use two phone numbers adds unnecessary friction, and will likely mean they will end up not using the secondary phone nearly as much. But this introduces additional problems, insofar as WhatsApp can really only be used on one device at a time with the same number. So maybe the ideal device I’m looking for would have to be my primary device, with a more feature-rich smartphone serving as my secondary unit.
Or, if I’m being honest, this is perhaps where my whole idea starts to fall apart.
Given recent activity, one thing is clear, though: There is demand for stripped-down companion phones that don’t require users to permanently give up their main device, and manufacturers are starting to pay attention. But there’s still a great deal of room for iteration and innovation.
The phone cannot be too basic, but at the same time it can’t betray the fundamental reason for its existence. What the perfect companion phone looks like is up for debate, but whoever nails it could find a line of eager consumers ready and waiting.
"
|
15,766 | 2,019 |
"Android's Focus mode could be a game-changer for anyone struggling with constant distractions | VentureBeat"
|
"https://venturebeat.com/2019/12/05/androids-focus-mode-could-be-a-game-changer-for-anyone-struggling-with-constant-distractions"
|
"Android’s Focus mode could be a game-changer for anyone struggling with constant distractions

Android's new Focus Mode: How it works
Following an extended beta period, Google officially launched a new well-being feature for Android users yesterday, and it could prove a game-changer for anyone looking to switch off and create a healthier work-life balance.
Focus mode, as the feature is called, is part of Android’s built-in Digital Wellbeing toolset, which offers a number of features to help people manage their screen time more effectively, including app-limit timers and a “wind down” mode that encourages users to set a bedtime schedule. In a nutshell, Focus mode allows users to choose the most distracting apps on their device and “pause” them all with the toggle of a switch. For anyone who struggles to compartmentalize their app usage and is easily distracted by constant alerts, it holds the potential to be an incredibly powerful feature.
The need to switch off

A 2017 report suggested that nearly 60% of U.S. businesses allow their employees to use their personal devices for work purposes, while Samsung's 2018 The State of Enterprise Mobility research report found that only 17% of enterprises provided their workforce with corporate mobile phones and 31% relied entirely on a bring-your-own-device (BYOD) approach. The remaining 52% adopted a hybrid ethos, where some workers received corporate devices, depending on their seniority or role.
For most people, carrying around two devices with the exact same functionality doesn’t make a lot of sense, so it’s easy to see why BYOD has caught on. One notable consequence, however, is that it has become increasingly difficult for workers to switch off once they leave the office. When they open WhatsApp to let their spouse know they’re heading home, it’s all too tempting to reply to the work email that’s just popped up in the notification bar. Google has for a while offered a work profile feature for Android Enterprise users that allows them to easily deactivate all their work apps, but for smaller businesses or self-employed workers, this is likely not an option.
Moreover, amid growing concerns about the harmful effects of smartphone and social media addiction — which may have contributed to an increase in depression and loneliness, particularly in young people — giving users better tools to control their smartphone usage is a step in the right direction.
While it’s already possible to deactivate notifications for specific apps, the time it takes to do so for each one individually (and remember to turn them all back on) means most people don’t bother. This friction is precisely what Android’s Focus mode aims to solve.
Here’s a quick overview of how it looks in its current guise.
In focus

Focus mode is available on all Android devices that have Digital Wellbeing and parental control settings, which is apparently limited to Android 9 and Android 10. The feature can be activated through the main device settings menu by visiting Digital Wellbeing & parental controls >> Focus mode.
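Since availability hinges on the Android version, a quick way to confirm whether a device qualifies is to check the OS release number. The sketch below hardcodes a sample value so the comparison stands on its own; on a connected device you would read the value over adb instead (the commented line), an assumption rather than something the article describes:

```shell
# Hedged sketch: Digital Wellbeing's Focus mode requires Android 9 or 10.
# On a real device you would fetch the release number over adb, e.g.:
#   version=$(adb shell getprop ro.build.version.release)
# A sample value is hardcoded here so the check runs anywhere.
version=10

if [ "$version" -ge 9 ]; then
  echo "Focus mode supported (Android $version)"
else
  echo "Focus mode not available (Android $version)"
fi
```

The same comparison works for any integer major version, so it degrades gracefully on older releases.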
Additionally, you can create a shortcut in the Quick Settings menu, meaning all you need to do is drag down from the top and hit the Focus mode button to turn the feature on and off.
Above: Activating Android Focus mode

To configure Focus mode for your needs, you will need to choose which apps you consider to be the most distracting. A typical use case might be workers who want to pause notifications from Gmail, Slack, and Twitter as soon as they leave the office and then reactivate them the next morning.
Above: Configuring Focus mode on Android

However, it is easy to deactivate Focus mode entirely by hitting Turn Off Now, or temporarily by selecting Take a Break — the latter option allows you to turn off Focus mode for 5, 15, or 30 minutes. This is actually a new feature that was introduced for the full launch based on user feedback from the beta phase.
Above: Focus mode: Take a break

Any app that is actively muted will now be grayed out on your device, as you can see here with Gmail and Slack.
Above: Focus mode grays out apps that you've chosen

If you try to open one of the silenced apps, you will see the message below. You'll notice that you won't have to go back to the main settings to temporarily deactivate Focus mode, as the Take a Break option is baked directly into the notification.
Above: Focus mode is on

Another new feature that has been introduced for the full launch of Focus mode is scheduling, which allows users to automatically activate or deactivate Focus mode for set times and days.
For example, if you want to switch off all your personal notifications during work, you can set things up so that you don’t get any WhatsApp, Instagram, Facebook, or Strava alerts between 9 a.m. and 5 p.m. on weekdays.
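Under the hood, a schedule like that amounts to a simple day-and-window check. The sketch below mirrors the 9 a.m. to 5 p.m. weekday example with hardcoded sample values — it is an illustration of the logic, not how Android actually implements it:

```shell
# Hedged sketch of the weekday 9 a.m.-5 p.m. schedule described above.
# day and hour are hardcoded samples; the real scheduler reads the device clock.
day=Tue
hour=11

case "$day" in
  Sat|Sun) in_schedule=no ;;
  *)
    if [ "$hour" -ge 9 ] && [ "$hour" -lt 17 ]; then
      in_schedule=yes
    else
      in_schedule=no
    fi
    ;;
esac

echo "Focus mode active: $in_schedule"
```

Tuesday at 11 a.m. falls inside the window, so the sketch reports the mode as active; a weekend day short-circuits to inactive regardless of the hour.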
Above: Focus mode scheduling

Given the handful of new features Google added to Focus mode for its full public debut yesterday, it's clear the company is open to feedback that might improve functionality. And after I played around with Focus mode over the past day, it became clear the tool could become even better with a few enhancements.
Different modes

Automation is very much the name of the game in 2019, which is why the scheduling feature is particularly welcome. However, it could be better.
For example, you can set daily schedules (e.g. 9 a.m. to 5 p.m. on Monday), but you can’t set a schedule that spans several days. So if you want to deactivate all your work apps from 5 p.m. on a Friday until 9 a.m. on a Monday, you’re out of luck. You can work around this by creating a weekend schedule that starts at midnight on the Saturday morning and finishes at 11:59 p.m., and then apply that to the Sunday too, but it still doesn’t take into account the 5 p.m. to 11:59 p.m. window on Friday. Of course, you can manually turn Focus mode off if you remember to do so.
Above: Focus mode: Workaround for a weekend mode

Focus mode would really benefit from the ability to create different modes. You could custom-build a Weekend Mode stipulating the exact times and days it should be active or create multiple modes for various use cases. For example, you might want to create a Gym Mode that blocks all apps apart from Spotify and WhatsApp, or a Cycling Mode that blocks everything other than Google Maps and Strava.
Back in October, Google introduced a handful of experimental well-being apps, one of which was Morph — which provides some clues to how Focus mode could be improved in the future.
Morph is actually a launcher app that changes the look of a user’s homescreen. It helps them focus by only showing apps that are relevant to the activity they are doing at a specific moment. The user creates different modes based on times or locations and selects which apps are visible accordingly. Using GPS, the app can automatically switch to Exercise Mode when the user enters the gym, for example, or it can switch to Non-work Mode when they get home.
Above: Morph launcher

Kill the distraction

Maura Thomas, a renowned author, trainer, and speaker on productivity and work-life balance, said that in her work with some 2,000 organizations, distraction is the "single biggest barrier to meaningful, satisfying work."

"We are constantly shifting our attention [between] trying to complete assignments and projects, tracking and responding to endless communications, and managing interruptions from colleagues and the office bustle," she said. "Constant distraction leaves a trail of scattered thoughts and partly done tasks in its wake. It leaves us feeling overwhelmed and tired."

Focus mode may not offer a complete remedy to that problem, but it goes some way toward addressing the issue at an individual level. The rise of minimalist phones has been a notable trend over the past couple of years, but people rely on many of the services inherent to smartphones, which makes switching to stripped-down handsets something of a challenge. Features such as Focus mode could help address the underlying issue without requiring a major commitment.
It is worth noting that Focus mode is an entirely optional feature and very easy to override, which means anyone who is truly addicted to their smartphone might not benefit much from it. The decision to minimize the constant bombardment of alerts and notifications has to come from within — but having tools to do so more easily could be a start.