How long no-code product design trends could last | VentureBeat
https://venturebeat.com/programming-development/how-long-no-code-product-design-trends-could-last
How do you define a trend? Trends in the stock market indicate that finances are moving in a certain direction. Trends in the business world center on new strategies and practices that are potentially beneficial for a brand’s bottom line. Trends in fashion change so swiftly that they’re difficult to keep up with. In defining what makes a “trend,” there’s one thing that always seems to be part of the equation: Trends are limited in popularity, and eventually, they give way to the new.

No-code products themselves have been on the cusp of a new trend, part of the growing no-code movement that has fostered accessibility and user-friendliness for those of us who haven’t mastered coding. But will no-code products continue to be utilized, and for how long? Let’s take a look at some of the details, and the factors that play into the answers.
No-code product design

What exactly do I mean by no-code product design? Let’s take a simple example: An app that you might download on your phone is the product, and the way it was formed and intended to function constitutes the design. Using no-code applications allows a web designer who might not be a coding expert to take the individual elements of the app and put them together.

And no-code products have a much more extensive reach than simply designing apps, though that’s one of their most popular uses. Anything that creates a unique design from pre-built elements — online logo-maker software, for example — may be no-code. Websites, programs, software, blogs, ecommerce sites, social media platforms and more can fit the category.

Now, no-code design doesn’t automatically mean that the developer is free from all possibility of making design mistakes; a tool is only as good as its user. Products designed using no-code are still subject to updates, iterations and adjustments according to what is needed to preserve and enhance functionality. In the end, though, no-code is growing because it makes product development much more accessible and user-friendly.

Growth of the no-code movement

With more platforms depending on no-code availability, it isn’t difficult to see why there’s a growing general trend toward its use and development. And with more emphasis on no-code, we see more trends emerging that play into its further adaptation and usage. I already mentioned the use of no-code in app development. It’s also increasingly utilized in other aspects of the virtual world, and I can see it being incorporated extensively as the metaverse takes shape. One of the primary fields where I see no-code being most useful is branding — specifically, the development and launch of new brands.
It’s never been easier to put together an online presence that accurately represents the personality of your company, and as a serial entrepreneur, that cause is close to my heart.

How long no-code trends will last depends largely on the usability of the trends themselves. Since they’re all centered on the usefulness of no-code as a whole, I’d like to discuss them in that context. Here are some of the key factors that play into the longevity of no-code:

- Fast development
- Accessibility
- Budgeting
- Tools
- Ease of use

Fast development

One trend in no-code is its increased use in ecommerce and branding. This makes perfect sense, as it dovetails nicely with the current appetite for on-demand services and SaaS. Take an ecommerce site. Perhaps you’ve already launched your site with a number of products, but suddenly you have a new offering. No-code design means you can simply copy and paste — or drag and drop — the same elements previously established within your site, ensuring continuity and keeping the time from opportunity to availability as short as possible.

Accessibility

Not everyone can code. That’s the entire premise behind no-code — not everyone can code, and not everyone needs to, either. The rise of no-code has expanded what developers of all disciplines and experience levels can do. This in turn allows for greater freedom in product development, growth in a struggling economy, and new initiatives. It’s much more feasible to pursue crafting a product when a lack of coding knowledge doesn’t actively prohibit you from moving forward. No-code, in that sense, is the great leveler: Everyone with access to the internet can participate in the movement.

Budgeting

Coding can get expensive. This isn’t always because of the actual coding itself, but because the labor hours required can hit a development company’s budget right where it hurts. Building a site from scratch, for example, can cost $2,500 and up.
For most small business launches, a website is vital, but that amount of money simply isn’t there. Relying on no-code instead of from-scratch, bespoke coding can significantly reduce the finances required to get a product off the ground and into use.

The toolbox

It seems like every time I turn around, there’s a new tool that can fit handily into the no-code toolbox. From Webflow to Airtable, from Bubble to Coda, there’s a no-code platform to fit every need — and every design style. I applaud the variety of no-code tools, especially because in my experience as a serial entrepreneur, each product and each business venture requires a slightly different approach to ensure the best possible outcome. It may be a greater ability to tweak individual elements or a wide range of drag-and-drop components to choose from — it depends on the product in development. So no-code tools may come and go, but I believe we will continue to see them being developed as technology moves forward and makes no-code ever more adaptable.

We like things to be easy

Setting aside important considerations like time, budget, accessibility and everything else I’ve mentioned, one of the most telling factors in the lifespan of no-code product design trends is simply our own laziness. Maybe “laziness” isn’t the right word; after all, no-code developers are enthusiastically and energetically working at expanding their horizons. But for those of us who are using no-code to the full, at least a small part of it is fueled by our desire to keep things as simple and easy as possible. No-code methods do that for us. In a world that is increasingly complex, the option to simply drag and drop until we’ve created exactly what we need is like a wonderful dream.

No-code product design trends: The takeaway

The point of a trend is to provide a new and beneficial method to reach our goals. In that sense, the no-code movement itself is a trend that has stuck around in the sphere of tech and development.
But do all trends fade and die, replaced by something new? In the years since no-code began to gain traction, the beneficial aspects of the concept certainly haven’t faded. Micro-trends continue within the no-code movement, but no-code itself — in my opinion — isn’t going anywhere. Meanwhile, technology continues to advance, innovation always has a next step, and we will never stop searching for ways to better ourselves and our products.

No-code is here to stay, but I believe we’ll continue to develop more and more effective ways to utilize it in more arenas. No one can say exactly how long any particular no-code product design trend will last or what will come next. But with the advancements we’ve already seen, one thing is for sure: The future’s going to be exciting, and it may even be a whole lot easier to use.

Tarif Kahn is head of design at LogoDesign.Net.

© 2023 VentureBeat. All rights reserved.
Predictions for gaming in 2023 | The DeanBeat | VentureBeat
https://venturebeat.com/games/predictions-for-gaming-in-2023-the-deanbeat
Above: This guy I met at The Game Awards 2022 is helping me with predictions.

Predictions are as difficult and foolish to make as ever. The year 2022 had a different rhythm than we expected after two solid years of growth for gaming in 2020 and 2021, and it was one of the most unpredictable years in gaming history. As I learned during the pandemic, I now think about predictions with both a sense of hope and a sense of dread. The pandemic threw off our ability to predict what will happen in life, and it did the same in the game industry.

Game companies had a record year in 2020, and I wondered if it was a one-time bump thanks to the coronavirus forcing lockdowns. People played games to survive, repair their social lives, and distract themselves. And yet, while it was hard to top 2020, the game industry grew in 2021, according to market researcher Newzoo. But for the first time since it began making predictions, Newzoo now says that the game industry will shrink 4.3% in 2022 to $184 billion, thanks to weakness in mobile games. The invasion of Ukraine caused havoc in the markets, followed by a crypto crash, high inflation, and global economic weakness. All of that took its toll. Some of this is easy to explain in hindsight.
The consoles were in short supply. Big games were delayed. The global economic downturn and privacy initiatives hurt mobile games, while VR gained ground. And people started going back to other pursuits as they regained the ability to travel and go out in public as the pandemic entered a new stage.

And while we saw a huge surge in game deals in 2021, we saw even more activity in 2022, with Drake Star Partners reporting that game deals hit $123 billion in value in the first nine months of the year — driven by the pending $68.7 billion acquisition of Activision Blizzard by Microsoft — compared to $71 billion in the first three quarters of 2021.

As I said last year, I know that our compass, informed by the patterns of the past, is broken. But I’ll wager we can expect games to continue to outgrow other forms of entertainment. I expect that trends such as Hollywood transmedia, the metaverse, Web3, cloud gaming, esports, and mixed reality will come along to reinvigorate the game industry — as will great new games in the core areas of the PC, console and mobile game industries.

For the usual comparison and embarrassment, here are my predictions from 2022, 2021, 2020, 2019, 2018, 2017, 2016, 2015, 2014, 2013, and 2012. At the bottom of the story, I’ve also given myself grades for the predictions I made a year ago for 2022. I gave myself seven “A” grades out of 11 predictions, so I suppose it wasn’t a bad forecast.

I’ve been very publicly calling for ideas on social media about my predictions, and I appreciate all of the followers and readers who have pitched ideas to me. Many of these ideas were adopted from my social media conversations. Some of these come from Chris Heatherly, Kate Edwards, Noah Falstein, Steve Peterson, Chris Akhavan, Edward Saatchi and others.
Thank you for your help, and Happy New Year! Stay safe out there. My predictions for gaming in 2023:

1) Console and PC games will get stronger, but mobile games may get weaker with the economy

I expect that growth will resume for the game industry as a whole. At some point during the year, PlayStation 5 and Xbox Series X/S consoles will regularly be in stock at most retailers. The PlayStation VR 2 will debut in February, and it will give a mid-life boost to the PS5. Nintendo may finally announce a replacement for the Switch, spurred by competition from similar devices such as the Steam Deck. While Sony pretty much owned the holiday season of 2022, Microsoft should come back with a strong slate in 2023 with titles like Starfield, Redfall and more. Electronic Arts has Star Wars Jedi: Survivor and Dead Space. Suicide Squad: Kill the Justice League has a date, as does Hogwarts Legacy. Diablo 4 is coming, and Nintendo has The Legend of Zelda: Tears of the Kingdom. Square Enix has Final Fantasy XVI. Sony has Spider-Man 2. And the BioShock creator has Judas. The list goes on. As long as gamers can get their hands on new consoles, this list should drive demand.

I’m not so confident about mobile gaming, which has some cool titles like Warzone Mobile coming. It is vast and unpredictable, but it is driven by free-to-play gamers who may be more sensitive to economic doldrums and less hardcore than PC and console fans.

2) The metaverse will not prove its worth in 2023, but it will continue to inspire

The metaverse is going to take a long time to develop. If it is to meet our imaginations of delivering something close to sci-fi books and movies — something akin to the Star Trek Holodeck — it’s going to take time. We will start laying the foundations for the metaverse in 2023 with the launch of 10G cable networks and real-time interaction. But it will take years to deliver the creative experiences and games that will inspire the masses to join.
There will be backlash against the metaverse, much like there is backlash against Web3 gaming and cryptocurrency. Noah Falstein calls this the “meta-averse” reaction, and it’s easy to foresee because no one will deliver on the metaverse for a while. But I believe that all industries will converge on a 3D internet, and when we create the tools and standards to maximize interoperability and reusability of assets across many worlds, we’ll finally have something that attracts the masses. And I expect Hollywood to keep stoking the inspiration with cool metaverse visions.

3) The pseudo-metaverse will start to emerge

What we’ll get instead are metaverse experiences that are smaller in scale and adhere to de facto standards rather than real ones. Do not be deceived. This will be somebody’s metaverse, but it won’t be the open metaverse. Back in March 2022, the Game Developers Conference released a survey that said 17% of game developers were working on a metaverse project, and 83% were not. If that survey were done again today, I think we’d see a swing toward more projects in the works. Those projects will include work on gaming galaxies (not quite real open metaverses) like Roblox, Minecraft, Dreams and Fortnite. But more platforms will likely emerge for the metaverse, and you can bet that game devs will seize the opportunity to create content on them. That tells us that things are in the works, and more things are coming that will flesh out the metaverse.

With luck, we’ll see how Apple makes its play in the metaverse (even though it won’t call it that) with the launch of its mixed reality headset. Meta’s Mark Zuckerberg has promised we’ll see the Meta Quest 3 VR headset emerge in 2023, and Sony may do well with its core gaming audience with PSVR 2. At first, we’ll see metaverse applications that are interoperable within a single company, and then we’ll see some alliances form.
After standards get worked out, we’ll see something like a real open metaverse emerge — I hope.

4) Gaming will disrupt other industries, and it will be disrupted by outside forces

Gaming has a firm hold on our leisure time, and with each new generation of players there are more and more native gamers. Over the years, we have seen inexorable growth with generational change. The average age of a gamer is 33, according to the Entertainment Software Association. And 69% of American homes have at least one gamer, while 97% of Americans view games as beneficial in some way. People play an average of 13 hours a week, up from 12 hours in 2021. All of that is taking audiences away from other pursuits, and it’s why we see Hollywood making so much game-related entertainment. Game technology — such as the cloud and game engines — is infiltrating other industries.

But this doesn’t mean that gaming will always grow. We saw how it was battered in 2022 by the Russian invasion of Ukraine, high inflation, high interest rates, two cryptocurrency disasters, and general economic malaise. Forces like government regulation in China and the closing off of markets like Russia have an effect. And COVID-19 and its aftermath had effects on gaming, such as suppressing in-person esports. I hope we don’t see more of the same in 2023, but we should expect this push and pull to continue.

5) Geopolitics will continue to interfere with global gaming

Our hearts go out to the victims of geopolitics. The Russian invasion of Ukraine has taken so many lives, and it has also scattered Ukraine’s rich game development community. Some are fighting. Some fled the country. Some relocated and now work remotely, sometimes without a reliable electrical grid. Russia’s aggression has also inflicted wounds on its own game industry. Many game companies like Wargaming and My.Games have moved their developers out of the country. It’s fair to say that no one is investing in home-grown Russian game studios anymore.
But the diaspora has provided much-needed talent for game companies with operations in other countries. Mytona moved its workforce out of Russia and has operations in places like Singapore and New Zealand.

In China, government restrictions have affected gaming. Popular gaming cafes had to shut down, and game studios had to shift to remote work. Many studios closed down altogether. Restrictions on launching new games kept a lot of foreign games out of the market. And curbs on game-playing for young gamers took a toll on demand. The result was weakness in one of gaming’s biggest markets. And leaders like Tencent and NetEase focused on expansion in the West.

Web3 game companies also saw a lot of regulation, some of it in reaction to scams. South Korean courts upheld laws prohibiting initial coin offerings. And censorship was alive and well in various jurisdictions around the world. Geopolitics will likely continue to stand in the way of a global game market.

6) AI will trigger big changes for games and game development

We’re seeing a growing trend for AI characters as the main agents in games. Instead of playing as a character, humans will groom AI characters — like in Fable’s upcoming The Simulation — who will live in simulated game worlds and live out their lives. In this game, humans are trainers, not players. Startups like Inworld AI plan to use rapidly accelerating intelligence technology to make smarter non-player characters (NPCs) in games so that the engagement will seem much more realistic. These games will draw us into the fantasy of gaming or give us compelling narratives that keep us involved.

On top of that, technologies like generative AI, AI art and AI chat will change game development. The technology may threaten a lot of jobs associated with grunt work — with impacts in areas as varied as programming, user acquisition and art.
Many veterans won’t like the impact it will have on labor, but it’s hard to fight the wave if the technologies bring efficiencies that alleviate the high costs of game development. It may be better to think of new ways to work by taking advantage of the technology. AI will help democratize game development, making it easier to create user-generated content. And it should make life easier for creators who make a living from celebrating game content. In fact, we may even see AI-based creators become celebrities in their own right. AI’s effects — as we’ve seen in many other industries — will likely be broad, deep and lasting. Don’t underestimate it.

7) Gaming for wellness will become a regular conversation

Before the pandemic, there were lots of memes around gaming addiction and violent games. The pandemic taught us that gaming brought us much-needed social and mental relief. Campaigns such as Play Apart Together helped raise awareness around gaming’s goodness. The annual survey of the Entertainment Software Association found that many see mental health benefits from gaming. About 89% of people say that video games provide stress relief, and 93% believe video games bring joy through play. Another 88% believe they build cognitive skills. Some 57% play to unwind, 46% play to escape, 44% play to use their brains, and so on.

Startups in gaming such as Tripp are dedicated to games that focus on meditation. DeepWell DTx is concentrating on games with therapeutic value. Nonprofits like Take This are dedicated to the mental well-being of game developers, helping them cope with stress and burnout. Streamers have raised the issue as they raise funds for organizations that can help. And conferences like TIGS and Games for Change have helped normalize talking about mental health challenges in a public and humane way. It’s a long-overdue conversation in gaming, and there are lots of ways gaming can help.
8) Web3 gaming’s comeback chance will depend on a wave of high-quality games from legit game teams

Web3 gaming has a poor reputation among gamers in the West, though it is more accepted in Asia. Many saw it as a bunch of scams, cash grabs, complicated technology, and poorly conceived business models. The believers see it as potentially disruptive, giving players ownership of their digital assets and ways to make money, much as the resale of used games once offered. Venture capitalists have poured tons of money into the space, with Web3 game investments accounting for half of all funding in the third quarter of 2022, according to Drake Star Partners. A lot of that money has gone to veteran game developers with triple-A ambitions, and their work hasn’t surfaced yet.

Chris Akhavan, chief business officer at Forte, believes Web3 gaming will make its comeback with a wave of high-quality games that will launch in the second half of 2023 and revive the overall space. These titles will come from legit game teams. I believe these teams will have their chance to prove themselves with gamers, especially in Asia, as nearly all South Korean gaming giants are investing heavily in the space, in addition to companies such as Animoca Brands and Square Enix. There are also many strong and well-financed startups in the space that have a lot of cash, like Sky Mavis, Sorare, Immutable, Horizon Blockchain Games, Mythical Games, Dapper Labs, Lucid Sight, Gala Games and Double Jump Tokyo. These companies haven’t shown all they can do with their resources yet. While the crypto winter will clear away many of the weak companies and force consolidation, I think the survivors will show us the way.

9) Gaming will become more open

There are many forces at play that will make gaming more open. The web browser is poised for a comeback. Companies are working on ways to get around the restrictions of the app stores by turning to the open web.
In the past, this meant bad graphics and limited interactivity. But new standards like glTF and proprietary technologies could enable speedier delivery. The open web could be succeeded one day by the open metaverse. That won’t happen soon, but enough people are talking about it that the conversation is top of mind at some of the biggest and most important companies in the industry.

Epic Games raised the issue of openness when it sued Apple and Google for antitrust violations. Without waiting for a final verdict in the appeals for that case in the U.S., the European Union and South Korea have forced changes that big companies like Apple and Google will have to comply with. One byproduct of the U.S. litigation and regulation in the EU is the right for developers to create their own third-party app stores. Yet gatekeepers who create platforms still take a 30% cut of royalties. Matthew Ball, author of the bestselling book The Metaverse, has argued that this stands in the way of progress, as it weakens the developers who are in the best position to push forward ideas like the metaverse. While the industry isn’t going to change overnight, the added awareness of the costs of closed platforms is a catalyst for change.

Epic isn’t fighting for this all by itself. The Open Metaverse Standards group has formed to push for better open standards, and USD is making progress as an interoperable 3D file format. Forte and Lamina1 have raised a lot of money, and they believe that blockchain technology infrastructure can also improve the openness of sectors such as gaming, enabling players to finally own their stuff. Overall, more business models and technologies — like Web3, cloud gaming and subscriptions — will yield more choice for both developers and consumers.

10) Gaming deals will continue to grow with consolidation in metaverse and Web3 sectors

Will the industry continue to consolidate?
It’s a good bet that will happen, whether or not the $68.7 billion acquisition of Activision Blizzard by Microsoft closes. The Federal Trade Commission, focusing on the goal of openness and antitrust restraints, is suing to stop that deal. But the economic slowdown will make life hard for a lot of startups and mid-size gaming publishers and developers. In that kind of economy, companies will combine to get bigger, and rivals will counter the strategy with expansion plans of their own. Drake Star Partners reports that gaming deals on both the acquisition and startup investment fronts hit new heights in 2022, after record-breaking years in 2021 and 2020. So it isn’t a hard prediction to make that the trend will continue. And as gaming continues to outperform other sectors, investor money will continue to move into the space. That will provide war chests for companies so that even the biggest companies in the industry will not be immune from acquisition pressures.

11) The realism and imagination of games will astound us

This is perhaps the easiest prediction to make. Gaming has always made progress. But as Mike Abrash of Meta’s Reality Labs has said, this isn’t a foregone conclusion. It depends on the brilliance and hard work of game developers making the best possible games they can. With Moore’s Law slowing down and the need for sustainability, it isn’t a given that we’ll always have more computing power to make our games look better. But we haven’t taken full advantage of technology yet. Game engines such as Unreal Engine 5.1 and Unity have been making steady progress, helping interactive entertainment catch up with the visual delights we’ve seen in movies. Game teams haven’t been able to fully exploit these technologies yet, but they’re hard at work on amazing experiences.
And the demand is there, because gamers are always seeking something that fulfills our imaginations, whether that comes in the form of better graphics or clever gameplay that surprises and delights us.

Grading my 2022 gaming predictions

1) The war between game devs and platforms will get worse

Letter grade: B

2022 notes: Dozens of states joined Epic’s lawsuit against Apple in its appeals court proceedings. That drew more attention to the case. The European Union adopted laws that will force Apple to make changes to its app store in Europe, such as allowing non-Apple stores to be accessible from within iOS. Epic’s lawsuit against Google proceeded in court as well. Game devs complained about Apple’s focus on privacy over targeted ads. But there wasn’t an additional all-out war between developers and platforms.

Game platforms became flashpoints in 2021 as Epic Games sued Apple for antitrust violations. Epic decried Apple’s policy of collecting a 30% fee on every in-game transaction in titles like Fortnite. While Epic lost most of its case, it did win on one important point that could give alternative payment providers and game developers more hope of capturing the revenues they generate. The courts have stayed that decision so far. But it offered developers hope that they will at least be able to promote lower prices for digital goods on websites that are off the app stores. It underscores the duality of modern platforms, which hold game developers captive yet offer internet browsers that can take players elsewhere. While Yvonne Gonzalez Rogers, the federal judge in the Epic v. Apple case, concluded that current antitrust law doesn’t protect smaller companies as much as it does consumers, she did point out flaws for legislators to address that could curb the power of big tech. On top of that, Epic still has an antitrust suit pending against Google over Google Play Store practices, and that is sure to flare up in 2022.
And regulators around the world, such as the European Union, are investigating the big tech companies and the leverage they hold over developers. Add to this Valve’s decision to ban nonfungible tokens (NFTs) in games on Steam (only to see the Epic Games Store embrace NFT games), and you have more developer anger boiling over. Adding fuel to this developer unrest is Apple’s decision to emphasize privacy over targeted ads. That move also upset game developers, who face lower revenues thanks to the deprecation of the identifier for advertisers (IDFA). As the industry chases the metaverse and blockchain monetization, the rules of engagement for platforms and developers will matter more than ever. Apple has the right to do these things for now, but it can’t afford growing developer resentment. Nor can any big tech platform that wants to benefit from the stickiest applications of all: games. I expect to see more flashpoints over time and movements by game devs to bypass big tech altogether. Both platforms and developers need each other, but they’re still figuring out which side is more powerful.

2) A patchwork metaverse will emerge without interoperability at first

Letter grade: C

2022 notes: I can’t say any new bona fide metaverse emerged in 2022. But the existing de facto metaverse worlds continued to grow. Those included Roblox, Minecraft, Fortnite and Second Life. The technologies that enable the metaverse — such as standards being developed by the Open Metaverse Forum — started to emerge. There isn’t much interoperability yet between the worlds of Meta’s VR gadgets and those of rivals such as Pico and HTC. And Web3 games have only started to show some interoperability. We did see quite a bit of progress as Nvidia’s Omniverse platform tools took off, with more than 700 enterprises adopting it, using USD as the standard 3D file format. But we knew the metaverse wouldn’t be built in a year. And we are encouraged by the changes we’ve seen happening.
The metaverse has to start somewhere. It will likely begin with a patchwork of walled gardens that don't work together. A lot of people could legitimately argue that this isn't a metaverse at all. Over time, it will become interoperable with easy transit between worlds, open source standards, and trade agreements. Standards always take a long time to establish, but they eventually happen when enough of the power brokers conclude that working together is better. We're just not at that stage yet. Right now, everyone who is trying to build a metaverse will attempt to establish themselves as the first mover with the largest audience. Roblox can make a case that its user-generated games platform is the leading candidate for the metaverse, while Epic Games can make a similar claim for its Fortnite game, and Facebook will say its Oculus (renamed Meta) VR platform will win. It's not a real metaverse until we get that interoperability, of course, but we'll see islands emerge thanks to the launch of tools such as Epic Games' Unreal Engine 5, which is coming in 2022 with a free city that developers can use as a foundation to make metaverse-like games. On the non-gaming side, we'll also see cool experiences arrive for enterprises in Nvidia's Omniverse simulation world. In fact, the biggest chance for us to see the real metaverse emerge in the long term could come from the Omniverse, as Nvidia CEO Jensen Huang believes that his company will use the powers of AI and supercomputers to build a digital twin of the Earth for climate change predictions. And once that is built, Huang believes we'll get the metaverse for free. Some game developers like Brendan Greene, creator of the PUBG battle royale game, really do want to build a digital twin. These, too, will start out as a patchwork, like BMW's digital twin factory. Over time, the connective tissue will form — such as NFTs that make it easier to identify digital items that can cross worlds.
But it will be like the early days of the internet, like when users on The Well couldn't talk with those on Compuserve or AOL. At some point, a shared ecosystem or commons will emerge, but probably not in 2022. And maybe not for years. The hardest thing will be for the industry to come together and put selfishness aside in favor of the greater good of establishing open standards for the interoperable metaverse. We'll see if advocates like Epic's Tim Sweeney can convince others that coming together is a matter of enlightened self-interest.

3) NFT games will go mainstream amid a divided audience of lovers and haters

Letter grade: C

2022 notes: We're still waiting for Web3 games that can capture the imagination of mainstream gamers. Axie Infinity lost some ground and users in 2022 as the crypto winter got a double shock with the bankruptcy of FTX. Gods Unchained has done well but it still has a small base. With a crash in crypto and NFT prices, the core supporters of Web3 games suddenly found they had a lot less money. It was cool to see projects announced like Paul Bettner's The Wildcard Alliance. His company, Playful Studios, raised $46 million for that game, but it hasn't launched yet. That's the case with many top titles in the works. I'm still confident we'll see the fundings and talent pay off in this space, but we'll likely see a culling first.

Foes of NFTs were gleeful at the backlash that Ubisoft faced when it announced NFTs for Ghost Recon: Breakpoint — a move that gamers roundly criticized. They further reveled in GSC Game World backing off on NFTs for Stalker 2. But I don't think those foes realize just how much financial might has lined up to make NFT games into a mainstream passion. The true believers in crypto and smart capital are betting that mainstream adoption of NFTs is coming, and they are pouring billions of dollars into the opportunity.
This is why we already see so many unicorns created so early in the emerging market among the makers of NFT game infrastructure and platforms. These platforms being created by companies like Forte pledge to make it easy for the best game developers to create mainstream NFT games that take advantage of blockchain technology. These platforms could also simplify the adoption of cryptocurrency through the simplicity of gaming. I'm confident that the innovation will come from blockchain and cryptocurrency and rewards-based business models. I don't know what that innovation is yet, but when the smartest people in the industry band together to make it happen, I bet that it will happen. I don't have a stake in this race, but I talk to a lot of people. I have seen this kind of innovation cycle happen before with the derision that free-to-play faced at the outset of mobile games and the ultimate victory it has won, with the majority of all games now being free-to-play and mobile. While others scoffed at free-to-play, those that embraced it — like Supercell, Machine Zone, King, and Zynga — won the market. These mobile-first companies wrecked the premium-price model embraced by incumbents. It is not a foregone conclusion, however, that NFTs will win. The game developers who are integrating them into games will have to win over gamers, who are skeptical, through skillful game design. Skeptics have pointed to problems such as environmental damage from blockchain computing, scams, money laundering, weak games, and profit seekers. But all of these problems can be overcome as the quality developers and companies move into the space. Perhaps the stickiest criticism is that NFTs don't enable you to do things that you can't already do in games in some way.
I think that this criticism fails to recognize the cleverness of game developers and the value of decentralization — where NFTs can be used to bypass traditional distribution mechanisms and enable peer-to-peer transactions — in cutting out big tech. In this way, NFTs are an arrow in the quiver of independent-minded game developers, much like web games and instant games are. Among the professionals moving into NFT games are Zynga, Mythical Games, Com2Us, Ubisoft, Jam City, Will Wright, Peter Molyneux, Graeme Devine, Austin Grossman, Gabby Dizon, Naomi Augstine-Lee, Chris Clay, Chris Akhavan, and others. Josh Williams, CEO of Forte, which raised $725 million to build NFT game infrastructure, said that all of the major game companies are investigating NFTs. And the NFT game companies raising the most money are the ones that have veteran game developers. Meanwhile, the big companies will get stuck waiting for regulators to say the coast is clear. As Amy Wu of Lightspeed Ventures pointed out, the crypto natives and gaming natives have to come together. When they do, I think that is when we will see mainstream adoption of NFT games, and the resistance from gamers may melt away.

4) Game deals will grow so long as the global economy stays healthy

Letter grade: A

2022 notes: Drake Star Partners said that in the first nine months of 2022, there were a record 976 deals announced or closed that were valued at $123 billion. That number has already eclipsed the $71 billion for 2021. Of course, a big chunk of that is the $68.7 billion pending acquisition of Activision Blizzard by Microsoft. That deal is still up in the air, as the FTC has challenged it on antitrust grounds. But even though the U.S. venture capital industry has seen a decline in deals in 2022 compared to 2021, gaming deals have been strong, led mostly by investments in Web3 gaming startups.
As noted above, game investments hit record levels in 2021, with $71 billion pouring into game startups, acquisitions, and public offerings in the first nine months of the year, according to Drake Star Partners. More than $4 billion went into blockchain games. More than 100 game venture capital funds and dozens of private game unicorns (or startups with valuations above $1 billion) are feeding money into games. Public stock markets have rewarded merger-happy companies like Embracer Group (which has made dozens of acquisitions) and Zynga. This money comes from the top of the food chain, with big investors pouring money into different parts of the ecosystem on the belief that games are benefiting during the pandemic. While that is true, it's an effect that can wear off. We saw how some companies (Roblox) hit continuous growth targets with each quarter compared to the anomalous quarterly results of 2020, while others (Take-Two, Zynga, Activision Blizzard) barely grew their revenues this year compared to last year. So it's clear that game companies can't defy the laws of gravity. If the global stock markets head south, all bets are off. But I don't really expect that to happen. What is unprecedented at this time is that all parts of the gaming ecosystem are thriving and fueling each other.

5) A new kind of gaming-first transmedia will blossom

Letter grade: A

2022 notes: Hollywood knocked it out of the park, making this prediction come true, as we saw Riot Games' Arcane win four Emmy Awards for the outstanding quality of the Netflix animated series. We didn't come up with a new word to replace the tainted "transmedia," but game-based movies and TV shows thrived. Examples include the excellent third season of Ubisoft's Mythic Quest comedy about a game studio; the second season of The Witcher turned out well, and I was happy to see films such as Sonic The Hedgehog 2 and Uncharted finally hit the screens. I was pleasantly surprised with the quality of Cyberpunk Edgerunners.
The Halo series on Paramount+ got mixed reviews but it was renewed for another season. We've also got tons more in the works, like The Super Mario Bros. Movie, Minecraft: The Movie, Sonic the Hedgehog 3, BioShock, Death Stranding and so many more. And I am very much looking forward to the January 15 debut of The Last of Us on HBO.

One of the conclusions from that last point is that games will become the center of the entertainment universe. And that could mean that movies and TV shows will follow gaming. Gearbox Software's Randy Pitchford has been touting the opportunity to turn Borderlands into a movie franchise. That's the opposite direction that Hollywood studios usually pursued when trying to extend entertainment franchises from one medium to another. Transmedia became a dirty word because it promised too much in years past. The notion was that properties such as Mickey Mouse could spawn everything from theme parks to video games. But now that games hit the key demographics and have mainstream adoption, extending them into other media makes more sense. Games are now the lead horse. We saw that with Riot Games' Arcane (based on the League of Legends game) animated television series that became a big hit on Netflix. And we have high hopes for Naughty Dog's games The Last of Us and Uncharted, which are both being turned into major releases from Hollywood. Microsoft is getting there on its Halo television series. The great hope is that these will come off as compelling films rather than cheesy live-action role-playing (LARP) events. And in the end, streaming subscriptions for the combination of games and movies will make sense. In the end, bits are bits, and Netflix has shown that it is happy to stream either kind of bits to its audiences. Microsoft would also be happy to offer exclusive game-based movies with its Xbox Game Pass.
6) Game console shortages will continue amid strong demand

Letter grade: A

2022 notes: As much as we heard hopes of the supply chain loosening up this year, the next-generation consoles and high-end graphics cards were in short supply for much of 2022. With a crash in crypto mining and a weak economy, many believe that we're in the midst of a recovery in the supply-demand balance. We'll see if that holds true for the holiday season and 2023, but let's hope that gamers still really want these machines that are finally becoming available.

Microsoft, Sony, and Nintendo all continued to ship more and more consoles throughout 2021. But it's hard for them to meet demand because of the voracious appetite for games and the shortage of key semiconductor chips. That shortage is widely believed by companies such as Intel, Nvidia, and Advanced Micro Devices to last into 2023. And that makes it easy to predict that the consoles — which can depend on hundreds of suppliers of thousands of parts per console — are still going to be in short supply in 2022. By now, Nintendo's Switch should cost a lot less than the introductory price of $299 when it debuted in 2017. But it has topped 100 million sales and still continues to sell well, so Nintendo has no motivation to cut the price. It finally did cut the price on the old Switch in September as it introduced the new Switch OLED model, but nobody really has an incentive to push a price war when we're still in a pandemic-induced supply shock.
If anyone has an opportunity here, it's the makers of mobile gaming hardware and mobile games, as they can make games more accessible to a wider market. And it's no surprise that Qualcomm recently introduced a model for a mobile-based game handheld. Now if its manufacturers can get hold of enough parts to manufacture it, it could exploit the opportunity.

7) Play-and-earn will spread in emerging markets

Letter grade: C

2022 notes: Play-and-earn games held their own. Axie Infinity has survived a tough crash in its market and its owner Sky Mavis launched its free-to-play game Axie Infinity: Origin. But the crypto winter and the FTX bankruptcy destroyed a lot of the wealth in this market, and that meant it was harder to get new play-and-earn games off the ground. There are a lot of games still coming, but game companies are modifying their models so players are motivated by outstanding gameplay, rather than just the prospect of flipping NFTs and making money.

Games are great at motivating players to play because of their intrinsic value. People enjoy them, and they enter a mental state of "flow" when they get really engaged with games. That's intrinsic value. Extrinsic value is something like getting paid to play games. But the difference between intrinsic and extrinsic value is blurry. We saw the blurriness emerge this year as games like Sky Mavis' Axie Infinity offered rewards to people who played the game. By investing in unique game items via NFTs, players could acquire unique game characters and make them more valuable through gameplay. They could then resell those characters to other players and make a profit. In the Philippines, hundreds of thousands of players took advantage of this "play-to-earn" game to make more than triple the minimum wage in a country that had 40% unemployment during the pandemic. NFT resales can easily be tracked and credited to either the original creators or the owners themselves.
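That resale-tracking idea can be sketched in a few lines. This is an illustrative sketch only: the 10% royalty rate, the `settle_resale` function and the payout shape are assumptions for the example, not any specific marketplace's or blockchain's actual mechanics.

```python
# Illustrative sketch of crediting an item's original creator on each resale.
# The 10% royalty rate and the payout shape are assumptions for this example,
# not any specific marketplace's or blockchain's actual mechanics.

def settle_resale(price: float, creator_royalty: float = 0.10) -> dict:
    """Split a resale price between the seller and the item's original creator."""
    royalty = round(price * creator_royalty, 2)
    return {"creator": royalty, "seller": round(price - royalty, 2)}

# A player bought a character, leveled it up through play, and resells it for $50:
# the marketplace pays the creator $5.00 and the selling player $45.00.
print(settle_resale(50.00))  # {'creator': 5.0, 'seller': 45.0}
```

The point of the sketch is the part that used games never had: because each resale passes through a ledger, the original creator can keep earning after the first sale.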
And so players or even the original creators can benefit from item resales. It's part of a Leisure Economy that is lifting people out of poverty around the world. The small amount of money to be made won't really appeal to gamers in richer countries, but those gamers might like games that have both intrinsic and extrinsic value. Critics say that Axie Infinity wasn't inherently fun and it gave players a profit motive rather than pure enjoyment. This extrinsic motivation would eventually wane, the critics said, and the players would give up the game if the ability to make profits went down. But while some praised the life-changing potential of play-to-earn — Sky Mavis said 20% of its players were unbanked — others saw it as just the first inning. Miko Matsumura, cofounder of Gumi Cryptos, believes that NFT-based play-and-earn games — where the game is designed by game veterans to be really fun — that also give players ownership and the ability to reap profits will become the prevailing model in gaming. Players, he believes, will see playing games as an investment, just like in the old days when they bought console games and then sold them as used titles to GameStop. Those profits from the used game sales enabled them to reinvest in new games. Players in the West may scoff at this. But those in emerging markets could enjoy earning their digital goods through gameplay and then sell them to labor-averse players in the West. And who doesn't want to own their own stuff in games they love to play and also make money from it?

8) The metaverse promise will revive VR/AR dreams

Letter grade: A

2022 notes: It's generous of me to give this one an A, as we have seen companies such as Meta struggle to make a profit in VR. But the core fact that supports this prediction is that the virtual reality market grew an estimated 37% in 2022, from 20.2 million VR hardware units sold in 2021 to 27.7 million in 2022.
That's a healthy growth rate, and sales of VR software are expected to show similar growth patterns. That is happening in a year where overall gaming is shrinking 4.3%, according to Newzoo.

Augmented reality and virtual reality went through their hype cycles in the past five years. Many game developers gave it a try and then reverted to making traditional games. But the hopes of creating a metaverse to offset the woes brought on by the pandemic have given new hope to those AR/VR dreams. Companies like Niantic, the maker of Pokemon Go, have shown the path for innovations in AR. Niantic has invested heavily in making the leap from location-based games to next-generation AR, which can deliver useful information to you while you're on the move. And Facebook/Meta continues to double down on Oculus/Meta Quest hardware. In fact, Facebook's $10-billion-plus-per-year investments make Magic Leap's $2 billion-plus in funding for its own metaverse ambitions look like chump change — or maybe a couple of months of Facebook's spending. The belief is that these investments, while still based on nascent markets, will be worth it because AR/VR are the most immersive platforms when it comes to accessing the metaverse. It's good to see AR/VR startups getting investments again, but we still want to see more hits before this market becomes the reality that we all want to see. In the meantime, on the ground level, I see AR/VR startups getting funded again after a fairly long drought. Of course, the biggest boost of all will come whenever Apple decides to launch its AR/VR product. But predictions about that have been pretty bad so far.

9) Game companies that fail to change will get acquired — or left behind

Letter grade: A

2022 notes: It was fortuitous that I used Activision Blizzard as an example here, as the company went through huge changes in 2021 and 2022 with its lawsuit and Call of Duty franchise.
It is still defending itself in the case, but Call of Duty has seen one of its best years in history as the company reinvented its game engines and aligned its strategy around the core PC/console title, Warzone 2, and mobile. That kind of transformation showed it could change for the better. The jury is out, and it's not clear if Microsoft will succeed in the acquisition. But 2022 showed that a lot of change is happening at the core of the game industry.

When change comes, the losers fall victim to the innovator's dilemma of sticking to the old cash cows when they should embrace innovations that cannibalize the old. Activision Blizzard is a good case in point. It has reached huge revenues with games like Call of Duty, World of Warcraft, and Candy Crush Saga. But most of its games in the works are sequels or remakes. Where are the original titles? The bar is evidently so high in the company's R&D ranks that the opportunity cost of investing in older franchises versus new ones is too hard to overcome. Activision Blizzard's stock price fell dramatically in 2021 as Call of Duty subsided from a 2020 high and it was hit with a sexual harassment lawsuit by California regulators. The latter fact showed that other kinds of change are also necessary for modern game companies to keep up with the times. Failing to recognize when it's time to change has always been fatal, and that failure often comes from unexpected directions. Activision Blizzard is now a potential acquisition target for possible buyers such as Disney. And nobody expects Activision Blizzard to be a leader when it comes to acquisitions or investments in NFTs, VR, the metaverse or other innovations. It's worth noting that Roblox, which innovated in a platform for user-generated content, is now the most valuable video game company in the U.S. The results could lead in a variety of directions. Employees may leave the big company for startups.
The lesson is that a steady-as-she-goes strategy is good until it isn't, and then change will happen.

10) God of War: Ragnarok and Horizon: Forbidden West will signal Sony's real next-gen arrival

Letter grade: A

2022 notes: Both of these games came in for a landing in 2022, and they showed that Sony's dedication to single-player narrative blockbusters remains one of its strongest points. Both titles were on my list and GamesBeat's list for the top titles of the year, and Sony has been greenlighting even more such titles as it strives to compete with its big rivals.

I figure if these potential blockbuster games from Sony got delayed in 2021, they should arrive in 2022, right? And I believe both will highlight Sony's competitive strong point of funding games with huge single-player campaigns with strong stories. These games and others like them represent Sony's unique advantage over Microsoft, which until recent acquisitions didn't have the giant single-player brands in the same way. Microsoft took a major swing with Halo: Infinite, which is my favorite game of 2021 and represents the best it can produce on the console/PC. But strong narratives are part of Sony's DNA. If anything should set the PlayStation 5 apart from the Xbox Series X/S, it will be these expensive narrative titles. It takes brave executives to greenlight budgets of hundreds of millions of dollars on projects that take years to complete — even as everyone else focuses on games-as-a-service — to make the hardest core gamers happy. I have the highest expectations for both God of War: Ragnarok and Horizon: Forbidden West. They are carrying a very important torch. I believe they have strong teams and budgets behind them, and this is one of those things that Sony shouldn't change.

11) Labor will be tight, and labor unions could form

Letter grade: A

2022 notes: For half of the year, the labor market was tight. But as the world economy weakened, we saw layoffs affect the game industry.
Big companies like Meta and Amazon cut a lot of jobs, and game devs got hit. Still, it wasn't easy to find the specialized developers that everyone wants. And we did see the first major union in the game industry as Activision Blizzard's Raven Software QA team voted to form a union, following up on a vote at the small studio Vodeo in 2021. We'll see where this leads.

Just like my No. 9 prediction, this prediction about unions forming seems like a perennial one for games. Game developers have often been exploited and made to work long hours without sufficient pay — known as crunch. Work conditions are sometimes dreadful for diverse workers such as women at companies like Activision Blizzard and Ubisoft. That has always made labor unions appealing, and a survey by the International Game Developers Association (IGDA) in January 2020 showed that 54% of developers favored a game union. Still, the unions have scored only small victories. The pandemic and fresh accusations of bad work conditions at big companies have opened the opportunity for labor unions to make new headway in games. We know that wages are rising amid a huge shortage of skilled game developers as the industry enjoys an unprecedented expansion. Crypto game companies are picking off a lot of developers, and VCs are busy funding startups staffed by veterans. The shortage will continue in 2022, and that could once again create conditions for more unionization. These are forces that will help union organizers, but these are forces that the leaders of benevolent and enlightened companies — if they exist — could and should address.
What is data science? The applications and approaches

Table of contents
Data science in the broader sense
What is the function of data science in a larger data department?
How are some of the major companies approaching data science?
How are startups and challengers handling data science?
Is there anything that data science can't do?

Data science is the application of scientific techniques and mathematics to making business decisions. More specifically, it has become known for the data mining, machine learning (ML) and artificial intelligence (AI) processes increasingly applied to very large ("big") and often heterogeneous sets of semi-structured and unstructured datasets.
The term was first suggested in the 1970s as a synonym for "computer science" and then in the 1980s as an alternative phrase for "statistics." Finally, in the 1990s, a consensus began to form as to data science being an interdisciplinary practice that combines data collection, computer processing and analysis. It is seen as "scientific" because it applies systematic analysis to observable, real-world data. Since then it has connoted that full span, from the aggregation of source data to its application in technical and business decision-making and processes. But it has also come to be more narrowly associated with the specialized role and function of "data scientists" in burgeoning data departments who are managing ever more data in the modern enterprise.

Data science in the broader sense

In its broader sense, data science can be seen as the application of scientific techniques and mathematics to making business decisions. This work can be broken into three major areas:

Gathering: Simply collecting the information from disparate computer systems can be a challenge in itself. The data is often in different formats and it may contain spurious or incomplete records. When the data is cleaned and standardized, it must be stored so the data science algorithms can be used again and again into the future.

Analyzing: Looking for patterns, and understanding how the demands upon each stage of the enterprise are changing, requires a mixture of statistical analysis and artificial intelligence.

Reporting: Reports can summarize activity, flag anomalous behavior and predict trends and opportunities. Tables, charts, visualizations and animated summaries can tell a story and guide decision-makers.

Just as data science is sometimes used in this broader sense, "business intelligence" (BI) and "data analytics" may likewise be more generally applied.
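The gather-analyze-report flow described above can be sketched end to end in a few lines. This is a minimal sketch using only the Python standard library; a real pipeline would more likely use pandas, R, or a warehouse query engine, and the sales records here are invented sample data.

```python
# A minimal sketch of the three stages described above, using only the Python
# standard library; real pipelines would typically use pandas, R, or a
# warehouse query engine instead. The sales records are invented sample data.
from statistics import mean, stdev

# Gathering: raw records arrive from two systems in inconsistent formats.
raw = [
    {"amt": "1200", "region": "EMEA"},
    {"amount": 1150, "region": "emea"},
    {"amt": "9800", "region": "EMEA"},
]

# Cleaning and standardizing: one schema, typed values, normalized labels.
clean = [
    {"amount": float(r.get("amt", r.get("amount", 0))), "region": r["region"].upper()}
    for r in raw
]

# Analyzing: summary statistics plus a deliberately crude anomaly rule
# (flag anything more than one standard deviation from the mean).
amounts = [r["amount"] for r in clean]
avg, sd = mean(amounts), stdev(amounts)
anomalies = [a for a in amounts if abs(a - avg) > sd]

# Reporting: a one-line summary a decision-maker could act on.
print(f"avg={avg:.0f} anomalies={anomalies}")
```

Even this toy version shows why the gathering stage dominates in practice: two of the three input records needed repair before any statistics could be run on them.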
Depending on the history, scale and focus of an enterprise's data department, the department itself, its function and/or its key staffers may be more broadly tasked and/or so titled. However, these terms have different origins and are also most often applied to more narrow functions today.

What is the function of data science in a larger data department?

Teams of developers, or software engineers, combine with data scientists and data analysts to create tools and solutions designed to optimize collecting data from a wide variety of sources, integrate this data, analyze it and then deliver reports or dashboards for everyone to use to make decisions. Many of these approaches and tools have been given names. Some of the most common are the following:

Data warehouse: In a data warehouse the information is stored in a collection of well-ordered tables and structures, often in relational databases. The data is usually well-filtered and sometimes already analyzed. In industries with questions about legal compliance, the data is already checked for anomalies and issues for investigation.

Data lake: In a data lake the idea is to gather the information in a central repository, similar to a data warehouse and, indeed, the differences aren't always clear. In general, data lakes have more raw data that is less filtered or processed. If questions appear, the data is readily available to be examined, but often this work isn't done unless there's a demand for the answers.

Data store: Data stores tend to be simpler systems that offer more transitory and temporary collections. An example might be all of the data collected by a factory on one day or week. The data is often processed and sent to a lake or warehouse.

Data mart: Data marts can offer either internal or external users highly processed data collections for immediate consumption. Inside companies, they may hold official reports that have been checked and certified.
Some companies also offer external marts that sell data collections or offer them for free.

Predictive analytics: Some use this term to emphasize how data science can help plan for the future with predictions based upon past data.

Customer data platform: Some tools are focused on tracking customers to aid in marketing. These often integrate with third-party data sources to build better models of individuals so that marketing efforts can be customized for them.

Data as a service: Some companies are specializing in packaging collections of data so that they can be integrated into local data science.

Integrated development environments (IDE): These software packages are also used by developers. They collect many of the common tools for analysis, like a Python or R package, and marry them with an editor and file manager so that data scientists can experiment with writing and running new analyses in one place.

Notebook: Notebooks are often thought of as dynamic or living documents. They bundle together text, charts, tables and data with the software that produced them. This allows data scientists to share both their results and the analysis that created those results. Readers can not only read the text, they can make changes and explore immediately.

Notebook host: Many teams of data scientists dedicate servers to hosting notebooks. These systems store the data and text in the results so they can be read and easily experimented with. Some companies offer hosting as a service.

How are some of the major companies approaching data science?

The major cloud companies devote substantial resources to helping their customers manage and analyze large datasets that often are measured in petabytes or exabytes. In all of these cases, these major cloud platforms offer more services than can be summarized in a short article. They offer multiple options for both storage and analysis so data scientists can choose the best tools for their jobs.
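Before turning to the individual vendors, the warehouse-versus-lake distinction defined earlier can be made concrete with a toy example: the lake keeps raw records as-is, while the warehouse accepts only cleaned, typed rows. This is a standard-library sketch; the field names and records are invented for illustration.

```python
# A toy contrast between a "lake" (raw records kept as-is) and a "warehouse"
# (cleaned, typed, well-ordered tables), using only the standard library.
# The field names and records are invented for illustration.
import json
import sqlite3

# Data lake: events are dumped as raw JSON lines, messy fields and all.
lake = [json.dumps(e) for e in [
    {"user": "a1", "spend": "19.99"},
    {"user": "b2", "spend": None},  # an incomplete record stays in the lake
]]

# Data warehouse: only filtered, typed records land in a structured table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE spend (user TEXT, amount REAL)")
for line in lake:
    e = json.loads(line)
    if e["spend"] is not None:  # cleaning happens on the way in
        db.execute("INSERT INTO spend VALUES (?, ?)", (e["user"], float(e["spend"])))

rows = db.execute("SELECT user, amount FROM spend").fetchall()
print(rows)  # [('a1', 19.99)]
```

Note how the incomplete record survives in the lake for later examination but never reaches the warehouse table, which mirrors the filtering difference described in the definitions above.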
IBM IBM integrates its data storage with a collection of statistical analysis packages and artificial intelligence algorithms. These tools, marketed in a number of forms such as the Cloud Pak for Data , manage access and establish rules for protecting privacy from the beginning. The tools, which are available both as services and for local installations, can integrate data across multiple clouds and servers. IBM also offers a collection of AI tools and services under its Watson brand that provide algorithms for classifying datasets and searching for signals. Oracle Oracle offers a wide range of databases that can act as the foundation for data lakes and warehouses, either on premises, in Oracle’s cloud data centers or in hybrids of both. Oracle Cloud Infrastructure supports some of the standard data science tools, using R, Python and MATLAB, so that the information from these databases can be turned into notebooks, reports or dashboards filled with tables, charts and graphs. The company has also invested heavily in providing pathways for training artificial intelligence models and deploying them into production environments. Oracle is purchasing companies and devoting developers to produce more customized solutions for particular industries with data-intensive needs, like healthcare. Microsoft Microsoft’s Azure cloud offers databases and data storage options such as Cosmos DB, which developers can access via a SQL or NoSQL API. Microsoft’s data science services range from statistical packages to artificial intelligence routines. One option, the Data Science Virtual Machine , allows users to boot up a cloud instance with all of the common packages that are optimized for big data analysis and machine learning projects. Another tool, the Azure Machine Learning Studio , handles most of the details of data storage and analysis so the user can build notebooks that explore the signals in a dataset without worrying about software configuration.
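Setting the vendor tooling aside, the kind of “signal in a dataset” that such studios help surface often begins with a simple correlation check. Here is a plain-Python sketch; the data (weekly ad spend versus sign-ups) is entirely invented for illustration.

```python
import math

# Invented example data: weekly ad spend vs. weekly sign-ups.
ad_spend = [10, 20, 30, 40, 50]
signups = [12, 25, 31, 38, 55]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(ad_spend, signups)
print(f"correlation: {r:.3f}")  # close to 1.0: a strong linear signal
```

A value near 1.0 or -1.0 suggests a strong linear relationship worth a closer look; the hosted studios automate this kind of exploration (and much more) at scale.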
Amazon Amazon offers a diverse collection of data storage options, ranging from managed versions of open-source databases like PostgreSQL to cold storage for maintaining copies of archives at a low price. Data scientists can also choose between Amazon Web Services’ (AWS) own products and some from other companies that are hosted in the AWS cloud. Tools like Quicksight , for instance, are designed to simplify creating good, responsive data visualizations that can also adapt as users ask questions. Other products like Kinesis focus on particular data types, like real-time video or website clickstreams. SageMaker supports teams that want to create and deploy artificial intelligence and machine learning models with predictive power. Google Google Cloud Platform (GCP) can collect and process large amounts of data using a variety of databases, such as BigQuery, which is optimized for extremely large datasets. Google’s data analysis options include raw tools for creating large data fabrics as well as data analysis studios for exploration. Colab , for example, hosts Jupyter notebooks for data science work with seamless access to a large collection of GPUs for compute-intensive jobs. The company has invested heavily in AI and offers a wide range of tools that develop models for extracting insights from data. The Vertex AI Workbench , for example, is a Jupyter-based front end that connects to all of the backend AI services available on Google’s cloud. How are startups and challengers handling data science? A variety of companies want to help others understand the wisdom that may be hidden inside their data. Some are building platforms for storing and analyzing data. Others are just creating tools that can be installed on local machines. Some offer a service that may be measured by the byte. At the core of many of these products and services are open-source software packages like R or Python, the common languages used by data scientists.
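Part of why Python has become such a common language here is that even its standard library handles basic descriptive statistics. A minimal sketch, with invented daily-active-user counts standing in for real data:

```python
import statistics

# Daily active users over one week; the numbers are invented for illustration.
dau = [1200, 1350, 1280, 1500, 1420, 990, 1010]

mean = statistics.mean(dau)     # average level
stdev = statistics.pstdev(dau)  # population standard deviation (spread)

print(f"mean={mean:.1f} stdev={stdev:.1f}")
```

From this starting point, third-party packages such as pandas and scikit-learn extend the same language to tabular data and machine learning, which is why so many of the platforms below build on Python.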
There are also several good open-source packages that offer an integrated data analysis environment. Software like RStudio , Eric and Eclipse are just a few examples of tools that deliver a comfortable environment for exploring data. JetBrains sells PyCharm, an integrated development environment for creating Python applications, and many programmers work on Python-based data science there. The company also distributes a free community edition that is popular with many schools. Snowflake makes a cloud-based data storage platform with a wide range of features including cybersecurity , collaboration and governance controls. There are many uses for this data lake or data warehouse service; supporting machine learning and data science is one of the most popular. Snowflake’s cloud supports many common applications and can run many Python applications on the data stored in its cloud. Kaggle is a data science platform that offers both storage and analysis, both for private datasets and for many of the public ones from sources like governments and universities. The data science is often done with notebook-based code that runs in Kaggle’s cloud using either standard hardware or specialized graphics processing units (GPUs) and tensor processing units (TPUs). The company also sponsors data science contests, which some companies use to tap into the creativity and wisdom of the open community. The Databricks Lakehouse Platform supports storing and analyzing data either in its own cloud, in many of the big clouds, or on premises. The tool helps orchestrate complex workflows that collect data from multiple sources, integrate it and then generate charts, graphs, tables and other reports. Many of the most common data science routines are easily applied as steps in these workflows. The goal is to provide a powerful data collection and storage platform that also produces good data science in the process. Is there anything that data science can’t do?
Questions about the limits of science have long been deep and often philosophical ones for scientists. The same questions about the power and precision of mathematical tools are important for anyone who wants to understand how businesses and other organizations function. The limits of statistical analysis and machine learning apply just as readily to data science work. In many cases, the problems aren’t with the mathematics or the algorithms. Simply gathering good-quality data is a challenge, and analysis can’t really be trusted until data scientists ensure that their data is reliable and consistent. VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings. The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat. All rights reserved. "
3,616
2,022
"Top 5 data stories of 2022: Teradata vs Snowflake, SQLite, privacy in Metaverse | VentureBeat"
"https://venturebeat.com/data-infrastructure/top-5-data-stories-of-2022"
"Top 5 data stories of 2022: Teradata vs Snowflake, SQLite, privacy in Metaverse 2022 was an eventful year for the data industry. As global enterprises became laser-focused on making the most of their data assets, companies providing the infrastructure for the data stack took crucial steps to build out their offerings. Snowflake and Databricks debuted multiple industry-specific offerings as well as whole new sets of features, while Teradata made a move to take on the two. Questions about privacy in the metaverse also drew attention. Here are VentureBeat’s top five data stories of 2022: Teradata takes on Snowflake and Databricks with cloud-native platform In August, Teradata debuted two new offerings – VantageCloud Lake and ClearScape Analytics – to better take on competitors Snowflake and Databricks.
The VantageCloud Lake, as the company explained, extends its Vantage data lake to a more elastic cloud model, while ClearScape helps enterprises take advantage of new analytics, machine learning and artificial intelligence (AI) development workloads in the cloud. The combination of the two promises to streamline data science workflows, support ModelOps and improve reuse from within a single platform. The offering also draws on Teradata’s R&D in smart scaling, allowing users to scale based on actual resource utilization rather than simple static metrics. It also promises a lower total cost of ownership and direct support for more kinds of analytics processing. “Snowflake and Databricks are no longer the only answer for smaller data and analytics workloads, especially in larger organizations where shadow systems are a significant and growing issue, and scale may play into workloads management concerns,” Hillary Ashton, chief product officer at Teradata, said. Why SQLite may become foundational for digital progress Open-source database engine SQLite made headlines as multiple companies, including Cloudflare and Fly, announced they are building projects around it. The engine is 20 years old and written in plain old C, but is seen as a foundation for digital progress. According to developers, adding SQLite helps them provide more sophisticated applications. Data can be stored locally on an edge node and then eventually replicated throughout the world. Plus, its basic, single-threaded design often comes in handy in small projects. “It’s really nice during development,” said Kent Dodds, a developer who frequently deploys SQLite in projects. “[There’s] no need to get a database server up and running (or a docker container). It’s just a file.
You can even send the database file to a coworker if you need some help with something.” That said, even as adoption increases, there are still plenty of wrinkles around SQLite (like the lack of supporting tooling) that will need to be addressed. Metaverse vs. data privacy: A clash of the titans? The metaverse (as we understand it now) is the next phase of the internet, but what happens to data privacy when it comes to the fore? It may well be another “clash of the titans.” The metaverse wants to harvest new, uncharted personal information, even to the point of noting and analyzing where your eyes go on a screen and how long you gaze at certain products. Data privacy, on the other hand, wants to protect consumers from this incessant cherry-picking. One can bet that in the new online economy of the future, plenty of new startups will be lining up on both sides. 22 open-source datasets to boost AI modeling While data is the new oil, information gathered by a company alone may not always offer the variety needed for an AI/ML project. To address this, individuals, companies and governments share open-source datasets that are free to use and build upon. Their reasons for sharing the data may differ from case to case, but the information might be what your project needs. VB’s list of top open-source datasets includes OpenStreetMap, Kaggle, U.S. census, Data.gov, Data.Europa.Eu, Data.Gov.UK, PLOS and Open Science. Snowflake’s industry-focused data offerings Finally, Snowflake announced industry-specific versions of its data cloud to serve the retail , healthcare and life sciences industries. The move came in response to Databricks’ industry offerings and with the backing of various technology, data, application and consulting partners, including Equifax, Dataiku, H2O.ai, Cognizant, Deloitte and Strata.
"
3,617
2,022
"Stablecoins could be the future of ecommerce digital payments | VentureBeat"
"https://venturebeat.com/data-infrastructure/stablecoins-could-be-the-future-of-ecommerce-digital-payments"
"Stablecoins could be the future of ecommerce digital payments Over the past decade, payment companies and ecommerce have revolutionized how individuals shop online. I do nearly all my shopping online, along with 263 million other Americans. For me, it’s the most convenient way to shop and sort through what I’m looking for without spending hours combing through racks of products. While the experience of browsing online generally works well, when it comes time to check out, there are many opportunities for technology to further enhance the experience for both consumers and merchants by integrating blockchain and digital asset payments.
Dollar digital assets: The building blocks for programmable money Blockchain is a digital, decentralized, immutable record of transactions that utilizes smart-contract technology to create seamless, secure and automatic transactions directly between buyers and sellers. This technology enables the use of cryptocurrency , a digital asset and medium of exchange that uses cryptography for secure transactions. Some luxury brands, including Gucci , Off-White and Balenciaga , are already experimenting with digital assets as a form of payment. However, digital assets like Bitcoin and Ethereum are subject to price volatility, making it risky for buyers to use them for payment, given the change in value over time. It can be difficult to cut through the hype around these companies adopting crypto payments. When it comes down to it, the transformative payments formula lies within dollar digital assets, or stablecoins. Dollar digital assets provide the building blocks for programmable money, enabling a world where value transfer looks like the internet, with the global, free exchange of value. Trusted stablecoins are essentially a digital version of a dollar — fully reserved by physical U.S. dollars in bank accounts, redeemable 1:1 for cash and available for use on blockchains. Dollar digital assets are in a unique position to power the revolution of digital asset payments, with the high-speed, low-cost, always-on perks but without the price volatility. In a not-too-distant future, stablecoins could become the primary means for both online and in-person transactions. According to a June 2022 report from Deloitte , nearly 75% of surveyed merchants plan to accept digital asset payments within the next two years, and 83% expect consumer interest in digital assets to increase over the next year.
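As a toy model of the “fully reserved, redeemable 1:1” property described above (entirely invented for illustration, not any issuer’s actual system), the core idea is an invariant that minting and redeeming must both preserve:

```python
class ToyStablecoin:
    """Toy model of a fully reserved dollar token; illustrative only."""

    def __init__(self):
        self.reserves_usd = 0.0  # dollars held in bank accounts
        self.supply = 0.0        # tokens in circulation

    def mint(self, dollars: float) -> None:
        """A customer deposits dollars; an equal number of tokens is issued."""
        self.reserves_usd += dollars
        self.supply += dollars
        assert self.reserves_usd == self.supply  # 1:1 invariant holds

    def redeem(self, tokens: float) -> None:
        """Tokens are burned and the same number of dollars is paid out."""
        assert tokens <= self.supply, "cannot redeem more than circulates"
        self.supply -= tokens
        self.reserves_usd -= tokens
        assert self.reserves_usd == self.supply  # 1:1 invariant holds

coin = ToyStablecoin()
coin.mint(100.0)
coin.redeem(40.0)
print(coin.supply, coin.reserves_usd)  # 60.0 60.0
```

Real issuers layer attestation, custody and on-chain token contracts on top of this idea, but the invariant (every token backed by a reserved dollar) is what distinguishes a trusted stablecoin from a volatile asset.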
Shopping with stablecoins The benefits are clear for merchants and consumers alike. For merchants big and small, accepting stablecoins from customers cuts transaction costs, eliminates middlemen and onboards new customers, who only need funds in a digital wallet to make purchases. Settling merchant payouts with stablecoins significantly improves payment flow, making it faster and cheaper. Like email communication, transacting with dollar digital assets is “always on,” so settlements and payouts can happen on weekends and holidays without any delays. For consumers, typing your credit or debit card number into a website, or risking your privacy by saving it there, will be an online shopping inconvenience of the past. With digital assets, users seamlessly connect their wallets to their browsers and can pay instantly with the available funds. Integrating digital asset payment options will make purchases a seamless extension of the online shopping experience. Even though I’m an avid online shopper, I’m not the best at it, so what excites me most is the near-instant speed of transactions, which will cut down on processing times for my returns. While the adoption of stablecoins has rapidly increased within the crypto ecosystem for decentralized finance and capital markets, we’re still in the early stages of using dollar digital assets for everyday purchases. Hurdles to overcome before we see widespread merchant adoption include the need for clear regulation, increased awareness of the benefits of using stablecoins for payments, and most importantly, a more intuitive user experience. As with all disruptive technologies, this will take time. Gradually, we will see more online storefronts integrate digital wallet options so customers can pay with stablecoins, leading us to a point where the experience is so seamless that we won’t think of it as crypto but as internet-native shopping.
Rachel Busch is communications manager at Circle , a global finance technology firm and operator of stablecoins USD Coin (USDC) and Euro Coin (EUROC). "
3,618
2,022
"Report: Data silos cause employees to lose 12 hours a week chasing data | VentureBeat"
"https://venturebeat.com/data-infrastructure/report-data-silos-cause-employees-to-lose-12-hours-a-week-chasing-data"
"Report: Data silos cause employees to lose 12 hours a week chasing data Airtable’s new commissioned study by Forrester Consulting reveals that business processes at large organizations are more fractured than leaders think — leading to poor decision-making, more errors and weaker team morale and revenue. Amid our increasingly digital world, the proliferation of enterprise software apps and tools is leaving employees overwhelmed. Airtable and Forrester’s Crisis of a Fractured Organization survey reveals that large organizations today use 367 software apps and systems on average to manage their various workflows. These varying tools create data silos that hide critical data, encourage multiple sources of truth, and make it harder for teams to find the information they need. With all the moving workstreams, teams are feeling out of sync with each other.
A majority (79%) of knowledge workers reported that teams throughout their organizations are siloed, and 68% said their work is negatively impacted because they don’t have visibility into cross-functional projects. Critical insights hidden Teams are making poor or slow decisions based on the limited information in front of them, unaware of critical insights hiding in different tools. Nearly half of respondents (46%) said poor business processes result in decisions taking longer and a higher risk of making wrong decisions. These fractures hurt revenue and leave employees frustrated and disengaged. The survey uncovered that the fractures within organizations negatively affect employees’ work. Knowledge workers reported spending nearly 29% of their week (11.6 hrs) searching for the key information they need to do their work. This struggle to find the right information is the No. 1 reason employees say they’re feeling disengaged. However, organizations understand the critical need for a future-fit strategy that fosters greater connectivity across data and teams. The majority (93%) of survey respondents said increasing process efficiency in their organization is a high priority; similarly, 90% noted improved collaboration as a top priority. By implementing connected tools to increase cross-functional collaboration and organizational alignment, respondents expect to save nearly 12 hours each week and to devote that time to their core work. Airtable’s commissioned study by Forrester surveyed 1,022 individual contributors and decision-makers up to the C-suite in the U.S. and the UK. The study began in August 2022 and concluded in September. Read the full report by Airtable and Forrester.
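A quick sanity check on the report’s headline figures: the “nearly 29% of their week (11.6 hrs)” statistic implies the survey assumed a standard 40-hour work week, an assumption the report summary leaves implicit.

```python
# "11.6 hrs" and "29% of their week" are consistent with a 40-hour week.
hours_lost = 11.6
work_week = 40
share = hours_lost / work_week
print(f"{share:.0%} of the week")  # 29% of the week
```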
"
3,619
2,022
"Report: Cloud revenue growth far outpaces on-premises | VentureBeat"
"https://venturebeat.com/data-infrastructure/report-cloud-revenue-growth-far-outpaces-on-premises"
"Report: Cloud revenue growth far outpaces on-premises Battery Ventures’ State of the OpenCloud 2022 report examines the landscape of cloud infrastructure and open-source software and offers operational best practices for cloud companies. Cloud giants like Amazon Web Services (AWS), Microsoft Azure and Google Cloud have demonstrated significant growth durability — and profitability — at massive scale, despite the current macroeconomic environment. Billion-dollar businesses like Databricks (a Battery portfolio company) and Snowflake are built on top of these massive cloud vendors, and there is significant room to grow, as digital transformation accelerates across industries and more workloads shift to the cloud.
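The growth figures that follow are compound annual growth rates (CAGR); for reference, CAGR over n years is (end/start)^(1/n) - 1. A short sketch with hypothetical revenue figures (not taken from the report):

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between a starting and ending value."""
    return (end / start) ** (1 / years) - 1

# Hypothetical example: revenue growing from $10M to $88M over the four
# years from 2018 to 2022 works out to a CAGR of roughly 72%.
print(f"{cagr(10, 88, 4):.0%}")  # 72%
```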
Heavy cloud growth Across four major infrastructure-software companies (MongoDB, Confluent, Elastic and Splunk), cloud revenue CAGR was at least 72% and as much as 200% from 2018 to 2022. On-premises CAGR, by contrast, ranged from 4% to 55%. Overall, Battery estimates cloud spend currently represents about 25% of the $919 billion overall enterprise IT market spend in 2022, a proportion that will no doubt increase in the months and years to come given the potential for market disruption. Amid increasing pressure to improve productivity and efficiency and a high-inflation environment, Battery predicts we will see a step-change increase in cloud adoption across every industry and economy. “Many cloud companies are demonstrating robust growth and even profitability as the broader IT market remains relatively under-penetrated by cloud; there’s still plenty of room to grow,” said Danel Dayan, Battery vice president and one of the report’s authors. “While it remains to be seen what will happen in the next few months in the technology industry, given the market downturn, we are confident that the cloud, and open-source software, will soon transform industries in ways we have never seen before,” Dayan added. Methodology The Battery Ventures 2022 OpenCloud report relies upon publicly available data from Gartner, Pitchbook, the U.S. Bureau of Labor Statistics, CapIQ and company earnings reports. (Battery’s investments and exits are listed here.) Read the full report from Battery Ventures.
"
3,620
2,022
"Multicloud isn’t working: Bring on the supercloud! | VentureBeat"
"https://venturebeat.com/data-infrastructure/multicloud-isnt-working-bring-on-the-supercloud"
"Multicloud isn’t working: Bring on the supercloud! Say the word “multicloud” to CIOs or other tech leaders, and you’re almost guaranteed to get an eyeroll. The prospect of simultaneously maintaining on-premises data centers, migrating to one public cloud, and then redundantly staffing for two or more additional clouds on top of that is a recipe for frustration and disappointment. There are good reasons to want multicloud — avoiding vendor lock-in, access to best-of-breed cloud services, and the flexibility to integrate with business partners (among others). But despite these potential benefits, most businesses will not achieve a viable multicloud strategy anytime soon, for the simple reason that it’s just too costly and complex. If ever there were a need for an innovative alternative from the market, now is the time. Enter supercloud.
Supercloud is what comes after public cloud: It’s a cloud-spanning value-add layer that hides complexity, exposes differentiated service capabilities, and adds new business (and business network) leverage beyond the raw “Lego bricks” of existing public and private clouds. Supercloud represents the natural democratization of the current “wall of complexity” facing both developers and businesses. As a technical or business leader, how can you make the most of the rise of this new supercloud category of vendors? What best practices can be followed today, when supercloud is still an emerging concept? Here are some basic guidelines for thinking about cloud investments during the era when both cloud-specific infrastructure and supercloud vendors will coexist as critical elements of an IT portfolio. Select supercloud vendors who already “get it” The simplest and most highly leveraged way to incorporate change is to select a vendor who can successfully encapsulate it. Unlike hiring a systems integrator (SI) for public cloud migration, this strategy aligns well with supercloud’s emergence because multicloud is best delivered as a product feature. Moreover, the SaaS delivery and packaging approach of supercloud means that the majority of improvements made by vendors will transparently flow through to their users without the expensive and challenging upgrades of the past. In other words, as supercloud-enabled products get better, so do the businesses that use them. Avoid “accidental multicloud” Multicloud strategies and investments are among the most costly and complex in a CIO’s portfolio. Leaving them to chance (or differing internal opinion) is perilous, as it can result in costly migrations down the road.
Foundational elements, like how analytic and operational data are handled, should be viewed through multiple lenses, including how to avoid redundant spending and staffing. Early and intentional supercloud vendor selection can focus those decision processes, reducing complexity and spending over time. Design and invest for network effects, not for individual applications One of the most important emerging best practices is to shift focus from the micro (each application owned or operated by IT) to the macro (the value of automating and digitizing business networks and their associated workflows). This new era of inherently distributed and decentralized applications demands even more from the “IT plumbing” on which it runs. It also means that decisions about multicloud and cross-vendor technology capabilities have become a mission-critical, CIO-level decision. Supercloud may be another tidal wave for technology managers to surf, but it’s one that’s coming our way. Businesses that ride it successfully will lower their cloud costs, improve time to market, and reap competitive advantages in this next wave of cloud adoption. Tim Wagner is cofounder and CEO of Vendia 
© 2023 VentureBeat. All rights reserved. "
3,621
2,022
"Korean Born2Global is leading a new era of global joint venture partnerships | VentureBeat"
"https://venturebeat.com/business/korean-born2global-is-leading-a-new-era-of-global-joint-venture-partnerships"
"Sponsored Korean Born2Global is leading a new era of global joint venture partnerships Presented by Born2Global Even before the global economic shocks of COVID-19, the global investment landscape for startups was experiencing a groundbreaking shift. 2019 saw more investment from major players like SoftBank and Sequoia Capital into rising markets including China, Southeast Asia and Latin America. Investors are constantly in search of the next major market disruptor, and while global investment trends change in response to market conditions, support for startups to grow and expand remains robust. Korea is no exception, with government investment opportunities for startups continuing to expand and diversify. The Korean government allocated $1.4 billion USD in 2021 for startup support projects, fostering the development and management of more than 194 programs for SMEs and startups. 
Thanks to a robust policy environment supporting the development of emerging Digital Transformation technologies, Korea’s Ministry of Science and ICT (MSIT) is also implementing startup support budgetary programs. As part of the shift towards globalizing Korea’s startups, the MSIT introduced a new program to support deep tech-based companies for their global market entry in the form of binational joint ventures (JV) in 2021. Technology joint venture partnerships — creating a win-win global innovation ecosystem Why take Korea’s tech startups overseas through binational JV partnerships? Even as the recent COVID crisis exposed gaps in the global supply chain, it highlighted the delicate interdependence of global markets and emphasized the importance of international cooperation. JV partnerships have been gaining attention and popularity in recognition of the robust benefits that they can offer to both businesses and local economies. Joint ventures can provide innovative startups with additional resources, expanded capacity, enhanced technical expertise and access to established markets and distribution channels that they would have a difficult time breaking into otherwise. Korean companies that have succeeded in JV overseas partnerships have been able to enter markets where they need technology or products with limited resources. The potential for economic benefits resulting from technology transfer and local business growth have led to growing interest and collaboration between governments, institutions and ecosystem stakeholders in these JV-supported projects. In other words, the JV overseas startup expansion model fosters a win-win sustainable technology innovation ecosystem. To foster a virtuous startup ecosystem and fill market gaps both in Korea and abroad, the Korean government has continued to expand its support for JV partnerships between Korean and overseas startups. 
Opportunities for investment, partnership and expansion into the Latin America and Caribbean (LAC) region for Korean startups have recently attracted particular attention thanks to the success of a joint program between the Inter-American Development Bank (IDB) Lab and Korea’s Born2Global Centre to create and support binational JV startup partnerships. The success of the JV overseas startup expansion model between Korean companies and startups in this region over the past several years was so notable that it was cited when JV overseas startup expansion was specifically named in the Korean government’s 2021 economic policy and incorporated into the policies of the latter half of that year. While the KOR-LAC JV partnership prototypes faced challenges including time, language and culture differences, the five pairs that were selected have thus far enjoyed impressive success. For example, the Korean startup Coconut Silo, a logistic freight transportation company aimed at connecting drivers directly with suppliers using AI, partnered with Avancargo, an Argentinian B2B platform for freight logistics. The two companies integrated their APIs and technology to expand quickly, and anticipate that their innovative algorithm will help reduce greenhouse gas emissions from cargo shipping. Overall, the success of the LAC Deep Tech Exchange Program thus far has highlighted not only the possibilities of successful startup overseas expansion through JVs, but also the essential role played by the companies that perform matching and support services for these startups. As a Korean company, Born2Global Centre has played a pivotal role in making this innovative program possible. So, what is Born2Global Centre, and why has it become the primary entity to assist Korean tech startups with overseas JV matchmaking? 
Born2Global Centre makes jumping into a JV overseas expansion partnership easy and seamless Founded in 2013, Born2Global Centre focuses on facilitating deep-tech partnerships between Korean startups and overseas partners and has a long history of mediating successful JV matches, especially in D.N.A (Data, Network, AI) technologies and relevant industrial domains. In addition to its valuable contributions to the LAC Deep Tech Exchange Program, Born2Global has facilitated successful JVs between companies from Korea and companies from China, the United States, Vietnam, Turkey, the United Kingdom, Bolivia and more. The Centre offers an annual, year-long program for over 200 startups with several sources of support. Each startup that participates in the binational JV support program receives services (business management consulting, PMF consulting, etc.), a maximum of $80,000 USD in funding, and assistance finding and verifying appropriate overseas partners, as well as help navigating laws, accounting and taxation issues, applying for intellectual property protection, and creating a local marketing strategy. Established JVs can receive help with local settlement and growth, including support in the areas provided for nascent startups as well as business development and investment attraction support. In other words, working with Born2Global also offers startups access to essential resources once they connect with an overseas partner for a JV. Since its establishment, Born2Global has helped more than 2,962 startups. It has successfully facilitated the overseas incorporation of 112 startups, filed 996 global patent and trademark applications and attracted over 2.98 billion dollars of domestic and global investment for member startups. Born2Global has ushered two of its members, Sendbird and Riiid, along the path to unicorn status. It is currently nurturing another ten potential unicorns, including the Korean companies GreenLabs , Dable , Qraft and Ssenstone. 
Born2Global’s proven track record of success makes it an ideal partner for startups seeking a JV partnership with an overseas company. Its support model has been successful time and time again. Born2Global is always open to foreign companies interested in partnering with innovative Korean startups as well as public-private partners who want to increase the quality and number of local tech businesses in their community. If you or your organization is interested in pursuing an exciting new partnership, contact Born2Global Centre today to learn more about current and future opportunities. "
3,622
2,022
"CORRECTING and REPLACING Former Apple CEO John Sculley Bets on eternalHealth, a Medicare Health Plan in Boston, and Joins as the Chairman of the Board | VentureBeat"
"https://venturebeat.com/business/former-apple-ceo-john-sculley-bets-on-eternalhealth-a-medicare-health-plan-in-boston-and-joins-as-the-chairman-of-the-board"
"Press Release CORRECTING and REPLACING Former Apple CEO John Sculley Bets on eternalHealth, a Medicare Health Plan in Boston, and Joins as the Chairman of the Board BOSTON–(BUSINESS WIRE)–December 30, 2022– Please replace the release dated December 29, 2022 with the following corrected version due to multiple revisions to the second paragraph. The release reads: FORMER APPLE CEO JOHN SCULLEY BETS ON ETERNALHEALTH, A MEDICARE HEALTH PLAN IN BOSTON, AND JOINS AS THE CHAIRMAN OF THE BOARD eternalHealth, the first Medicare Advantage Health Plan to get licensed in Massachusetts in nearly a decade, is looking to disrupt the health insurance industry by providing high quality, affordable care to Beneficiaries in Massachusetts and eventually beyond. The company also announced that former Apple CEO John Sculley has joined the firm as Chairman of its Board of Directors. 
“Led by eternalHealth Founder and CEO Pooja Ika, a 25-year-old Babson graduate, the company has a simple mission of leveraging technology to reduce health plan operating costs, allowing more dollars to be allocated towards patient care and benefits while embracing value-based care. This in turn will make healthcare more accountable, accessible, and comprehensive,” said Mr. Sculley, who as of December 1st joined as Chairman of the eternalHealth board. “I reinvented myself in healthcare after my time at Apple two decades ago, by investing and mentoring dozens of entrepreneurs within the healthcare industry; working with software, biotech, analytics, and services companies,” said Mr. Sculley. “When Pooja shared the vision behind eternalHealth and articulated that her beliefs were similar to mine, I knew that this was a company I was excited to be a part of. I look forward to working with Pooja, management, and a knowledgeable board to help grow the company.” Mr. Sculley said that Americans spend $4 trillion on healthcare and a trillion of those dollars are wasted every year. “All Payers (Commercial payers, State MEDICAID, and MEDICARE) pretty much manage the $4 trillion healthcare spend by deciding how care is delivered, measured, and paid for,” said Mr. Sculley. “I believe if payers deploy digital workplace automation for most of the transactions across their organization, they can substantially reduce administrative costs and leave more dollars for the total cost of care, allocating more towards providing quality care through a high-touch model to assist patients and physician partners. 
Secondly, there is so much potential to take value-based care to the next level through platform-driven intelligence into physicians’ and hospitals’ existing workflows and empowering every care touch point with a 360-degree view of the patient’s care history, creating opportunities to improve quality and reduce costs.” eternalHealth strives to provide Medicare Beneficiaries with high-quality, affordable benefits that are comprehensive. By leveraging nirvanaHealth’s cloud-based technology platform, one that embraces artificial intelligence and robotic process automation, the organization can eliminate the cost and need to work with multiple different vendors and is able to automate close to 2,000-3,000 different transactional functions to operate as a compliant payer. These efficiencies allow eternalHealth to focus on the things that truly matter, such as working on providing enriched benefits at a lower cost to its members and removing the mistrust that exists between providers and payers by allocating more dollars towards the total cost of care and provider incentives. Additionally, the company will place a large emphasis on the member experience by rolling out various member-focused initiatives to increase member satisfaction, retention, and help build a sense of community and belonging. Ika states, “I could not be more honored to have someone like John chair our Board. He is a great addition to an already incredibly knowledgeable and strong board we have been fortunate to build. I am so excited to be able to tap into John’s decades of experience in consumerism. Medicare Marketing is extremely consumer focused and I look forward to bringing a fresh, new perspective to health insurance, one that is focused on John and I’s shared vision of bringing luxury to healthcare, without the luxury costs.” About eternalHealth: Headquartered in Boston, eternalHealth provides high-quality care with low out-of-pocket costs to the residents of Massachusetts. 
eternalHealth is committed to placing the member at the forefront of every decision they make as an organization. Women-run, built, and owned, eternalHealth is a Medicare Advantage health plan that offers HMO and PPO products in three Massachusetts counties: Worcester, Middlesex, and Suffolk. For more information about eternalHealth’s plans and services, please visit www.eternalHealth.com. View source version on businesswire.com: https://www.businesswire.com/news/home/20221229005188/en/ Upendra Mishra The Mishra Group [email protected] Tel: 617-312-4395 "
3,623
2,022
"What 10 top AI stories in 2022 reveal about 2023 | VentureBeat"
"https://venturebeat.com/ai/what-10-top-ai-stories-in-2022-reveal-about-2023"
"What 10 top AI stories in 2022 reveal about 2023 As we look back at VentureBeat’s top AI stories of the year, it’s clear that the industry’s advances — including, notably, in generative AI — are vast and powerful, but only the beginning of what is to come. For example, OpenAI, the artificial intelligence research lab behind AI tools that exploded this year, including DALL-E 2 and ChatGPT, debuted buzzed-about advancements that drew attention from the general public as well as the tech industry. DALL-E’s text-to-image generation and ChatGPT’s new capabilities to produce high-quality, long-form content made creatives question whether they will soon be out of a job — and who owns the content these tools are creating anyway? Meanwhile, the next iteration of advancements may not be far off for OpenAI. 
This fall, Ray, the machine learning technology behind OpenAI’s large-scale operations, debuted its next milestone: Ray 2.0. The update will operate as a runtime layer and is designed to simplify the building and management of large AI workloads, which will allow companies like OpenAI to make even greater strides in 2023. Though generative AI led much of this year’s trending coverage, it wasn’t the only area of AI where waves were made that had a ripple effect. Intel unveiled what it claims is the first real-time deepfake detector, which works by analyzing subtle “blood flow” in videos and produces results in milliseconds that are 96% accurate. It’s a tool that may become increasingly useful to maintain integrity as generative AI video and image capabilities become even more realistic. And AI continued to seemingly “eat” the world as we know it: this year, everything from the most mundane technology use cases to the most complex algorithms was reoriented with AI-powered improvements. Google released a beta version of Simple ML for its Google Sheets tool to revamp the platform’s capabilities for calculations and graphing, while DeepMind unveiled its first AI to power faster matrix multiplication algorithms, which some say may be used to improve the entire computer science industry. Along with the strides made in AI this year, several companies are heading into 2023 with fewer AI employees due to layoffs as a result of the declining economy, including Meta. As part of its 11,000 layoffs, the technology and social media giant laid off an entire machine learning infrastructure team this fall — which came as a surprise, given the company said it plans to increase its focus on AI. While the future may be uncertain for some AI professionals in the short term, experts don’t anticipate that this will significantly impact AI’s progress in the long run. 
There have been arguments that AI has in some respects hit a wall, or slowed down to what one industry CEO referred to as a “Stone Age.” Others have fired back against claims like these, including renowned computer scientist and artificial neural networks pioneer Geoffrey Hinton, who told VentureBeat that the rapid progress we’re seeing in AI will continue to accelerate. Looking ahead, Andrew Ng, founder of Landing AI and DeepLearning AI, told VentureBeat that the next decade of progress in AI will revolve heavily around its generative AI capabilities and shift toward data-centric AI. “As we collectively make progress on this over the next few years, I think it will enable many more AI applications, and I’m very excited about that,” Ng said in a previous interview. Progress is certain to continue, but not without bumps in the road. As legislation around regulating AI continues to unfold, it will be important for organizations to hire executives — perhaps a chief AI officer — who are knowledgeable about its benefits, consequences and constantly evolving capabilities. Until then, progress, not perfection, is what to expect for 2023. Here’s more from our top 10 AI stories of 2022: Andrew Ng predicts the next 10 years in AI George Anandiotis wrote this March 21 story, an interview with Andrew Ng, founder of Landing AI and DeepLearning AI, co-chairman and co-founder of Coursera and adjunct professor at Stanford University. Ng told VentureBeat that much of the focus on AI throughout the last decade has been on big data. In decades to come, he predicts a shift toward data-centric AI. “Ten years ago, I underestimated the amount of work that would be needed to flesh out deep learning, and I think a lot of people today are underestimating the amount of work … that will be needed to flesh out data-centric AI to its full potential,” Ng said. 
“But as we collectively make progress on this over the next few years, I think it will enable many more AI applications, and I’m very excited about that.” Meta layoffs hit an entire ML research team focused on infrastructure Senior writer Sharon Goldman was up late at night scrolling through Twitter on November 9, the day Meta announced it was laying off 11,000 employees. In a public statement, Mark Zuckerberg had shared a message to Meta employees that signaled, to some, that those working in artificial intelligence (AI) and machine learning (ML) might be spared the brunt of the cuts. However, Thomas Ahle, a Meta research scientist who was laid off, tweeted that he and the entire research organization called Probability, which focused on applying machine learning across the infrastructure stack, were cut. The team had 50 members, not including managers, he said. OpenAI debuts ChatGPT and GPT-3.5 series as GPT-4 rumors fly As GPT-4 rumors continued to fly at NeurIPS 2022 on November 30, OpenAI managed to take over the news with ChatGPT, a new model in the GPT-3 family of AI-powered large language models (LLMs) that reportedly improves on its predecessors by handling more complex instructions and producing higher-quality, longer-form content. ChatGPT has been out for only a few weeks, but hasn’t stopped making news since its release. DeepMind unveils first AI to discover faster matrix multiplication algorithms It was considered one of the toughest mathematical puzzles to crack: Could AI create its own algorithms to speed up matrix multiplication, one of machine learning’s most fundamental tasks? In a paper published in Nature on October 5, research lab DeepMind unveiled AlphaTensor, the “first artificial intelligence system for discovering novel, efficient and provably correct algorithms.” The Google-owned lab said the research “sheds light” on a 50-year-old open question in mathematics about finding the fastest way to multiply two matrices. 
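The kind of saving AlphaTensor searches for can be illustrated with the best-known hand-crafted example: Strassen’s 1969 scheme, which multiplies two 2x2 matrices with seven scalar multiplications instead of the naive eight. The sketch below is a hypothetical pure-Python illustration of that classic scheme (the function name and example values are ours), not DeepMind’s code:

```python
def strassen_2x2(A, B):
    """Multiply two 2x2 matrices (lists of lists) using Strassen's
    scheme: 7 scalar multiplications instead of the naive 8."""
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    # The seven Strassen products.
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    # Recombine into the result using only additions/subtractions.
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

# Matches the naive definition on an example pair.
print(strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```

Applied recursively to block matrices, that single saved multiplication compounds, which is why shaving even one multiplication off a small scheme — as AlphaTensor did for some matrix sizes — matters at scale.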
AlphaTensor, according to a DeepMind blog post, builds upon AlphaZero, an agent that has shown superhuman performance in board games like chess and Go. This new work takes the AlphaZero journey further, moving from playing games to tackling unsolved mathematical problems. Google brings machine learning to online spreadsheets with Simple ML for Sheets On December 7, Sean Michael Kerner shared the news that Google was planning to bring machine learning to its Sheets tool. While simple calculations and graphs have long been part of the spreadsheet experience, machine learning (ML) has not. ML is often seen as being too complex to use, while spreadsheets are intended to be accessible to any type of user. Google announced a beta release of the Simple ML for Sheets add-on. Google Sheets has an extensible architecture that enables users to benefit from add-ons that extend the application’s default functionality. In this case, Google Sheets benefits from ML technology that Google first developed in the open-source TensorFlow project. With Simple ML for Sheets, users will not need to use a specific TensorFlow service, as Google has developed the service to be as easily accessible as possible. 10 years later, deep learning ‘revolution’ rages on, say AI pioneers Hinton, LeCun and Li When senior writer Sharon Goldman realized that September 2022 was the 10-year anniversary of key neural network research — known as AlexNet — that led to the deep learning revolution in 2012, she reached out to AI pioneer Geoffrey Hinton. With interviews with Hinton and other leading AI luminaries including Yann LeCun and Fei-Fei Li, this piece is a look back at a booming AI decade, as well as a deep dive into what’s ahead in AI. Will OpenAI’s DALL-E 2 kill creative careers? OpenAI’s expanded beta access to DALL-E 2, its powerful image-generating AI solution, sent the tech world buzzing with excitement in late July, but also left many with questions. 
For one thing, what does the commercial use of DALL-E’s AI-powered imagery mean for creative industries and workers? Will it replace them? According to OpenAI, the answer is no. DALL-E is a tool that “enhances and extends the creative process,” an OpenAI spokesperson told VentureBeat. Much as an artist would look at different artworks for inspiration, DALL-E can help an artist come up with creative concepts. Since this article was published, debate and criticism have continued about the ownership of images generated by AI. It certainly won’t end anytime soon. Intel unveils real-time deepfake detector, claims 96% accuracy rate On November 16, Intel introduced FakeCatcher, which it says is the first real-time detector of deepfakes — that is, synthetic media in which a person in an existing image or video is replaced with someone else’s likeness. Intel claims the product has a 96% accuracy rate and works by analyzing the subtle “blood flow” in video pixels to return results in milliseconds. With deepfake threats looming, this type of deepfake detection technology is becoming ever more important. The question is, does it really work? Who owns DALL-E images? Legal AI experts weigh in In another installment of what has become an ongoing text-to-image generator drama, senior writer Sharon Goldman explored the legal ramifications of tools like DALL-E 2. When OpenAI announced expanded beta access to DALL-E in July, the company offered paid subscription users full usage rights to reprint, sell and merchandise the images they create with the powerful text-to-image generator. A week later, creative professionals across industries were already buzzing with questions. Topping the list: Who owns images put out by DALL-E, or for that matter, other AI-powered text-to-image generators, such as Google’s Imagen? The owner of the AI that trains the model? Or the human who prompts the AI? 
Bradford Newman, who leads the machine learning and AI practice of global law firm Baker McKenzie, in its Palo Alto office, said the answer to the question “Who owns DALL-E images?” is far from clear. And, he emphasized, legal fallout is inevitable. Ray, the machine learning tech behind OpenAI, levels up to Ray 2.0 Sean Michael Kerner wrote this August 23 piece about the infrastructure that supports OpenAI: Ray. Over the last two years, one of the most common ways for organizations to scale and run increasingly large and complex artificial intelligence workloads has been with the open-source Ray framework, used by companies from OpenAI to Shopify and Instacart. Ray enables machine learning (ML) models to scale across hardware resources, and can also be used to support MLops workflows across different ML tools. The tool’s next major milestone debuted at the Ray Summit in San Francisco. Ray 2.0 extends the technology with the new Ray AI Runtime (AIR) that is intended to work as a runtime layer for executing ML services. "
3,624
2,022
"Top 5 stories of the week:  Hot IT skills, AIaaS levels the playing field, the enigma of healthcare AI and more | VentureBeat"
"https://venturebeat.com/ai/top-5-stories-of-the-week-hot-it-skills-aiaas-levels-the-playing-field-the-enigma-of-healthcare-ai-and-more"
"Top 5 stories of the week: Hot IT skills, AIaaS levels the playing field, the enigma of healthcare AI and more As 2022 winds down, some might say mercifully, VentureBeat readers are clearly thinking ahead. Whether you’re hiring to fill skills gaps or looking to find your next opportunity, you flocked to Drew Robb’s look at the hottest IT skills for 2023, which dominated the top 5 list, garnering twice as many visits as the other four top stories combined. Robb not only includes the skills that are in demand, but adds the certifications that verify those skills. Is there an artificial intelligence (AI) divide? That is, are only large enterprises positioned to take advantage of the insights and innovations offered by AI? Maybe. Once. 
But the emergence of AI-as-a-service (AIaaS) is making the technology accessible to smaller companies without requiring them to build their own systems from scratch. Looking ahead again to 2023, our third most-read story this week is Sharon Goldman’s look at 23 AI predictions for the coming year, ranging from “generative AI will transform enterprise applications” to “AI will empower more efficient devops.” Be sure to check out the other 21. Ashleigh Hollowell notes that 2022 was a big year for AI in healthcare, citing specific advances by GE Healthcare and Siemens. So why, she asks, do 50% of U.S. adults say they haven’t seen or experienced improvements in their own care as a result of medical AI advancements? Holding down the 5th spot is Ben Dickson’s look at why 2022 was an eye-opening year for AI and deep learning. Here are the top five stories for the week of December 26th. The hottest IT skills for 2023 – even in a recession For many of the hundreds of thousands who’ve been laid off by tech companies recently, this might well be considered, to borrow the words of Charles Dickens, the worst of times — especially with a recession looming. Yet there are plenty among them, and others in the workforce, who could consider this the best of times. Why? Because they possess the most in-demand skillsets and certifications. Despite the layoffs, cutbacks, tightening purse strings, and general doom and gloom presented in the media, these IT professionals can look forward to higher pay, plenty of offers, perpetual headhunting inquiries and even the occasional bidding war for their talents. AI-as-a-service makes artificial intelligence and data analytics more accessible and cost-effective AIaaS is becoming an ideal option for anyone who wants access to AI without needing to establish an ultra-expensive infrastructure for themselves.
With such a cost-effective solution available for anyone, it’s no surprise that AIaaS is starting to become a standard in most industries. An analysis by Research and Markets estimates that the global AIaaS market will grow by around $11.6 billion by 2024. 23 AI predictions for the enterprise in 2023 It’s that time of year again when AI leaders, consultants and vendors look at enterprise trends and make their predictions. After a whirlwind 2022, it’s no easy task this time around. You may not agree with every one of these — but in honor of 2023, these are 23 top AI and ML predictions experts think will be spot-on for the coming year. Healthcare AI is advancing rapidly, so why aren’t Americans noticing the progress? There’s no doubt that AI in healthcare had a very successful year. Back in October, the FDA added 178 AI-enabled devices to its list of 500+ AI technologies that are approved for medical use. Topping the list for most approved devices were two massive players in the healthcare technology space: GE Healthcare, with 42 authorized AI devices, and Siemens, with 29. However, despite the leaps and bounds made in the field thanks to these two giants, a recent survey from medical intelligence company Bluesight found that regardless of actual advancements made, around 50% of U.S. adults say they have not seen or experienced improvements in their own care as a result of medical AI advancements. What we learned about AI and deep learning in 2022 It’s as good a time as any to discuss the implications of advances in AI. 2022 saw interesting progress in deep learning, especially in generative models. However, as the capabilities of deep learning models increase, so does the confusion surrounding them. On the one hand, advanced models such as ChatGPT and DALL-E are displaying fascinating results and the impression of thinking and reasoning.
On the other hand, they often make errors that prove they lack some of the basic elements of intelligence that humans have. VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings. The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat. All rights reserved. "
3,625
2,022
"How you can create a strong AI talent development strategy | VentureBeat"
"https://venturebeat.com/ai/how-you-can-create-a-strong-ai-talent-development-strategy"
"Guest How you can create a strong AI talent development strategy Artificial intelligence (AI) is revolutionizing our way of life by automating decisions, predicting outcomes, and optimizing processes. From our phones to shopping, medication, banking and manufacturing, AI is everywhere. However, there is growing concern that advances in AI are being slowed down by a shortage of trained talent that’s needed to scale AI solutions across organizations. This talent shortage is slated to cause a massive imbalance in AI adoption and its scalability across the enterprise. But what is causing this shortage of talent? Is there really a shortage, or is the problem our inability to utilize talent effectively? There is much discussion across forums about the right enablement and talent strategy for AI.
But the underlying problem is not the lack of skills but the lack of the right individuals connecting with the right opportunities. There are many extraordinary people in the market who would be perfect fits for a career in AI, but the industry simply is not doing enough to provide the right platform to launch their careers. That’s because there are no best practices and standards developed for the next generation of deep learning and AI skills, and adoption at most organizations is still nascent. Even several entrenched players do not have a strong talent development strategy in place for nurturing their existing AI/ML talent. An AI talent development strategy The solution lies in creating a strong talent development strategy along with the right platforms and frameworks for talent to be cultivated, by: Identifying those best fit for enablement programs: From backgrounds like mathematics, statistics, computer science and economics, we can get a talent pool that is already acclimatized to structural problem-solving. Similarly, there are people with experience as data engineers, data scientists and machine learning (ML) experts who can be coached and mentored into AI roles with very little transition time. A proper filtration mechanism that selects candidates with the right aptitude and learning potential is key to solving the skills gap problem. Enabling career transitions: Apart from identifying the most suitable talent, there must be well-designed enablement programs to equip talent with the right skill sets. These enablement programs can take the form of bridge programs of short duration, or fully comprehensive training of six to eight months. Apart from that, creating customized growth plans that take aspirants closer to their desired career profile step by step will be another vital ingredient for the transition process.
Building robust best-in-class in-house learning platforms: Developing learning platforms for upskilling and reskilling in niche areas is vital. These need to be learner-friendly and provide engaging content and a wide variety of resources to enrich the talent pool. These portals can be monitored through analytics. Personalized guidance can be offered to users for better engagement and better learning outcomes. Nurturing partnerships with startups and MOOC platforms: Companies need to invest in partnerships and training for employees with open-source experience and startups specializing in various AI domains. Through partnerships, two-way knowledge transfer is initiated, with mutual enrichment of talent a natural outcome. Nurturing partnerships with universities and think tanks: Collaboration with academia, universities and research organizations, AI consortiums and think tanks brings access to state-of-the-art training materials and research. Academia can also leverage industry feedback to tailor their courses to specific business needs. Initiating mentoring programs from experienced AI professionals: Engaging experienced professionals who can provide the much-needed support and knowledge to train the rest of the team is vital for disseminating added skills and technical know-how. Equipping and designating trainers from within the team will accelerate learning and foster a learning culture within the team. Creating incentives: Focus on creating a proper incentive structure to nudge employees toward continuous upskilling. Sponsoring temporary gig projects and job rotation: Creating a support system for employees to work on side projects and hobby projects within the framework of their organization, as well as rotating job roles at proper intervals, is another strategy that can help bolster skills and provide a better platform for talent development.
Instituting hackathons and ideathons: Hackathons are one of the best ways to get the talent pool hooked on cutting-edge technologies and to give them valuable knowledge. Employees participating in AI hackathons for knowledge-building can see what AI is all about and may become intrigued and want to get more involved. Creating a steady pipeline of entry-level talent: There are very few entry-level positions available in AI, which makes it hard to develop fresh talent. Many times, the recruitment process is not customized to identify potential candidates who could be trained easily, as hiring managers are not experienced in sourcing such candidates. This causes deficiencies in building up a steady talent pipeline. Creating learning opportunities: Encouraging employees to contribute to technical white papers on AI topics, participating in knowledge sharing across various AI journals, participating in roundtables and working with industry analysts are some of the other avenues to create learning opportunities. Top skill sets most suited for transitioning to AI roles Reskilling and upskilling will ensure adequate scaling of enterprise AI by leveraging transferable skills. Today, the top transferable skills for an AI career are linear algebra, probability, statistics, ML algorithms, data science, programming, AIOps, text analytics, image analytics and data mining. In general, mathematics plays an important role in AI, and specifically in ML. Skills in applied mathematics in the areas of linear algebra, probability theory and statistics, multivariate calculus, algorithms and optimization are particularly relevant. As ML works with huge amounts of data, data science competencies help in predictive analytics, data modeling, analytics and other aspects of AI.
There are also multiple programming languages to cater to the algorithms, libraries and frameworks in AI that cover text analytics, image analytics, deep learning and neural networks. Fixing the skills gap by focusing within the organization and bringing about internal transformation will take some patience and conscious effort. But this is an investment worth making, as creating a robust talent pool and pipeline will be one of the primary requirements for seizing the opportunities that the next generation of the AI revolution will provide. Balakrishna DR, popularly known as Bali, is the executive vice president and head of the AI and automation unit at Infosys. DataDecisionMakers Welcome to the VentureBeat community! DataDecisionMakers is where experts, including the technical people doing data work, can share data-related insights and innovation. If you want to read about cutting-edge ideas and up-to-date information, best practices, and the future of data and data tech, join us at DataDecisionMakers. You might even consider contributing an article of your own! Read More From DataDecisionMakers "
3,626
2,022
"Conversation is the ultimate user interface | VentureBeat"
"https://venturebeat.com/ai/conversation-is-the-ultimate-user-interface"
"Guest Conversation is the ultimate user interface We may be living in the golden age of information, but finding the right information is still a pain in the neck. To tackle this challenge, my team and I at Amazon Alexa are building what we believe is the next-generation user interface that will redefine how we interact with technology and find information. We spend hours every day hunched over phones and laptops. We open and close and reopen apps. We scroll. We type on tiny QWERTY keyboards. And we click through an endless sea of blue links every time we search the web. The Internet is indeed amazing. The user interface is not. We have accepted these conditions because, since the dawn of the digital age, this is all we have known. But these methods of interacting with the digital universe were developed in service of business models, not user experience.
They are designed to increase the amount of time you spend online, drive click-throughs, and maximize engagement time. But it’s unfair to make humans find information this way. And it’s time to move on. Conversation: The age-old interface The first step is changing the way we interact with the Internet. And fortunately, recent advances in AI are making an entirely new user interface possible. In fact, it’s the original interface, the one we’ve been using for nearly two million years. It’s called “conversation.” Not speech, mind you. We’ve already been using that for almost a decade, interacting with our phones and digital assistants like Alexa. I’m talking about actual, human-like conversation. The kind you might have with a friend over a beer, in which vague or poorly worded questions are understood. Conversations in which intent is inferred and answers to questions are summarized and personalized. When two people converse, they understand each other’s context and incorporate visual cues. Conversations can be concise and efficient. Or they can range across a variety of topics, change direction, and lead to serendipitous discovery. Humans do this without even thinking about it. But to teach a machine to do this requires significant advances in the science of AI. This is not just about natural language processing (NLP) capabilities, which are improving rapidly with every voice interaction (Alexa alone gets more than a billion requests every week from hundreds of millions of devices in more than 17 languages).
AI within milliseconds Rather, for a machine to learn the give-and-take nature of conversation requires a fundamental rethinking of our current system of information retrieval, including the ability to crawl billions of web pages in real time (web-scale neural information retrieval), to concisely summarize information from the enormity of the Web (automated summarization), and to recognize an end-user’s intent and recommend additional relevant content (using contextual recommendation engines). Conversational interfaces require these systems (and more) to work together seamlessly and instantly. For example, if you ask an AI assistant, “Where is the world’s oldest living tree?” it should be able to not only answer that question quickly and concisely but also understand that you are currently only an hour’s drive from said tree, and follow up with directions and recommendations on hiking trails in the area. Or if you’re watching the Dallas Cowboys on Thursday Night Football and vaguely ask, “Who just caught that pass?” it should be able to infer which game you’re watching, which team is on offense, who caught the pass and for how many yards. All within milliseconds. These are difficult, unprecedented problems. As such, Amazon has assembled a team of world-class AI scientists dedicated to solving them. We’re investing in these resources because we believe these capabilities represent the future of human-machine interaction. And we’re not the only ones. “These give-and-take interactions build relationships that will shape both the user and the system,” said Hae Won Park, a research scientist with MIT’s personal robots group. “Relational agents can disrupt domains like personal assistance, healthcare, aging, education, and more. We’re just beginning to realize the user benefits.” Moving toward “ambient intelligence” Indeed, conversational AI can benefit any company interested in changing the way their customers or employees interact with digital information.
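The three systems named above can be composed into a toy pipeline to make the architecture concrete: retrieve, then summarize, then recommend, all timed against a latency budget. Everything in this sketch is an illustrative placeholder rather than an Alexa API: keyword overlap stands in for neural retrieval, truncation stands in for automated summarization, and a hard-coded rule stands in for a contextual recommendation engine.

```python
import time

# Toy stand-ins for the three systems the article names. Function names
# and logic are illustrative placeholders, not real Alexa/AWS APIs.

def retrieve(query, corpus):
    """Keyword overlap as a stand-in for web-scale neural retrieval."""
    terms = set(query.lower().split())
    return [doc for doc in corpus if terms & set(doc.lower().split())]

def summarize(docs, max_words=20):
    """Truncation as a stand-in for automated summarization."""
    words = " ".join(docs).split()
    return " ".join(words[:max_words])

def recommend(query):
    """Hard-coded rule as a stand-in for contextual recommendation."""
    if "tree" in query.lower():
        return "directions and nearby hiking trails"
    return None

def answer(query, corpus):
    """Compose retrieval, summarization and recommendation, timing the run."""
    start = time.perf_counter()
    reply = summarize(retrieve(query, corpus))
    follow_up = recommend(query)
    latency_ms = (time.perf_counter() - start) * 1000
    return reply, follow_up, latency_ms

corpus = [
    "The world's oldest living tree is Methuselah, a bristlecone pine in California.",
]
reply, follow_up, ms = answer("Where is the world's oldest living tree?", corpus)
```

The point of the sketch is the composition and the latency measurement, not the components: in a real deployment each stub would be replaced by the neural systems the article describes, while the three-stage hand-off and the millisecond budget remain.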
And like so many of the AI advances first developed in service of Alexa — like Amazon Lex and Amazon Polly — we fully expect to make these capabilities available to any company, in any industry, through the AI services available on AWS. The end goal is to shift the burden of retrieving and distilling relevant information from humans to AIs. And by embedding this conversational capability into the spaces we live and work — our kitchens, cars, and offices — we can reduce the amount of time we spend peering into phones and laptops. We call this concept “ambient intelligence,” in which AI is available everywhere around you, assists you when you need it, and even anticipates your needs, but fades into the background when you don’t need it. In other words, we can still benefit from the full awesomeness of the internet while spending far less time with it. As for the business models that depend on tiny screens, endless scrolling, and a sea of blue links? It’s time for them to adapt to us, not the other way around. Vishal Sharma is VP of Amazon Alexa AI Information. "
3,627
2,023
"CES 2023 preview: What to expect at the Las Vegas tech extravaganza | VentureBeat"
"https://venturebeat.com/ai/ces-2023-preview-what-to-expect-at-the-las-vegas-tech-extravaganza"
"CES 2023 preview: What to expect at the Las Vegas tech extravaganza Aska has a flying car. It’s an electric VTOL and car hybrid. Next week at the CES 2023 trade show in Las Vegas, we’ll once again be able to see and hear about the latest tech trends in person. Last year’s event also took place in person, but it was a shadow of its former self as the Omicron wave of COVID-19 took its toll. The event only drew about 45,000 last year, far below the 175,212 that showed up in January 2019. This year, they’re expecting around 100,000 people to show up, according to Gary Shapiro, CEO of the Consumer Technology Association, the trade group that stages the event. Last year’s event looked like a ghost town. At least it seemed that way in pictures, as I canceled my trip about a week before the show after everybody else canceled their appointments with me.
The same kind of cancellation tsunami hasn’t happened this year. This year, the show gets underway on Tuesday morning with some virtual sideline events from Nvidia and Acer. I think it’s safe to say we’ll see plenty of new chips and PC gaming rigs announced. The difference this year is that, because of the recession, we can probably expect to buy more of these products sooner than we were able to do so during the shortages of the past couple of years. A few thousand press people will start converging on Las Vegas on Tuesday and Wednesday for media events that officially begin with a talk on tech trends by Steve Koenig, vice president of research at the CTA, followed by the opening CES Unveiled press event. If it’s your first time to the show, check out my tips and tricks for CES. On Wednesday, big companies like LG, Panasonic, Samsung, TCL and Sony will have their press conferences where they show off the new products coming this year. By Thursday, the main show floors will open with nearly 3,000 exhibitors across more than 2.1 million square feet of space, Shapiro said. That’s down from 4,500 exhibitors and 2.9 million square feet in January 2020. But it’s still a sizable show and it runs through Sunday. And while CES is still a place for the giants like Samsung and Sony, the show will have 1,000 new exhibitors, and it will also have its traditional Eureka Park startup space in the Venetian. The big trends Last week, PR Newswire reported that there was a 340% increase in mentions in CES 2023 press releases, with more than 66 mentions in 2022 compared to 2021. NFT mentions were up 200% to 12, up from four. Sustainability mentions were up 63% to 49, up from 30. Robots were mentioned 49 times, up only 2% from 48 a year ago. The losers?
Augmented reality slipped 12% to 30 from 34, and wearables dropped 38% from 88 to 55. In our interview, Shapiro said we’ll see an “incredibly strong” health tech category. Shapiro expects we’ll see a lot of food tech, like from a company called Suvie, which cooks food while you’re away and is ready when you get home. Based on the pitches I’m getting, I think we’ll see a lot of tech related to AI, health wearables, energy-saving devices, the internet of things (IoT), sleep care, elder care, mental care, smart cars, robots and virtual reality (think Sony and HTC at the very least). AI AI is kind of an invisible product, but its presence will be visible everywhere at the show. Generative AI startups might be arriving on the scene too late to make a big presence at the show, but everyone from big companies to startups will be talking about how AI is fueling a new generation of products. Embodied, maker of the social interaction robot Moxie, uses interactions enabled by generative AI to help promote social, emotional and cognitive development in children ages five to 10. It is now shipping the robot that it has been showing for a couple of years. Square Off, a maker of smart chessboards with an adaptive AI, will be back to show off what it can do with its partnership with Miko, a consumer robotics company that makes AI robots for kids. There are a lot of people concerned about the encroachment of AI into our lives. Alice Little, an award-winning legal courtesan currently operating out of the Chicken Ranch brothel near Las Vegas, said in a statement that AI tech will threaten the livelihood of sex workers, who have come to depend on digital experiences such as OnlyFans during the pandemic. She believes these kinds of creator platforms could be significantly hurt by AI, particularly deepfake technology that creates fake images of real people. We’ll see this debate play out in industry after industry.
Smart green vehicles on the rise AI is also going to get used in the latest vehicles. More than a dozen automakers will be at the show, and they’ll be showing off electric vehicles, assisted driving features and more. John Deere, which has one of the keynote slots for its CEO John May, will show off AI-based tractors that can operate around the clock. Candela is manufacturing a new electric “flying” C-8 consumer boat. It is an electric vehicle that rides above the waves and is a sibling to Candela’s P-12 electric hydrofoil ferry. Cenntro Electric Group will unveil five new electric vehicles at CES, including a hydrogen-powered semi and the production version of its iChassis. And as you can see below, we’ll have our share of flying cars again. Aska will be showing off its first drive-and-fly electric vertical take-off and landing (eVTOL) aircraft/car hybrid. It is an electric vehicle that you can drive on the road like a car and fly in the air like a quadcopter. Is it real? Let’s hope so. The metaverse Wherever AI gets mentioned, the metaverse can’t be far behind. And where the metaverse gets mentioned, Web3 will come up as well. It’s still early and a bit rare to see actual metaverse experiences on display at the show. Source Digital and Sansar announced they will show off a “metaverse experience” on smart TVs, where you can engage in VR-like 2D or 3D experiences from the makers of Sansar, which spun out of Second Life’s Linden Lab and was sold in 2020 to Wookey Projects. While we may not see a ton of actual metaverse projects at the show, we will hear people talking about it, especially in the visions presented in keynote speeches and panels. In fact, I’m going to moderate a panel at CES on Friday, January 6 about how gaming will lead the metaverse. It will take place at 3 pm at LVCC North / N262 and feature speakers from Area15, Upland, Holoride and Tilt Five.
With all this talk about the metaverse, Shapiro isn’t so worried that demand for an in-person CES will fall off. “We’re very optimistic that CES is so important to people around the world — as buyers, investors, suppliers, partners, media, consultants, market analysts — that we expect a strong showing in Las Vegas,” Shapiro said. “But the digital platform will be an important supplement to many of the people going to Las Vegas who are just too busy to get to the conferences or connect with the exhibitors that they want to. That’s what the platform will be for, to serve those people.” As for blockchain, OpenCarbon is touting its first non-fungible token (NFT) marketplace for carbon credits. The OpenCarbon platform delivers SEC-compliant, carbon offset financial products to efficiently source, construct, manage and retire large-scale and complex carbon asset portfolios. The internet of things LG is partnering with Asleep to create smart home appliances that respond to human breathing and diagnose sleep stages to operate at an optimal level. So your air conditioning and heating — and other home appliances — can respond to how you’re sleeping at night. We’ll see plenty of other examples of the internet of things creeping deeper into our lives. Let’s hope the theme of Ubisoft’s Watch Dogs game — that these devices will be used by hackers and governments to spy on us — won’t come true. You can bet that the internet of things will infiltrate further into smart homes, smart buildings and smart whatevers. Artificial intelligence — buoyed by successes in computer vision, neural networks, and machine learning — will be pervasive within these IoT devices. VR and gaming gear Sony will take the stage at CES to share details about its PlayStation VR2 headset and the games coming for the device, which debuts on February 22. HTC is expected to make an announcement and a number of other VR headset makers are showing up at CES. And VR’s platform is always changing.
bHaptics’s new TactGlove can be worn like regular gloves. It’s compatible with commercial VR/AR headsets with camera-based hand tracking systems, including Meta Quest 2, and costs $300 per pair. The new haptic gloves have 12 linear resonant actuator (LRA) motors, which the company says allow delicate and sophisticated feedback at the fingertips and wrists. Holoride will be at the show again to display how you can play virtual reality games in the back seat of a car — without losing your lunch. On the gaming side, we’ll see LG’s 27-inch OLED gaming monitor and Alienware’s latest 18-inch gaming laptop. Nvidia will likely talk about the GeForce RTX 40-series graphics chips, and AMD will attempt to outdo Nvidia with its own GPUs. MSI will show its flagship Titan GT77 gaming laptop, which will feature a 4K/144Hz Mini LED display. You can bet there will be a ton more on display at the show. Advanced Micro Devices CEO Lisa Su will kick off the show at 6:30 p.m. on Wednesday with a keynote speech. So we’re likely to hear about the latest Ryzen and Radeon chips. As for TVs, some aficionados are hoping to hear more about QD-OLED as the future of cutting-edge TV tech, so it’ll be interesting to see if anyone has anything to add on that front. Health tech Valencell is unveiling the first calibration-free, cuffless fingertip blood pressure monitor. Companies like Omron Healthcare have been whittling away at this kind of problem for years. Dexcom has received clearance from the FDA for its glucose monitor Dexcom 7, which makes it much easier to measure blood sugar levels for people with diabetes. Withings, a pioneer of connected health, will unveil its first foray into urine analysis with a miniaturized health lab that will change the way people monitor their health from the comfort and privacy of their own bathroom. If this works, you may not have to pee in a cup at the doctor’s office or a lab so much.
While we urinate on average seven times a day, urine analyses are usually performed only once a year when requested by your health professional. Now Withings is making it possible to conduct health assessments from the comfort and privacy of your bathroom on a regular basis. And I’m expecting to see more solutions than ever that target elderly people and their caregivers, as tech has finally woken up to the demographic trend of the baby boomers getting older. Tech isn’t going to be just for the young anymore. Japan’s Asilla is showing off a security camera that sends you alerts about unusual things happening in real time, like if someone falls on the ground. We can expect to see more of these products, as Synaptics has created fall-detection sensors that use AI to detect if people have fallen and need help. Food tech OneThird, a Dutch food tech company, will show a “ripeness checker” designed for use by grocers or grocery store shoppers. It lets them quickly scan an avocado and get accurate information about when it is ready for consumption – no squeezing necessary. Apparently we’re not so good at the squeezing part, as one-third of all food produced is wasted due to spoilage – costing upwards of $1 trillion. OneThird’s solution helps end-consumers get the freshest food and allows growers, food distributors and grocers to predict the shelf life of fresh produce. Typhur will show off its all-in-one smart sous vide cooking machine, with cooking classes and recipes from chefs all around the world, including Michelin-starred chefs. Retro trends If you’re not into all this new stuff, you can see old stuff too. RCA will be among the retro brands coming to the show, and it will show off home automation, e-bikes, scooters and retro-style electronics. For some of us, that brings back some memories. Weird uses for new technology I’ve been pitched some oddball things that I didn’t expect to see. For instance, Glüxkind is showing off an AI-powered baby stroller with a motor. 
It comes with a sleep solution offering rock-my-baby motion and white noise, plus a 360-degree safety bubble around the stroller. I suppose I shouldn’t call this weird because it could be very useful for its target consumers. But it’s certainly something I didn’t expect to see. Meanwhile, Osim is going to show off its uLove 3 Well-Being Chair, which uses AI to help give you massages. The chair is designed to measure, monitor, and manage stress while giving you a full-body massage. After picking a video from your own library or using a YouTube link, you can add scents on the timeline to set up the perfect smell, timing, and duration for a scent to go with your video. What’s missing from CES Some common CES events won’t happen. Nvidia will be doing an online press event, but it won’t be on the show floor. Apple doesn’t exhibit at the show, as it always does its own events. I expect we won’t see as many masks either. But I suggest everybody be extra safe with so many people concentrated in Vegas. With as many as 71,000 fewer people, I’m certainly hoping for fewer lines and crowds around the popular booths. But like many people, I’m curious what it will be like seeing people face-to-face again at a huge trade show. Hopefully it will feel like we’re all getting back to business. Tech for the rest of us I have some hope that there will be some interesting technologies from non-tech companies. Halio is showing off smart glass. You can put this on windows and building facades to prevent solar heat from cooking everybody inside a building. This reduces a facility’s carbon footprint by limiting energy used for air conditioning or heating. I like how it’s another example of how tech fades into the woodwork of everyday things. My favorite CES talk about tech from non-tech companies was a few years ago, when Arnold Donald — the CEO of the world’s largest cruise company, Carnival Cruises — unveiled the Ocean Medallion wearable. 
That was interesting because it was an example of how technology was infiltrating a non-tech business, where the technology faded into the woodwork and the woodwork itself got smart. Carnival is now outfitting its 100-plus cruise ships with the technology. In recent years, Procter & Gamble has also shown up with cool uses of tech in ordinary products, such as putting sensors and AI into skin advisers, heated razors and more. It will be back this year with more products that promise the same kind of creativity as last year’s products, which included a blemish remover that worked well on my face. I could still use more of that, and I think we could all use tech that makes the current products that we use every day even better. GamesBeat's creed when covering the game industry is "where passion meets business." What does this mean? We want to tell you how the news matters to you -- not just as a decision-maker at a game studio, but also as a fan of games. Whether you read our articles, listen to our podcasts, or watch our videos, GamesBeat will help you learn about the industry and enjoy engaging with it. Discover our Briefings. The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat. All rights reserved. "
2022
"AI-as-a-service makes artificial intelligence and data analytics more accessible and cost effective | VentureBeat"
"https://venturebeat.com/ai/ai-as-a-service-makes-artificial-intelligence-and-data-analytics-more-accessible-and-cost-effective"
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages AI-as-a-service makes artificial intelligence and data analytics more accessible and cost effective Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here. From self-driving cars to intuitive chatbots like OpenAI’s ChatGPT , artificial intelligence (AI) has made significant progress in the past decade. Enterprises are looking to implement a broad spectrum of AI applications, from text analysis software to more complex predictive analytics tools. But building an in-house AI solution makes sense only for some businesses, as it’s a long and complex process. With emerging data science use cases, organizations now require continuous AI experimentation and test machine learning algorithms on several cloud platforms simultaneously. Processing data through such methods need massive upfront costs, which is why businesses are now turning toward AIaaS (AI-as-a-service) , third-party solutions that provide ready-to-use platforms. 
The platform for modern analytics AIaaS is becoming an ideal option for anyone who wants access to AI without needing to establish an ultra-expensive infrastructure for themselves. With such a cost-effective solution available, it’s no surprise that AIaaS is starting to become a standard in most industries. An analysis by Research and Markets estimated that the global market for AIaaS is expected to grow by around $11.6 billion by 2024. AIaaS allows companies to access AI software from a third-party vendor rather than hiring a team of experts to develop it in-house. This allows companies to get the benefits of AI and data analytics with a smaller initial investment, and they can also customize the software to meet their specific needs. AIaaS is similar to other “as-a-service” offerings like infrastructure-as-a-service (IaaS), platform-as-a-service (PaaS), and software-as-a-service (SaaS), which are all hosted by third-party vendors. In addition, AIaaS models encompass disparate technologies, including natural language processing (NLP), computer vision, machine learning and robotics; you can pay for the services you require and upgrade to higher plans when your data and business scale. AIaaS is an optimal solution for smaller and mid-sized companies to access AI capabilities without building and implementing their own systems from scratch. This allows these companies to focus on their core business and still benefit from AI’s value, without becoming experts in data and machine learning. Using AIaaS can help companies increase profits while reducing the risk of investment in AI. In the past, companies often had to make significant financial investments in AI in order to see a return on their investment. 
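To make the pay-per-service idea concrete, here is a minimal Python sketch of the access pattern an AIaaS vendor typically exposes: package a batch of documents, post them to an endpoint, and read back per-document results. The endpoint URL, plan names and response shape below are hypothetical stand-ins, not any particular vendor’s API.

```python
import json

# Sketch of an AIaaS-style client. The endpoint, plan names and response
# shape are hypothetical -- every vendor defines its own -- but the
# workflow (build request, post, parse reply) is typical.
API_URL = "https://api.example-aiaas.com/v1/analyze"  # hypothetical

def build_request(texts, service="sentiment", plan="starter"):
    """Package a batch of documents for a pay-per-call analysis service."""
    return {
        "service": service,  # capability being rented: nlp, vision, ...
        "plan": plan,        # billing tier; upgrade as data volume grows
        "documents": [{"id": i, "text": t} for i, t in enumerate(texts)],
    }

def parse_response(raw):
    """Map document ids to labels from a (hypothetical) JSON reply."""
    return {doc["id"]: doc["label"] for doc in json.loads(raw)["results"]}

payload = build_request(["Great product!", "Slow support."])
# A real call would be something like: requests.post(API_URL, json=payload)
reply = json.dumps({"results": [{"id": 0, "label": "positive"},
                                {"id": 1, "label": "negative"}]})
print(parse_response(reply))  # {0: 'positive', 1: 'negative'}
```

The point of the sketch is how thin the client side is: all model training, serving and scaling stays with the vendor, which is where the lower upfront cost comes from.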
Moses Guttmann, CEO and cofounder of ClearML, says that AIaaS allows companies to focus their data science teams on the unique challenges of their product, use case, customers and other essential requirements. “Essentially, using AIaaS can take away all the off-the-shelf problem-solving AI can help with, allowing the data science teams to concentrate on the unique and custom scenarios and data that can make an impact on the business of the company,” Guttmann told VentureBeat. Guttmann said that the crux of AI services is essentially outsourcing talent, i.e., having an external vendor build the company’s internal AI infrastructure and customize it to its needs. “The problem is always maintenance, where the know-how is still held by the AI service provider and rarely leaks into the company itself,” he said. “AIaaS, on the contrary, provides a service platform, with simple APIs and access workflows, that allows companies to quickly adapt off-the-shelf working models and quickly integrate them into the company’s business logic and products.” Guttmann says that AIaaS can be great for tech organizations with either pretrained models or real-time data use cases, and for enhancing legacy data science architectures. “I believe that the real value in ML for a company is always a unique combination of its constraints, use case and data, and this is why companies should have some of their data scientists in-house,” said Guttmann. “To materialize the potential of those data scientists, a good software infrastructure needs to be put in place, doing the heavy lifting in operations and letting the data science team concentrate on the actual value they bring to the company.” A lean innovation for business requirements AIaaS is a proven approach that facilitates all aspects of AI innovation. 
The platform provides an all-in-one solution for modern business requirements, from ideating on how AI can provide value, to scaled implementation across a business, to tangible outcomes in a matter of weeks. AIaaS enables a structured, beneficial way of balancing data science, IT and business consulting competencies, as well as balancing the technical delivery with the role of ongoing change management that comes with AI. It also decreases the risk of AI innovation, improving time-to-market, product outcomes and value for the business. At the same time, AIaaS provides organizations with a blueprint for AI going forward, thereby accelerating internal know-how and the ability to execute, ensuring an agile delivery framework alignment, and transparency in creating the AI. “AIaaS platforms can quickly scale up or down as needed to meet changing business needs, providing organizations with the flexibility to adjust their AI capabilities as needed,” Yashar Behzadi, CEO and founder of Synthesis AI, told VentureBeat. Behzadi said AIaaS platforms can integrate with a wide range of other technologies, such as cloud storage and analytics tools, making it easier for organizations to leverage AI in conjunction with other tools and platforms. “AIaaS platforms often provide organizations with access to the latest and most advanced AI technologies, including machine learning algorithms and tools. This can help organizations build more accurate and effective machine learning models because AIaaS platforms often have access to large amounts of data,” said Behzadi. “This can be particularly beneficial for organizations with limited data available for training their models.” Current market adoption and challenges AIaaS platforms can process and analyze large volumes of text data, such as customer reviews or social media posts, to help computers and humans communicate more clearly. 
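To illustrate the text-analysis use case above: once an AIaaS endpoint has labeled each customer review, the organization’s own code only needs to aggregate the results. A minimal Python sketch, with hard-coded labels standing in for service output:

```python
from collections import Counter

# Toy aggregation step: the (text, label) pairs below stand in for what a
# text-analysis service would return for a batch of customer reviews.
def summarize(labeled_reviews):
    """labeled_reviews: list of (review_text, label) pairs from the service."""
    counts = Counter(label for _, label in labeled_reviews)
    total = sum(counts.values())
    return {label: round(n / total, 2) for label, n in counts.items()}

reviews = [
    ("Fast shipping, great quality", "positive"),
    ("Broke after a week", "negative"),
    ("Does what it says", "positive"),
    ("Okay for the price", "neutral"),
]
print(summarize(reviews))  # {'positive': 0.5, 'negative': 0.25, 'neutral': 0.25}
```

This division of labor is the core of the AIaaS pitch: the hard NLP lives behind the vendor’s API, and in-house code is reduced to straightforward bookkeeping like this.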
These platforms can also be used to build chatbots that can handle customer inquiries and requests, providing a convenient way for organizations to interact with customers and improve customer service. Computer vision training is another large use case, as AIaaS platforms can analyze and interpret images and video data, such as facial recognition or object detection; this can be incorporated into various applications, including security and surveillance, marketing and manufacturing. “Recently, we’ve seen a boom in the popularity of generative AI, which is another case of AIaaS being used to create content,” said Behzadi. “These services can create text or image content at scale with near-zero variable costs. Organizations are still figuring out how to practically use generative AI at scale, but the foundations are there.” Talking about the current challenges of AIaaS, Behzadi explained that company use cases are often nuanced and specialized, and generalized AIaaS systems may need to be revised for unique use cases. “The inability to fine-tune the models for company-specific data may result in lower-than-expected performance and ROI. However, this also ties into the lack of control organizations that use AIaaS may have over their systems and technologies, which can be a concern,” he said. Behzadi said that while integration can benefit the technology, it can also be complex and time-consuming to integrate with an organization’s existing systems and processes. “Additionally, the capabilities and biases inherent in AIaaS systems are unknown and may lead to unexpected outcomes. Lack of visibility into the ‘black box’ can also lead to ethical concerns of bias and privacy, and organizations do not have the technical insight and visibility to fully understand and characterize performance,” said Behzadi. He suggests that CTOs should first consider the organization’s specific business needs and goals and whether an AIaaS solution can help meet these needs. 
This may involve assessing the organization’s data resources and the potential benefits and costs of incorporating AI into their operations. “By leveraging AIaaS, a company is not investing in building core capabilities over time. Efficiency and cost-saving in the near term have to be weighed against capability in the long term. Additionally, a CTO should assess the ability of the more generalized AIaaS offering to meet the company’s potentially customized needs,” he said. What to expect from AI-as-a-service in 2023 Behzadi says that AIaaS systems are maturing and allowing customers to fine-tune the models with company-specific data, and this expanded capability will enable enterprises to create more targeted models for their specific use cases. “Providers will likely continue to specialize in various industries and sectors, offering tailored solutions for specific business needs. This may include the development of industry-specific AI tools and technologies,” he said. “As foundational NLP and computer vision models continue to evolve rapidly, they will increasingly power the AIaaS offerings. This will lead to faster capability development, lower cost of development, and greater capability.” Likewise, Guttmann predicts that we will see many more NLP-based models with simple APIs that companies can integrate directly into their products. “I think that surprisingly enough, a lot of companies will realize they can do more with their current data science teams and leverage AIaaS for the ‘simple tasks.’ We have witnessed a huge jump in capabilities over the last year, and I think the upcoming year is when companies capitalize on those new offerings,” he said. VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings. 
"
2022
"AI adoption lagging? It may be a poor UI | VentureBeat"
"https://venturebeat.com/ai/ai-adoption-lagging-it-may-be-a-poor-ui"
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages AI adoption lagging? It may be a poor UI Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here. Think about it: If something is slow, clunky, overly complicated, messy or inconsistent, you’re not going to want to use it, right? Well, the same goes for artificial intelligence (AI) platforms. In adopting them, organizations can tend to be attracted to looks — the thinking being that a shiny, sleek design will encourage use. Really, though, AI is only as good as its user interface (UI), said Petr Baudis, CTO, chief AI architect and cofounder of intelligent document processing (IDP) platform Rossum. Although it may seem counterintuitive, AI’s full potential comes down to functionality and simplicity. VB Event The AI Impact Tour Connect with the enterprise AI community at VentureBeat’s AI Impact Tour coming to a city near you! “For AI technology to be widely adopted at an organization, it must have a strong UI,” said Baudis. 
“Otherwise, its usage frequency will fall short, and the volume of recurring customers will decline.” Infinite value — if correctly implemented At this point, AI is just about everywhere — you’d be hard-pressed to find a use case where the technology isn’t applied in some way. “AI has completely transformed how people work, what they do and how they spend their time — both from a personal and work perspective,” said Baudis. “The value of AI is infinite, and its usage is increasing day-in and day-out.” And it will only become ever more omnipresent: The AI market size is projected to grow from $86.9 billion in 2022 to $407 billion by 2027, representing a compound annual growth rate (CAGR) of more than 36%. As users become more appreciative of its ability to reduce human error, AI will become increasingly advanced and powerful — which is fundamental to its very nature, said Baudis. AI needs training, data and regular practice and use to fix problems, learn, improve and become smarter. “This helps boost the value of AI technology immensely,” said Baudis. And “friendly UI” is critical to this — ease of use makes people come back, apply the technology daily and even recommend it to their peers. Ultimately, proactive AI use only benefits organizations. According to a recent survey by BCG and MIT Sloan Management Review, when workers derive personal value from the technology, there is a 5.9 times increased likelihood of organizational value. And employees who derive personal value from AI are 3.4 times more likely to be satisfied in their jobs. AI worth its salt As an example, Baudis pointed to a dedicated AI engine that Rossum built for Morton Salt. As the leading provider of salt in North America, the company receives a steady stream of purchase orders from a variety of trading and manufacturing customers, he explained. And for a long time, each document set had its own format, making the data entry process tedious and time-consuming. 
But since integrating Rossum, Morton saves as much as 95% of time per document; its average time for processing documents is 10 seconds, said Baudis. The engine, he said, “was built and trained specifically on their purchase orders, and to this day, is always adapting by human data validation.” Simplified, streamlined In assessing AI design elements, organizations should look to their building, layout and streamlining capabilities, Baudis advised. For example, if a UI is “overwhelming,” with excessive dropdown options or a cluttered interface, “companies are likely set up to fail.” UI designers must ensure that interfaces are, as he put it, “easy to navigate, cohesive, use clear messaging and accommodate users of varying skill levels.” It comes down to striking the right balance between looks and functionality. Don’t overcomplicate and make it difficult for users to perform common tasks. At the same time, ensure that brand identity is fluid throughout a UI. “Negative UI experiences can harm company reputations, so organizations must keep this top of mind when building out their AI technology’s UI,” said Baudis. Regular training, feedback Encouraging employees to use AI requires a deep understanding of the technology’s value coupled with training, said Baudis. Organizations must be up front about their AI strategy and share how it will positively impact employees’ day-to-day tasks and overall experience. Also, companies must be transparent about goals, and illustrate real-life impact by showing the metrics that have improved since implementation. They must also provide continued training when new features are added. Furthermore, an internal feedback system can promote sharing of best practices, said Baudis. Whether it be regular check-ins, a dedicated Slack channel or a live best practices document, consistent communication about how to use AI most effectively will improve overall performance and adoption rates. Also, a customer feedback loop is key. 
“All customers and their preferences must be kept in mind when working toward building a strong UI,” said Baudis, “especially when the goal of AI is to make lives easier and better.” "
2023
"6 healthcare AI predictions for 2023 | VentureBeat"
"https://venturebeat.com/ai/6-healthcare-ai-predictions-for-2023"
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages 6 healthcare AI predictions for 2023 Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here. In 2022, healthcare AI funding dropped to its lowest level since the third quarter of 2020, according to a CBInsights report. However, amid the economic downturn, 2022 has arguably been the year of AI innovation across several industries, including healthcare — which is why the sector was impacted less than others by the fall in global AI funding this year. CBInsights’ report revealed healthcare AI funding decreased 20% from Q2, compared to fintech AI funding, which dropped 34%, and retail tech AI funding, which fell by nearly half. As the new year beckons, AI experts are examining 2022’s trends to predict what to expect in 2023. VentureBeat spoke to several to get a sense of where healthcare AI might be heading: VB Event The AI Impact Tour Connect with the enterprise AI community at VentureBeat’s AI Impact Tour coming to a city near you! 1. 
Personalized healthcare will be an even greater focus Several experts believe the coming year will usher in a greater drive for personalized healthcare , aided by increasing volumes of data in the industry. IDC predicts that “the global datasphere will grow from 33 zettabytes of data in 2018 to 175 zettabytes by 2025” — a fivefold increase from 2018’s figures. Morris Laster, medical investments partner at Israeli venture capital firm OurCrowd , told VentureBeat that this increase in data will put the spotlight on personalized healthcare. “For AI systems to make accurate predictions and recommendations, they must be trained on large amounts of high-quality data,” said Laster. While he admits that many healthcare datasets are incomplete, noisy or biased — which can lead to inaccurate or unreliable models — he said that organizations leveraging AI for healthcare can address the challenge by carefully curating and verifying the data used to train their models. “This can involve cleaning and preprocessing the data to remove any errors or inconsistencies, as well as conducting quality checks to ensure that the data is representative and accurate,” he added. Micha Breakstone, cofounder and CEO at Neuralight , believes AI’s range in healthcare will get increasingly personalized, so “specific treatment can be generated based on the patient’s profile, genetics, environment and lifestyle in order to optimize patient outcomes.” Andy Thurai, VP and principal analyst at Constellation Research , told VentureBeat that precision medicine or individualized medicine is an area where AI can help. Until now, he said, it was almost impossible to configure the treatment, medication combination and drug mixing that can work for a specific patient, because of the data involved to create this individualized profiling. 
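The curation and quality checks Laster describes can start out very simply: reject records with missing or implausible values before they ever reach a training set. A minimal Python sketch; the field names and valid ranges below are illustrative assumptions, not a clinical standard.

```python
# Sketch of a data-curation gate: keep only records whose fields are present
# and inside a plausible range. Field names and ranges are illustrative.
VALID_RANGES = {"age": (0, 120), "systolic_bp": (60, 250)}

def curate(records):
    """Split records into (kept, rejected) based on simple quality checks."""
    kept, rejected = [], []
    for rec in records:
        ok = all(
            isinstance(rec.get(field), (int, float)) and lo <= rec[field] <= hi
            for field, (lo, hi) in VALID_RANGES.items()
        )
        (kept if ok else rejected).append(rec)
    return kept, rejected

raw = [
    {"age": 54, "systolic_bp": 130},   # plausible -> kept
    {"age": 61, "systolic_bp": None},  # missing value -> rejected
    {"age": 430, "systolic_bp": 120},  # data-entry error -> rejected
]
kept, rejected = curate(raw)
print(len(kept), len(rejected))  # 1 2
```

Real pipelines add deduplication, representativeness checks and bias audits on top, but even a gate this crude stops the most obvious data-entry errors from corrupting a model.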
But given today’s advancements in genomic profiling, he explained, AI can determine what treatments will work, and what won’t, based on an individual’s genetic profile, past treatment history or current medication history and develop a customizable treatment for every patient — rather than relying on a pharmacist or doctor. “The availability of wearable medical devices, telehealth consultations and predictive diagnosis using collected data in real time can all lead to identifying the right person for treatment while taking unnecessary treatments out of the equation, which can free up the doctor’s and other healthcare professionals’ time to [spend] where it is needed,” Thurai said. “In addition, remote health monitoring is easier now with wearable devices and AI escalating the issue only when necessary to schedule an immediate appointment.” 2. Legislation and regulations around healthcare AI will improve The healthcare industry creates a vast amount of data. One report by IDC estimated that the industry “created 2,000 exabytes of data in 2020 and will continue to grow at a 48% rate year over year.” This opens up unique data challenges in the healthcare industry, especially around data privacy. McKinsey debates the extent to which regulations will affect approaches to ethics, health data and patient confidentiality, and whether legislation will help the AI sector in the same way the General Data Protection Regulation (GDPR) has aided privacy protection. But Ayanna Charles, solutions consultant at predictive software company Verikai , believes that legislation and regulations will get better next year. “In the short term, legislative and regulatory bodies at both the state and national levels are getting more aggressive in their oversight of data sharing and the use of AI in healthcare,” she said. 
“Large insurers and healthcare providers are inherently conservative, so we expect more guidance regarding what constitutes acceptable use of data and AI.” Charles added that the data produced will keep growing in the long term, making the value of AI greater. It will also prompt governments, industry and advocacy groups to “consolidate around a common framework and set of practices that balance the need to protect individuals’ data with the real medical benefits of using that data within AI models,” she explained. 3. There will be more efforts to tackle AI bias While every industry using AI must address the big issue of bias in their models, a study by Harvard notes that it is particularly essential for healthcare providers to tackle AI bias head-on. “Biases in healthcare AI can further worsen social inequalities and can cause death,” according to the study. But Charles noted that 2023 will see the beginning of a reduction in biases for AI in healthcare. “Anybody using AI in healthcare must consider the bias that exists within the health system as it exists today before building and deploying any models,” she said. “At worst, they may blindly build and deploy a model that propagates and reinforces existing bias. At best, though, AI models can be used to reduce and remove discrepancies within the health system.” To address ethical concerns, organizations using AI for healthcare can implement guidelines and oversight mechanisms to ensure that their AI systems are being used responsibly and in compliance with relevant regulations, Laster explained. Svetlana Sicular, research VP at Gartner, added that “responsible AI helps achieve fairness, even though biases are baked into the data; gain trust, although transparency and explainability methods are evolving; and ensure regulatory compliance, while grappling with AI’s probabilistic nature.” 4. A wider range of applications in healthcare Accenture estimates that AI applications will cut annual U.S. 
healthcare costs by $150 billion in 2026. These savings are expected to result from a wider range of applications. Taking a long-term look at the healthcare industry, OurCrowd medical analyst Tzvi Bessler said AI will continue to play an increasingly important role in healthcare over the next five years — beginning next year. “This will likely involve the development of more sophisticated and intelligent AI systems that can make more accurate and reliable predictions and recommendations,” said Bessler. Laster agreed: “In 2023, expect to continue to see AI being used more extensively for tasks such as drug discovery and development, and for improving the efficiency and accuracy of medical research.” Thurai said he also believes AI will greatly help drug and treatment discovery. “This has already started happening and there will be more next year,” he said. He added that, “in a situation like the recent COVID-19 pandemic, given the enormity of affected patients and the speed at which the virus spread, the drug and vaccination discovery and experimentation in clinical trials cannot take the normal lifecycle.” On average, he explained, a vaccination “can take years to bring to the market, but the COVID vaccine was mass-produced in billions and saved many lives using AI-assisted features such as discovery, experimentation, side effect, and efficiency tracking and so on — which would have been impossible to track using normal methods.” 5. A closer working relationship between humans and AI While some tout that AI will replace humans across industries, Charles doesn’t see that happening in the healthcare space, at least in the next year. Instead, she foresees a closer nexus between humans and machines. “As with all new technologies, AI technologies should not be treated as the single solution to the business problem facing an organization,” she explained.
“Implementing any AI solution requires technologists to establish and maintain a sound governance structure — one that marries the solution with human oversight, consistent policies and solid procedures.” In fact, Gartner expects that “by 2023, all personnel hired for AI development and training work will have to demonstrate expertise in responsible AI , heralding a closer relationship between humans and machines.” Thurai added that there must be easy and practical mechanisms for overriding AI decisions. “Many managers and executives already working with AI admit they have had to intervene in their systems due to delivery of erroneous or unfair results,” said Thurai. “As more experts emphasize the need for a better human-AI relationship, we will begin to see organizations prioritizing this next year, especially in an industry like healthcare.” 6. Automation, automation and more automation According to Laster, 2023 will be the year of more automation in healthcare. He noted that we can expect to see AI being used more extensively for tasks such as managing patient records, scheduling appointments and coordinating care. “This could help improve the efficiency and effectiveness of healthcare delivery and could also provide patients with access to more personalized and convenient care,” he said. A recent analysis by McKinsey suggests that “AI-enabled personal assistants can automate 50 to 75% of manual tasks, boosting efficiency, reducing costs and freeing clinicians to focus on complex cases and actual care delivery and coordination. This, in turn, may improve the healthcare experience for both clinicians and insurance plan members.” A promising future for healthcare AI While Thurai concedes that it will take some time until AI systems can reflect the empathy that steers many human decisions, that doesn’t mean such systems shouldn’t be continuously improved to better mimic human values. 
“AI only reflects the programming and data that goes into it, and business leaders need to be aware that cold, data-driven insights are only part of the total decision-making process,” he said. Undoubtedly, there will be increased use of AI in healthcare next year, as more applications come to the fore and organizations begin to prioritize responsible AI. In Laster’s words, “overall, the future of AI in healthcare looks promising.” VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings. The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat. All rights reserved. "
3,631
2,023
"10 digital twin trends for 2023 | VentureBeat"
"https://venturebeat.com/ai/10-digital-twin-trends-for-2023"
"10 digital twin trends for 2023 Interest in digital twins has picked up over the last year. Digital twin tools are growing in capability, performance and ease of use. They are also taking advantage of promising formats like USD and glTF to connect the dots among different tools and processes. Advances in techniques for combining models can also improve the accuracy and performance of hybrid digital twins. Generative AI techniques used for text and images may also help create 3D shapes and even digital twins. These kinds of advances will allow enterprises to mix and match modeling capabilities in new ways and for new tasks. Here are 10 trends to watch for in the year ahead. 1.
From connecting files to connecting data Over the last several years, all the major tools for designing products and infrastructure have been moving to the cloud — but still using legacy file formats to exchange data. Increasingly, vendors are calling out the data integration aspects of these tools that make it easier to share digital twins across different tools and services. This capability often starts as a subset of a vendor’s tools. For example, Siemens is rebranding a new subset of its tools as part of Siemens Xcelerator, while Bentley has launched Phase 2 of the infrastructure metaverse. In November, location intelligence leader Trimble launched Trimble One, a “purpose-built connected construction management offering that includes rich field data, estimating, detailing, project management, finance and human capital management solutions.” It’s one thing to simply move apps to the cloud. These innovators are doing something else: pioneering more efficient ways to connect data across these apps. Over the next year, the other major construction and design tools providers will likely announce similar advances for connecting digital twins and digital threads across different processes. 2. Entertainment firms target the industrial metaverse Epic and Unreal have made significant progress partnering with digital-twin leaders to provide a better user experience across devices. These companies have announced significant partnerships with GIS, construction and automobile leaders. Blackshark AI developed the globe behind Microsoft’s latest flight simulator, and went on to scale the tech for automatically transforming raw satellite imagery into labeled digital twins. In April, Maxar, a leading satellite imaging provider, announced a significant investment in Blackshark for Earth-scale digital twins.
Over the next year, more gaming and entertainment companies will find opportunities in the industrial metaverse, which ABI expects to eclipse the consumer metaverse over the next several years. 3. Nvidia galvanizes support for USD Pixar pioneered the Universal Scene Description (USD) format to improve movie production workflows. Nvidia has championed USD to connect the dots across various digital twins and industrial metaverse use cases. The company has built connectors to the IFC standard for buildings, and is improving workflows for Siemens in industrial automation and Bentley in construction. USD still lacks support for physics, materials and rigging, but despite its limitations, there is nothing better for organizing the 3D information for giant digital twins. Nvidia’s pioneering work on USD promises to integrate raw data with various industry, medicine and enterprise workflows. 4. glTF simplifies digital-twin exchange There is growing momentum behind the glTF file format for exchanging 3D models across different tools. The Khronos Group calls it the JPEG for the metaverse and digital twins. Expect glTF to pick up steam, particularly as creators look for an easy way of sharing interactive 3D models across tools. 5. Generative AI meets digital twins Over the last year, the world has been wowed by how easy it is to use ChatGPT to write text and Stable Diffusion to create images. Meanwhile, others have demonstrated new multimodal tools like DeepMind’s Gato for harmonizing models across text, video, 3D and robotic instructions. Over the next year, we can expect more progress in connecting generative AI techniques with digital twin models for describing not only the shape of things but how they work. Yashar Behzadi, CEO and founder of Synthesis AI, a synthetic data tools provider, said, “This emerging capability will change the way games are built, visual effects are produced and immersive 3D environments are developed.
For commercial usage, democratizing this technology will create opportunities for digital twins and simulations to train complex computer vision systems, such as those found in autonomous vehicles.” 6. Hybrid digital twins There are a variety of performance, accuracy and use case tradeoffs among the models used in digital twins. Prith Banerjee, CTO of Ansys , believes that in 2023 enterprises will find new ways to combine different approaches to hybrid digital twins. Hybrid digital twins make it easier for CIOs to understand the future of a given asset or system. They will enable companies to merge asset data collected by IoT sensors with physics data to optimize system design, predictive maintenance and industrial asset management. Banerjee foresees more and more industries adopting this approach with disruptive business results in the coming years. For example, a healthcare company can develop an electrophysiology simulation of a heartbeat as the muscles contract, the valves open and the blood flows between the heart’s chambers. The company can then take a patient’s MRI scan and develop a simulation of that specific individual’s heart and how it would react to the insertion of a particular pacemaker model. If this R&D work is successful, it could help medical device and equipment companies invent new products and apply for FDA trials by demonstrating in-silico trials. 7. FDA modernization act replaces animals with silicon Animal testing has been a requirement for all new drugs and treatments since the FDA’s early days. This year, the U.S. Congress passed the FDA Modernization Act 2.0, allowing pharmaceutical companies to replace animal testing with in-vitro and in-silico methods. This will drive innovation and commercialization of patients-on-a-chip and better medical digital twins for testing more cost-effectively and humanely. 
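The hybrid approach Banerjee describes, merging a physics-based model with IoT sensor data, can be sketched in miniature. The toy physics formula, the sensor readings and the learned bias term below are all invented for illustration; this is not Ansys's method.

```python
# Hedged sketch of a hybrid digital twin: a toy physics model predicts a
# bearing's temperature, and a data-driven residual (learned from invented
# IoT sensor readings) corrects the physics model's systematic error.

def physics_model(load_kw: float) -> float:
    """First-principles estimate: temperature rises linearly with load."""
    ambient_c = 20.0
    return ambient_c + 3.0 * load_kw

def fit_residual(observations):
    """Learn a constant bias between sensor readings and the physics model."""
    errors = [temp - physics_model(load) for load, temp in observations]
    return sum(errors) / len(errors)

def hybrid_predict(load_kw: float, bias: float) -> float:
    """Hybrid twin = physics prediction + learned data-driven correction."""
    return physics_model(load_kw) + bias

# Invented IoT readings: (load in kW, measured temperature in deg C).
# Measurements run about 2 degrees above the physics model here.
sensor_log = [(1.0, 25.0), (2.0, 28.0), (3.0, 31.0)]
bias = fit_residual(sensor_log)
print(hybrid_predict(4.0, bias))  # → 34.0
```

In a real deployment the physics term would be a full simulation and the correction a trained model, but the division of labor is the same: physics supplies structure, sensor data supplies calibration.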
Tamara Drake, director of research and regulatory policy at the Center for Responsible Science , told VentureBeat, “We believe in-silico methods, including use of artificial intelligence in conjunction with advance organs on a chip, or patient-on-a-chip, will be the biggest trend in drug development in coming years.” 8. Digital twin ecosystems open new use cases Matt Barrington, emerging technology leader at EY Americas , predicts that digital twins will increasingly transform how we run companies in 2023. For example, using a digital market twin to evaluate new products will support management and strategic decision-making. Digital twins will also underpin supply chain resilience in uncertain times, and improve risk management, safety and sustainability. This transformation will require increased emphasis on foundational digital capabilities in data management and devops for data engineering, as well as a more comprehensive approach to security. Barrington predicts fragmentation and a high degree of specialization in the market, such that no single vendor has an end-to-end digital twin solution. Companies will have to integrate several capabilities to create the right fit-for-purpose solution for their business. Part of that approach will require more composable, open architectures and the ability to curate an ecosystem-based system. 9. Enterprise digital twins take off Vendors have made significant advances in tools for process mining and process capture to create a digital twin of the organization. Bernd Gross, CTO at Software AG , said these advances allow enterprises to create simulations for an entire department or a cluster of business processes rather than a single business process. Leaders will find ways to incorporate various technologies, such as process mining, risk analysis and compliance monitoring, to drive more accurate outcomes. These techniques require greater breadth and depth of data. 
Today, enterprises must include relevant KPIs, causalities between processes, the life cycle of a business unit and more to create a genuinely accurate enterprise digital twin. 10. Digital twins drive 5G 5G delivers significantly faster speeds in direct view of one of the newer towers, but can be slower than 4G in the radio shadow zone. Cellular service providers are engaged in a race to fill in these shadows, and digital twins could help. Fortune Business Insights estimates that the market for 5G cells could grow by 54.4% annually through 2028. Mike Flaxman, spatial data science lead at Heavy AI, said many telcos are looking at digital twins to shift to a plan, build, and operate model that allows them to maximize service while cutting costs. "
3,632
2,021
"Cycode raises $56M to scan apps for security vulnerabilities | VentureBeat"
"https://venturebeat.com/uncategorized/furtcycode-raises-56m-to-scan-apps-for-security-vulnerabilities"
"Cycode raises $56M to scan apps for security vulnerabilities Cycode, an app security company, today announced that it raised $56 million in a series B round led by New York-based Insight Partners with participation from YL Ventures. The proceeds, which bring Cycode’s total raised to $81 million, will be put toward supporting sales and product development and launching new technology partnerships, CEO Lior Levy said, as well as expanding the company’s integrations to include third-party security tools. The demand for app security solutions is on the rise as enterprises experience increasing cyberattacks. According to Contrast Security, as of January and February of this year, 11% of web apps contained 15 or more security vulnerabilities.
Open source software is contributing to the problem, with a Synopsys report finding that 82% of commercial codebases have open source components in them that are more than four years out of date. Cycode was launched in 2019 by Levy and Ronen Slavin, both of whom started their cybersecurity careers in the Israel Defense Forces. Slavin is the founder of data encryption startup FileLock, which was acquired by Reason Cybersecurity in 2018. Levy had the idea for Cycode while working for Symantec as a solutions architect. “With so many new tools being adopted to support DevOps and continuous integration/continuous deployment initiatives, it was becoming impossible to assure that the governance and security policies of each tool met the corporate standard,” he told VentureBeat via email. “Plus, enterprises typically had multiple development teams that often used different tools, and with high levels of M&A activity in software, it was common for even more teams with even more tools to join the fray.” Cycode’s platform applies security and governance policies across app development tools and infrastructure. By drawing on a knowledge graph of customers’ software lifecycles, Cycode attempts to detect anomalous behavior that should arouse suspicion in any development environment. A knowledge graph represents a network of entities — i.e., objects, events, situations, or concepts — and illustrates the relationships between them. The data is usually stored in a database and visualized as a graph structure, hence the word “graph.” “The key to modern app security is centralizing and mapping events and metadata … such that it becomes easy to determine when disparate activities add meaningful context to each other,” Levy said. “With each new integration, our knowledge graph becomes smarter.
Hence, one of our goals is to integrate with every software delivery and app security tool to determine how each dot is connected and when it’s relevant.” Leveraging analytics in security Just one vulnerability scan turns up a security flaw in 83% of apps, according to Veracode. The more frequent the scans, the better. Edgescan reports that it takes an average of 50.5 days for organizations to remediate vulnerabilities in public apps. Cycode’s tool aims to prioritize risk; prevent code tampering, leaks, and misconfigurations; and automate remediation in workflows while remaining non-intrusive. Security scanning tools, both from Cycode and third parties, can derive insights and context from the knowledge graph, which includes a mapping of security violations, user activity, and other events. According to Levy, the pandemic has increased the need for — and complexity of — strong authentication, driving demand for solutions like Cycode. “Embracing remote work has meant that organizations can no longer rely on ‘being on the network’ as a factor [of] authentication. Moreover, as more developers not only work from home but actually have taken advantage of the pandemic to work and travel, other security measures such as IP range restrictions have become more complicated,” he said. “Augmenting the current capabilities with AI is on the roadmap for 2022 so that Cycode’s knowledge graph will learn the intricacies of each unique software delivery pipeline in order to identify custom anomalies for each environment.” Growing market According to the European Union’s Agency for Cybersecurity, supply chain attacks are expected to increase 400% between last year, 2020, and this year, 2021. Furthermore, Gartner predicts by 2025, 45% of organizations worldwide will have experienced attacks on their software supply chains — a threefold increase from 2021. Against this backdrop, startups in cybersecurity are securing record amounts of venture capital. 
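Levy's knowledge-graph idea, in which events from different development tools are centralized so that disparate activities give each other context, can be illustrated with a deliberately tiny sketch. The events and the "deployed without review" rule below are invented; they are not Cycode's actual model.

```python
# Hedged sketch of a software-lifecycle knowledge graph: events from
# different tools are linked by commit ID, and a cross-tool rule flags
# deployments whose commits never went through code review.
# The events and the rule are invented for illustration.
events = [
    {"tool": "git", "type": "commit", "commit": "a1f9", "author": "dev1"},
    {"tool": "ci", "type": "deploy", "commit": "a1f9"},
    {"tool": "git", "type": "commit", "commit": "b2c3", "author": "dev2"},
    {"tool": "review", "type": "approved", "commit": "b2c3"},
    {"tool": "ci", "type": "deploy", "commit": "b2c3"},
]

def unreviewed_deploys(events):
    """Join deploy events against review approvals across tools."""
    reviewed = {e["commit"] for e in events if e["type"] == "approved"}
    return [e["commit"] for e in events
            if e["type"] == "deploy" and e["commit"] not in reviewed]

print(unreviewed_deploys(events))  # → ['a1f9']
```

Neither the commit nor the deployment is suspicious on its own; only the cross-tool join surfaces the unreviewed release.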
In July, Safe Security raised $33 million for its platform to manage and mitigate cyber risk. Just a few months earlier, app security platform provider Pathlock nabbed $20 million in venture backing. And in the spring, Aqua Security, which protects containerized apps and infrastructure, closed a $135 million financing round. The cybersecurity market was valued at $156.24 billion in 2020 and is expected to reach $352.25 billion by 2026, according to Mordor Intelligence. Cycode says that it has “dozens” of customers, including Fortune 500 companies. Annual recurring revenue at the 55-employee company grew sevenfold in Q1 2021, Levy claims. "
3,633
2,021
"Amazon Web Services unveils enhanced cloud vulnerability management | VentureBeat"
"https://venturebeat.com/uncategorized/amazon-web-services-unveils-enhanced-cloud-vulnerability-management"
"Amazon Web Services unveils enhanced cloud vulnerability management Amazon Web Services (AWS) today announced several new features for improving and automating the management of vulnerabilities on its platform, in response to evolving security requirements in the cloud. Newly added capabilities for the Amazon Inspector service will meet the “critical need to detect and remediate at speed” in order to secure cloud workloads, according to a post on the AWS blog, authored by developer advocate Steve Roberts. The announcement came in connection with the AWS re:Invent conference, which began today. In a second security announcement, AWS unveiled a new secrets detector feature for its Amazon CodeGuru Reviewer tool, aimed at automatically detecting secrets such as passwords and API keys that were inadvertently committed in source code.
The security updates from AWS come as enterprises continue their accelerated shift to the cloud, even as security teams have struggled to keep up. Gartner estimates 70% of workloads will be running in public cloud within three years, up from 40% today. But a recent survey of cloud engineering professionals found that 36% of organizations suffered a serious cloud security data leak or a breach in the past 12 months. Changing cloud security needs In the post about the Amazon Inspector updates, Roberts acknowledged that “vulnerability management for cloud customers has changed considerably” since the service first launched in 2015. Among the new requirements are “enabling frictionless deployment at scale, support for an expanded set of resource types needing assessment, and a critical need to detect and remediate at speed,” he said in the post. Key updates for Amazon Inspector announced today include assessment scans that are continual and automated — taking the place of manual scans that occur only periodically — along with automated resource discovery. “Tens of thousands of vulnerabilities exist, with new ones being discovered and made public on a regular basis. With this continually growing threat, manual assessment can lead to customers being unaware of an exposure and thus potentially vulnerable between assessments,” Roberts wrote in the post. Using the updated Amazon Inspector will enable auto discovery and begin a continual assessment of a customer’s Elastic Compute Cloud (EC2) and Amazon Elastic Container Registry-based container workloads — ultimately evaluating the customer’s security posture “even as the underlying resources change,” he wrote.
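The shift Roberts describes, from periodic manual scans to continual assessment triggered by resource changes, can be sketched as follows. The scanner class, resource IDs and toy check are invented; this is not the Amazon Inspector API.

```python
# Hedged sketch of continual assessment: instead of scanning on a fixed
# schedule, every inventory change triggers an immediate reassessment of
# the affected resource. Resource names and the "scan" are invented.
class ContinualScanner:
    def __init__(self, scan_fn):
        self.scan_fn = scan_fn
        self.findings = {}   # resource_id -> current list of findings

    def on_resource_change(self, resource_id, config):
        # Reassess right away, so posture tracks the changing resource.
        self.findings[resource_id] = self.scan_fn(config)

def toy_scan(config):
    """Invented check: flag instances with SSH open to the world."""
    return ["open-ssh-to-world"] if config.get("port_22_open") else []

scanner = ContinualScanner(toy_scan)
scanner.on_resource_change("i-0abc", {"port_22_open": True})
scanner.on_resource_change("i-0abc", {"port_22_open": False})
print(scanner.findings["i-0abc"])  # → []
```

The point of the pattern is that findings update the moment a resource changes, rather than waiting for the next scheduled scan.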
More feature updates AWS also announced a number of other new features for Amazon Inspector, including additional support for container-based workloads, with the ability to assess workloads on both EC2 and container infrastructure; integration with AWS Organizations, enabling customers to use Amazon Inspector across all of their organization’s accounts; elimination of the standalone Amazon Inspector scanning agent, with assessment scanning now performed by the AWS Systems Manager agent (so that a separate agent doesn’t need to be installed); and enhanced risk scoring and easier identification of the most critical vulnerabilities. A “highly contextualized” risk score can now be generated through correlation of Common Vulnerability and Exposures (CVE) metadata with factors such as network accessibility, Roberts said. Secrets detector Meanwhile, with the new secrets detector feature in Amazon CodeGuru Reviewer, AWS addresses the issue of developers accidentally committing secrets to source code or configuration files, including passwords, API keys, SSH keys, and access tokens. “As many other developers facing a strict deadline, I’ve often taken shortcuts when managing and consuming secrets in my code, using plaintext environment variables or hard-coding static secrets during local development, and then inadvertently commit them,” wrote Alex Casalboni, developer advocate at AWS, in a blog post announcing the updates for CodeGuru Reviewer. “Of course, I’ve always regretted it and wished there was an automated way to detect and secure these secrets across all my repositories.” The new capability leverages machine learning to detect hardcoded secrets during a code review process, “ultimately helping you to ensure that all new code doesn’t contain hardcoded secrets before being merged and deployed,” Casalboni wrote. AWS re:Invent 2021 takes place today through Friday, both in-person in Las Vegas and online. 
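CodeGuru Reviewer's secrets detector uses machine learning; a far cruder illustration of the same idea is pattern matching on common credential shapes. The two patterns and the snippet below are invented examples and would miss much of what an ML-based detector catches.

```python
import re

# Hedged sketch: regex-based secret detection. Real detectors (like the
# ML-based CodeGuru feature described above) are far more robust; these
# patterns are illustrative only.
SECRET_PATTERNS = {
    "aws_access_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "generic_password": re.compile(r"password\s*=\s*['\"][^'\"]+['\"]", re.I),
}

def find_secrets(source: str):
    """Return the sorted names of patterns that match the source text."""
    return sorted(name for name, pat in SECRET_PATTERNS.items()
                  if pat.search(source))

snippet = 'db_password = "hunter2"  # TODO remove before commit'
print(find_secrets(snippet))  # → ['generic_password']
```

A real detector also has to suppress false positives from test fixtures and documentation examples, which is one reason ML-based approaches are attractive here.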
"
3,634
2,021
"Access control: Cerbos brings open source to user permission software | VentureBeat"
"https://venturebeat.com/uncategorized/access-control-cerbos-brings-open-source-to-user-permission-software"
"Access control: Cerbos brings open source to user permission software Cerbos cofounders Emre Baran (CEO) and Charith Ellawala (CTO) A new company is setting out to streamline how software developers and engineers manage user permissions in their software, while also addressing the myriad access control compliance requirements driven by regulations and standards such as GDPR and ISO-27001. Cerbos is applying a self-hosted, open source approach to the user permissions problem, one that works across languages and frameworks — and crucially, one that gives companies full visibility into how it’s handling user data.
To help build out its team and develop a commercial product on top of the open source platform, Cerbos today announced it has raised $3.5 million in a seed round of funding led by London-based VC firm Crane. IAM what I am It has been a bumper year in the identity and access management (IAM) realm, with Okta snapping up Auth0 for a cool $6.5 billion, One Identity buying rival OneLogin, and countless venture capital (VC) investments thrown into the identity management space. IAM, for the uninitiated, is chiefly concerned with authenticating and authorizing people, and controlling how, where, and when they can access specific systems and applications. At a time when every company is effectively a software company, managing user permissions becomes integral. Different users will often require different access rights based on their role and department, and companies need the infrastructure that enables their software to do this without having to create it all from scratch. For example, financial software might need to offer user permission functionality, so some employees can only submit expense reports, while others will be able to “approve” the expenses or mark them as “paid.” These various permissions might vary by team, department, and geographic location — and companies need to be able to set their own user permission rules. This essentially is where Cerbos enters the mix — it is the “AM” in “IAM,” allowing developers to implement access management in their own applications without having to reinvent the wheel. “We don’t try to handle the ‘I’ part, because it’s practically a solved problem,” Cerbos cofounder and CEO Emre Baran told VentureBeat. Above: Where Cerbos sits in the stack Cerbos would typically be used in tandem with one of the many identity authentication solutions out there, such as Google’s Firebase, Microsoft’s Active Directory (AD), Auth0, and WorkOS.
The step that follows authentication — authorizing identity and applying specific permissions — also has options, such as Open Policy Agent, Casbin, and CanCanCan, but these are somewhat “more limited,” according to Baran. “There are many libraries and frameworks that developers can take, enhance, and build into their product for authorization,” he said. “However, they are all focused on specific programming languages or frameworks and usually implement authorization for a single, monolithic application and don’t cater for the business users to define permissions in a human-readable way.” This is particularly important as companies move away from monoliths toward microservices — that is, software built from smaller, function-based components. “Being able to share your authorization logic across multiple different services — usually developed by different teams and potentially in different programming languages — and instantly update that logic across the board, without having to redeploy all of those services, is very powerful,” Baran added. “That’s what Cerbos provides.” Baran is an ex-Googler who went on to found an ecommerce personalization technology company called Qubit, which was acquired by Coveo just last month. He launched Cerbos back in March alongside software engineer Charith Ellawala, who previously worked at various tech companies such as Ocado, Qubit, and Elastic. It was at Qubit where the duo encountered the problem that they are now trying to fix with Cerbos — every time a company builds a new piece of software, engineers have to develop the user permissions infrastructure from scratch. “This is particularly true in large enterprises, where different departments or teams need to use the same software platform for distinctly different functions,” Baran explained. “It is a time-consuming and cost-inefficient way of working. 
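The decoupling Baran describes (shared authorization logic, updated centrally without redeploying each service) can be sketched by expressing policy as plain data that any service evaluates generically. This is a hypothetical illustration of the pattern, not Cerbos's policy format:

```python
import json

# Hypothetical human-readable policy, expressed as data rather than code.
# Swapping in a new policy document changes behavior across every service
# that evaluates it, with no redeploy of the services themselves.
POLICY_DOC = json.loads("""
{
  "resource": "expense_report",
  "rules": [
    {"roles": ["employee", "manager", "finance"], "actions": ["submit"]},
    {"roles": ["manager", "finance"], "actions": ["approve"]},
    {"roles": ["finance"], "actions": ["mark_paid"]}
  ]
}
""")

def is_allowed(policy: dict, role: str, action: str) -> bool:
    """Generic evaluator: any service, in any language, could implement this."""
    return any(
        role in rule["roles"] and action in rule["actions"]
        for rule in policy["rules"]
    )

print(is_allowed(POLICY_DOC, "manager", "approve"))   # True
print(is_allowed(POLICY_DOC, "employee", "approve"))  # False
```

Because the rules are data, a product manager can read and change them without touching application code, which is the "human-readable" property the article highlights.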
We’re enabling companies to be more compliant, and making higher quality security available to every developer.” Open for business That Cerbos is open source will likely be central to its appeal, particularly at a time when companies need to treat their users’ data with kid gloves to cater to a growing array of privacy regulations. Being open source allows companies to inspect its source code and contribute new code themselves, while self-hosting means they don’t have to transfer data to third-party infrastructure. Visibility and auditability are the name of the game here. “You know exactly what you are running in your system, and how it handles your data,” Baran said. “You also get to benefit from the community — the product is constantly improved and tested by people who are passionate about the problem. And even if the company [i.e. Cerbos] discontinues working on the product, you still have access to the source code and can continue to make use of it and improve it if it’s critical to your business.” Much like companies usually don’t build their own databases from scratch, choosing an off-the-shelf solution instead, Baran sees Cerbos fulfilling a similar role for user permissions — and so its target customer size is really anything from small startups to billion-dollar companies. However, it’s worth noting that user permission requirements tend to get more complex the bigger a company gets, which positions Cerbos strongly for the enterprise segment. “One thing they all have in common is that they all recognize that building permissions software is not their core business, and they would rather implement an off-the-shelf, state-of-the-art solution than build it themselves,” Baran said. 
“We believe in a world where time isn’t wasted re-inventing the wheel — in that world, our mission is to make authorization a trusted ‘plug-and-play’ solution.” For now, Cerbos is available in a pure open source incarnation, allowing any developer to leverage it as they see fit. However, the company is also working on various premium offerings, which will include a fully-managed version replete with a graphical user interface (GUI) for managing permissions and roles. Additionally, Cerbos will offer tools for auditing, monitoring, and analysis, alongside features for chief information and security officers such as “predictive unauthorized access prevention” smarts. Cerbos’s two founders are based in London, though as with most young startups these days, the company has adopted a globally distributed approach to its hiring, with seven employees spread across the U.K., New Zealand, Turkey, and Spain. In addition to lead backer Crane, Cerbos attracted a slew of institutional investors for its seed round of funding, including OSS Capital, Seedcamp, Earlybird Digital East, 8-Bit Capital, Connect Ventures, Acequia Capital, HelloWorld, Tiny, and a host of angel investors. VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings. The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat. All rights reserved. "
3,635
2,021
"Trustpage lands $5M to boost security transparency for SaaS vendors | VentureBeat"
"https://venturebeat.com/software/trustpage-lands-5m-to-boost-security-transparency-for-saas-vendors"
"Trustpage lands $5M to boost security transparency for SaaS vendors Trustpage, which enables software-as-a-service (SaaS) vendors to easily communicate their data security and privacy measures, today announced a $5 million round of seed funding to expand its team and ramp up its growth in the market. Founder and CEO Chase Lee said the company’s AI-powered platform allows business-to-business (B2B) SaaS companies to proactively address the types of questions that are increasingly being asked around handling data for privacy and security reasons. Companies use Trustpage to create a page — that’s added to their website — that covers all the steps they’ve taken to secure user data, prevent cyberattacks, and meet compliance requirements. 
B2B SaaS companies are where data security and privacy issues are “most acute—because 90% of our data passes through these companies now,” Lee said in an interview with VentureBeat. “So when we think about how we can put a dent in cybercrime and things like privacy infringement, it really is with these companies—because they’re the ones who handle the most data,” he said. A nutrition label for data security According to Cybersecurity Ventures, the total cost of cybercrime is expected to reach $6 trillion by the end of 2021, and to surge to $10.5 trillion by 2025. And yet, transparency about data privacy and security is often lacking, Lee explained. The process of exchanging information about handling user data remains manual in many cases — with companies often conducting a security review by trading emails or Google Docs. For companies that do disclose data security and privacy information on their website, it’s often a DIY page that is static and difficult to update — and doesn’t facilitate further information sharing, Lee said. Trustpage addresses these issues with its platform, which allows for easier creation and updating of a searchable data security disclosure page, he said. The page can provide a secure channel for collaborative sharing between software buyers and sellers, as well. The platform also offers AI capabilities that recommend answers to questions, recognize when content needs to be updated, and suggest updates. Additionally, Trustpage integrates into a customer’s CRM, so they can see when their security documents are being reviewed. The page that is created ultimately serves as something akin to a nutrition label for data security, privacy, compliance, reliability, and threat management, according to Trustpage. “We want to help companies lean into security and trust,” Lee said. 
“And the hope is that if we do that with enough people, that will really start to raise the bar for security across the industry.” Major benefits Bonfire Ventures, a seed-stage investment firm, co-led the round for Trustpage. Brett Queener, a partner at Bonfire Ventures, explained that what the startup is offering is sorely needed in the industry to meet the heightened expectations for data security and privacy. Along with largely eliminating a manual process of performing security reviews, providing transparency around data privacy and security measures can help to accelerate sales cycles, he said. Queener, formerly an executive vice president at Salesforce, said he’s spoken with numerous chief information security officers (CISOs) about Trustpage. And across the board, he detailed that the reaction has been that if Trustpage “does what it purports to do, and is not outrageously expensive, there’s no reason I wouldn’t buy this.” Trustpage, which became generally available in June, is also the only solution on the market that is “singularly focused” on enabling communication and collaboration around data security and privacy information, Queener said. “No one is doing what Trustpage is doing today.” Customer traction Trustpage currently has about 200 paying customers, Lee said. One of its customers is Dutchie, a provider of software for ecommerce and point of sale at cannabis dispensaries, which has been using Trustpage since June. “Trustpage is key for us because it’s allowing us to establish how we approach security and privacy and compliance very early on in the customer relationship,” said Chris Ostrowski, Dutchie’s chief technology officer. The platform has also completely replaced the manual process of providing potential customers with data security and privacy details, Ostrowski said. “Since we set up our Trustpage, those requests are down to zero,” he said. 
Ostrowski said he foresees trust pages following a trajectory similar to the one status pages took among technology companies. If a technology company “doesn’t have a status page, that calls into question their reliability,” he said. “The same is going to apply to security and trust.” Statuspage, a startup whose platform enabled customers to communicate about their service status, was acquired by Atlassian in 2016. Prior to Trustpage, Lee had cofounded note-taking app Fetchnotes (acquired by Drift in 2015) and referral marketing platform Ambassador Software (acquired by West Corporation, now Intrado, in 2018), where he served as chief technology officer from 2013 to 2020. Growth trajectory Lee founded Trustpage in April 2020, and it had been a self-funded operation up until this $5 million seed round, he said. Along with Bonfire Ventures, Ludlow Ventures and Detroit Venture Partners co-led the funding round. Other investors in the round are Entrée Capital, Basement Fund, and GTMfund. Detroit, Michigan-based Trustpage currently has 14 employees, and expects to increase its headcount to 20 by the end of the first quarter of 2022, with new hires planned in engineering and marketing, Lee said. Along with building out the team, the funding round will help to fund further maturing of the product, as well as bringing it to a wider number of customers, he said. “This year was really about laying a foundation and getting this from a proof-of-concept to a real business,” Lee said. “Next year is really going to be about driving awareness to what we’re doing and helping reach a lot more people.” Looking ahead, Trustpage also aspires to “build an entire community around what we’re doing,” he said. Lee said that Trustpage’s AI-powered security scanner has mapped about 6,000 companies and begun to build a directory of their security postures, based on what they’ve shared about their data security and privacy measures. 
The aim is to build a directory for security that’s similar to what Glassdoor has done for employee reviews. In the wake of software supply chain attacks, such as SolarWinds, this directory has the potential to help companies with securing their supply chains, Lee explained, “as well as just creating more transparency for the software industry.” "
3,636
2,021
"The surprising health care security vulnerabilities that need shoring up | VentureBeat"
"https://venturebeat.com/security/the-surprising-health-care-security-vulnerabilities-that-need-shoring-up"
"Sponsored The surprising health care security vulnerabilities that need shoring up Presented by Armis Over the last five years, health care institutions have undergone a digital transformation. They’ve gone from manual, internal processes for sharing information to integrated workflows with electronic health records, bringing medical devices and information online and into the health IT ecosystem. Those application suites have improved everything from communication to quality of care and collaboration between clinical teams. “On the flip side, the digital transformation has also made IT departments in these organizations acutely aware of the brand-new vulnerabilities that these solutions incur — and the upstream/downstream effect that security breaches can have on the organization,” says Sumit Sehgal, strategic product marketing director at Armis, a unified asset visibility and security organization. 
The new ecosystem vulnerabilities From blood glucose monitors to medical imaging devices and pacemakers, there are already 430 million connected medical devices worldwide, spread across departments and use cases. And similar devices are connected in different ways. For example, a hospital may deploy four different types of pacemakers to keep a patient’s heart rhythm constant, but depending on the age and makeup of the devices, how they communicate and interact with IT infrastructure can be completely different. The same can be said for all of the CT scanners, MRI machines, and pharmacy systems in use. “When we talk about connected devices, it’s not so much just understanding what the device is,” Sehgal says. “It’s about understanding how that device is functioning in context for that health system that is providing care.” These devices have added an entirely new layer of complexity for IT departments to handle, from patch levels to security configurations to updates, as well as how they’re communicating across health care systems, and how they’re connected to other sensitive networks. “That’s probably the biggest set of vulnerabilities that health care organizations have to deal with,” Sehgal says. “That’s on top of the work they already do for all the other IT infrastructure as well.” For example, infrastructure ecosystem vulnerabilities extend to the environmental controls that are critical to medical care, such as the building management systems and everything from potable water delivery to pressure control sensors and elevator control mechanisms. “The second level of the vulnerability assessment process comes with the realization that there’s more than just medical devices to deal with,” Sehgal says. 
“It’s a big challenge for health care institutions, and they have limited expertise in how to deal with it.” The impact of health care ecosystem vulnerabilities “It’s not just about the impact of what happens when these vulnerabilities are exploited. It’s also tied to the resilience of the health systems and how effective they are in trying to navigate these situations that dictate the outcome,” Sehgal says. From an IT risk management perspective, the most obvious impact of this digitization of health care is the risk of data breaches. Depending on how information is flowing in and out of these devices, and how they’re integrated into the hospital network, the risks range from leakage to unauthorized access of confidential, protected health information. A Ponemon Institute study found that 54% of health care providers experienced at least one patient data breach within the past two years, while 41% encountered six or more. Each of these breaches can expose around 10,000 patient records and cost $2.75 million on average. Less commonly considered is the impact of digital technology breakdowns. Clinicians increasingly rely on these systems to make treatment decisions, and breakdowns directly affect their ability to deliver care. These failures can also disrupt the continuity of operations, delaying the process of care and delivery of medication, and often incurring financial overhead because alternate workflows aren’t nearly as efficient. Sehgal notes that while the impact on patient safety is often talked about, actual harm is rare because patients are protected by layers of workflows; the real pain lies in needing to rejigger workflows, cost issues, and delays in care. Securing care delivery Right from the start, every organization that’s having a conversation on medical device security needs to understand that this is not an IT-only or clinical-only undertaking, Sehgal says. 
Security requires a governance process that pulls clinical teams together with biomedical, IT, and information security, for conversations around what success looks like for your own health system. Secondly, you need to retool your security team training and ensure that partners helping you with security response or security strategy have the appropriate capability. But most importantly, you need to focus on more than just medical devices. “Medical devices are the start of it, but that’s not everything that touches your patient,” Sehgal explains. “From point of admission to point of discharge, your information can be passed through about 30 applications and touched by a couple of hundred people.” He urges organizations to look at security approaches such as vulnerability management, including asset management, and threat modeling to calculate the risks based on the vulnerabilities and probability of impact tied to clinical area and patient flow. “Given the issues with money, time, and priorities that organizations have today, what they can do is to take what processes they already have and try to apply a more real-time approach to vulnerability management and modeling,” Sehgal explains. “Whether they’re dealing with things like a building management system or a CT scanner, everything is tied to an outcome in relation to how they deliver care to a patient.” Organizations can also focus on their tabletop drills — going through theoretical scenarios and developing responses. Tabletop drills are important because they illustrate potential choke points or potential single points of failure in a response process. What they don’t do is actually give you the muscle memory of what it takes to respond to a scenario in real-time. 
For example, a tabletop drill might suggest replacing a compromised asset in four minutes — but that’s not accounting for the real world, in which it might take ten minutes just to find an IT staff member, and then there’s an additional ten-minute walk for that individual to get to the compromised asset. That adds 20 minutes per device; multiplied across 400 affected devices, it suddenly creates a massive, unaccounted-for problem. Health care institutions that are doing security right are collaborating with the security operations team to develop their emergency management functions, bringing together security response and operations continuity. “Leveraging solutions in this space, like those that Armis provides, helps health care organizations identify very quickly where their weak points are, and they can then take that information and map it to how they deliver care,” Sehgal explains. Setting priorities for risk management Sehgal notes that security solutions are very good at letting people know what they have and what’s wrong with it, but don’t automatically know what’s important. Which specialties are important from a revenue perspective for a health system? Which ER workflow is more important? “Every health system has their risk tolerance with regard to what they can live with and without in the context of how they provide care,” Sehgal says. “That’s not an IT solution problem. That’s a process problem that needs to be addressed.” To help manage priorities, an organization can identify a manageable scope, isolating and focusing on a specific area, a specific specialty, or a specific project. That helps them adjust to the new reality of securing more than just IT assets, from a process perspective and a workforce training perspective as well. 
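The arithmetic in the tabletop-drill example above is easy to make concrete; the figures below (four minutes budgeted, 20 extra minutes per device, 400 devices) come straight from the scenario:

```python
# Figures from the tabletop-drill scenario described above.
budgeted_minutes_per_device = 4     # what the drill assumes per compromised asset
extra_minutes_per_device = 10 + 10  # find an IT staff member, then walk to the asset
devices_affected = 400

unaccounted_minutes = extra_minutes_per_device * devices_affected
print(f"Unaccounted-for response time: {unaccounted_minutes} minutes "
      f"(~{unaccounted_minutes / 60:.0f} hours)")
# Unaccounted-for response time: 8000 minutes (~133 hours)
```

Roughly 133 hours of response time that the drill never budgeted for, which is why live exercises, not just tabletop walkthroughs, matter.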
From there, they can have the conversation inside the appropriate clinical and enterprise risk management teams in the hospital environment, to have what Sehgal calls “an introspectively honest conversation” about the capacity of the organization in terms of dealing with incidents. “They should ask what are the thresholds for downtime and data loss we as an organization deal with and for how long,” says Sehgal. “Plus, which partners can they work with to make sure they have the help they need when things go awry.” Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. Content produced by our editorial team is never influenced by advertisers or sponsors in any way. For more information, contact [email protected]. "
3,637
2,021
"Mainframe Industries raises $22.9M for cloud-native games | VentureBeat"
"https://venturebeat.com/pc-gaming/mainframe-industries-raises-22-9m-for-cloud-native-games"
"Mainframe Industries raises $22.9M for cloud-native games Mainframe Industries is making a cloud-native MMO. Mainframe Industries has raised $22.9 million to develop cloud-native games in a financing led by Andreessen Horowitz and a new set of investors. The Finnish-Icelandic gaming studio founded by former CCP, Remedy Entertainment, and Next Games veterans is developing an unannounced cloud-native massively multiplayer online game that is designed and built at Mainframe’s Helsinki and Reykjavik studios. Andreessen Horowitz (a16z) previously led an $8.3 million round for Mainframe in 2020. In cloud gaming, a game is computed in internet-connected datacenters, and then the results are sent via video to a user’s screen. 
That enables high-end games to tap the vast computing and memory systems in the cloud and then display the results on a user’s machine, whether that’s a low-end laptop, a desktop gaming PC, a mobile device, or a console display such as a TV. A cloud-native game lets the company develop a single runtime in the cloud that gives the player access to faster and greater amounts of storage, resulting in a more vibrant and detailed world, according to CEO Thor Gunnarsson. Above: Mainframe’s BlueLands. “Andreessen Horowitz has opted to double down and continue to support us, which of course is amazing,” said Gunnarsson in an interview with GamesBeat. “As a European and Nordic game startup, this is big news over on this side of the pond. We’ve been working hard to build this team, and it is fantastic to see the co-investors who have joined this round.” Cloud gaming also allows for seamless crossplay between mobile, PC and console, allowing players to access their game from any device and play against anyone on any machine. Streaming games via the cloud is a lot like what Netflix or Spotify did for film and music. More details about the game will be revealed later, Gunnarsson said. It is an open world, sandbox MMO, and the aim is to build out a persistent online world accessible from any screen. In addition to Gunnarsson, the Mainframe Industries founders are Börkur Eiríksson, Kjartan Pierre Emilsson, Fridrik Haraldsson, Reynir Hardarson, Sulka Haro, Kristján Valur Jónsson, Jyrki Korpi-Anttila, Saku Lehtinen, Ansu Lönnberg, Eetu Martola, Vigfús Ómarsson, and Jón Helgi Thórarinsson. Above: Deer in Mainframe’s forest. “We’re having a blast,” Gunnarsson said. “It is a lot of fun, a lot of work. 
It’s been an interesting period to build out a team across Europe in the middle of these distributed times that we’re living in, but we’re coming out the other end even better than I had hoped as we got started. And we’re not too far off from being able to share some more details on our project and about the great group of talent that has been joining the company in the past 18 months. It’s been really encouraging.” New investors in this round include Twitch cofounder Kevin Lin; Anton Gauffin, CEO of Huuuge Games; Cédric Maréchal, former senior vice president of international at Activision Blizzard; and Taavet+Sten (founded by entrepreneurs Taavet Hinrikus and Sten Tamkivi). Andreessen Horowitz’s Jonathan Lai is joining Mainframe’s board. The team size has more than doubled in the past year to more than 55 people, with key hires from Amazon Web Services, Blizzard, and Ubisoft. The company’s staff spans 11 nationalities. By the end of the year, Gunnarsson expects the staff will hit 60. Above: The evening in Mainframe’s wilderness. The company also has a new studio in Paris. Finland’s government also supported the company with a research-and-development loan early on. “The public support program is a great example of the financing that European startups can take advantage of,” Gunnarsson said. The company is using Epic’s Unreal Engine to develop its first game, and Gunnarsson said the team is making the transition to Unreal Engine 5, which is expected to be formally released next year. “We’re looking to push as aggressively as we can in terms of visual features and taking advantage of some of the new functionality that they bring with Unreal Engine 5,” Gunnarsson said. “We’re very focused on creating an MMO that is playable across mobile, PC, and console. The game will support both active players as well as spectators who can support their friends in the game.” “That’s some of the innovation that we are hoping to drive in the industry,” Gunnarsson said. 
GamesBeat's creed when covering the game industry is "where passion meets business." What does this mean? We want to tell you how the news matters to you -- not just as a decision-maker at a game studio, but also as a fan of games. Whether you read our articles, listen to our podcasts, or watch our videos, GamesBeat will help you learn about the industry and enjoy engaging with it. Discover our Briefings. Join the GamesBeat community! Enjoy access to special events, private newsletters and more. "
3,638
2,021
"Activision drops new details on Warzone Pacific map Caldera debuting December 8 | VentureBeat"
"https://venturebeat.com/pc-gaming/activision-drops-new-details-on-warzone-pacific-map-caldera-debuting-december-8"
"Activision drops new details on Warzone Pacific map Caldera debuting December 8 Warzone's Caldera map is coming on December 8. Call of Duty: Warzone is getting its first really new map on December 8 in the form of Caldera, the Warzone Pacific map set in Call of Duty: Vanguard. It’s the first major map change since the debut of Warzone in March 2020, when the pandemic-driven lockdowns drove audience levels for the free-to-play battle royale to more than 100 million. Of course, this is the second map change if you count the redo of the Verdansk map in April 2021 to an earlier time period (it was still pretty similar to the original map). The map will feature a new dedicated playlist (of new scenario types) with new vehicles and weapons from Call of Duty: Vanguard. The huge map has 15 distinct areas to explore and fight across. 
In order for Vanguard players to access Warzone Pacific 24 hours early, they need to have played at least one Vanguard multiplayer game. Above: Warzone’s Caldera map has 15 sections. As part of the Season One update, Ricochet Anti-Cheat will deploy a new, internally developed kernel-level driver on PC to assist in identifying cheaters in Warzone. This kernel-level driver is coming first to Warzone and will be required for all PC players as of this update. Caldera and Warzone Pacific are just the start of the free content to be offered in Season One of Call of Duty: Vanguard and Warzone. More info is coming on two free functional weapons, new Operators, Zombies content, multiplayer modes, and two new multiplayer maps, all launching on December 8. Above: Caldera’s views of the ocean. Later in the season, fans can expect two more Operators, an additional core multiplayer map and a returning small-team tactical mode, additional Zombies info, another free weapon unlocked via challenges, and a festive celebration across both games. The map has a series of locations, including arsenal, docks, runway, ruins, mines, peak, beachhead, village, power plant, capital, resort, sub pen, lagoon, airfield, and fields. It has hundreds of lesser points of interest as well. In terms of competition, Battlefield 2042 is already out, and Halo: Infinite debuts on December 8. Activision said Vanguard Royale has a few differences compared to a traditional battle royale. Players can fly fighter planes that rain down fire on foes or head for the anti-aircraft guns or trucks to shoot them out of the skies. Other ground-based vehicles, including a squad-transport all-terrain car, will help your squad cut through and around the island. One of the big changes is that it will focus on World War II weapons, not modern weapons.
Loadouts can only contain Vanguard weaponry, streamlining the meta for more accessibility and room for experimentation. Expect all weapons around the island to also be from Vanguard. Operator selection is also limited to Vanguard soldiers, which gets rid of anachronistic gameplay anomalies. The radii and times for circle collapses will be altered to offer a different pace of play, especially with dogfighting overhead. Also, expect an in-game event at each circle collapse, bombing runs, and different items, both within Caldera’s overworld and as contract rewards. "
3,639
2,021
"Atomera breathes new life into Moore's law with better power efficiency | VentureBeat"
"https://venturebeat.com/mobile/atomera-breathes-new-life-into-moores-law-with-better-power-efficiency"
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Atomera breathes new life into Moore’s law with better power efficiency Share on Facebook Share on X Share on LinkedIn Atomera has created a quantum film layer that helps enhance transistors. Are you looking to showcase your brand in front of the brightest minds of the gaming industry? Consider getting a custom GamesBeat sponsorship. Learn more. Chip startup Atomera said it can breathe new life into Moore’s law with better power efficiency for analog chips. Los Gatos, California-based Atomera said its Mears Silicon Technology Smart Profile — a quantum-engineered film based on 15 years of research and development — is available for licensing to use in 5-volt power and analog electronics. It could enable certain kinds of electronics to be produced with 20% more working parts per batch of chips produced, and that could yield significant improvements in costs. The MST Smart Profile is an integration of the films that go with a specific asymmetric transistor design with the details (dopant implants) optimized to create a five-volt power device. 
Through a combination of atomic level engineering and advanced material science, publicly traded Atomera is squeezing more capability and capacity out of today’s semiconductor processes. The resulting improvements in power, performance, and area (PPA) — the standard measure of Moore’s law — are effectively enabling the industry to get smaller die size using the same process node. Moore’s law, first talked about by Intel chairman emeritus Gordon Moore in 1965, became the metronome of the chip industry. It predicted that the number of components on a chip would double every couple of years. And until recently, it held up remarkably well. But as the physical dimensions of chip miniaturization get smaller and approach the level of individual atoms, it’s hard to make advances. And so many have begun to predict a slowing down of Moore’s law and an accompanying slowdown of technological innovation. That’s not good for the economies of the world. Jeff Lewis, vice president of business development and marketing at Atomera, said in an interview with VentureBeat that half of the world’s semiconductor manufacturers are engaged in conversations with Atomera. And Atomera has collected more than 300 patents related to the tech over the years. Lewis said that the technology works for 5-volt transistor products. Above: Atomera is making chips more power efficient. “It gives a substantial improvement in performance and area to help you get about 20% or more chips out on a wafer than you could in the past,” Lewis said. “Transistors are omnipresent. They’re in almost everything we have out there. And in fact, we just saw an article that DDR5 chips can’t ship anymore because of shortage of the power-management devices that go with it.” Lewis said the creation of the new technology could help address the backlog and supply shortage problems in the industry. He said chipmakers that adopt the tech could get more value out of existing chip factories. 
“It’s really a very broadly applicable technology. And what it really does is enhance transistors,” Lewis said. “You’ll get more drive current out of a transistor, and you can use that in a variety of ways.” While digital chip technologies have benefitted greatly from Moore’s law, there is a significant market for bipolar CMOS-DMOS (BCD) semiconductors that are built today in legacy nodes ranging from 40 nanometers (nm) to 180nm. BCD is a family of silicon processes, each of which combines the strengths of three different process technologies onto a single chip: Bipolar for precise analog functions, CMOS (Complementary Metal Oxide Semiconductor) for digital design, and DMOS (Double Diffused Metal Oxide Semiconductor) for power and high-voltage elements. STMicroelectronics says this combination of technologies brings many advantages: improved reliability, reduced electromagnetic interference, and smaller chip area. BCD has been widely adopted and continuously improved to address a broad range of products and applications in the fields of power management, analog data acquisition, and power actuators. According to The McClean Report from IC Insights, the major user of BCD processes is the power management integrated circuit (PMIC) sector, which had a market size of $14.6 billion in 2020 and is forecast to grow to $24.9 billion in 2025. All these chips are like the condiments that go with your hamburger patty, and they’re quite essential to the functioning of any electronic device, Lewis said. PMICs are used in nearly all electronic devices. PMICs are required for any device that is powered by a battery or USB connector, and they are used to scale line voltage (110V or 220V) down to semiconductor voltages of 0.9V – 5V. PMICs are now appearing in unexpected applications – they are now included on DDR5 modules rather than on the motherboard – and a shortage of these PMICs is impacting DDR5 shipments.
The growth can be attributed to the projected increase in mobile and other devices that use sophisticated power management techniques. For example, according to Hui He, an analyst at research firm Omdia, as reported by the Wall Street Journal, “a typical 5G smartphone can hold as many as eight power-management chips, compared with two to three in a 4G phone.” “I have worked in the analog and power device sector for a long time and have witnessed firsthand the challenges to scaling these devices compared to digital,” said Lou Hutter, principal at Lou Hutter Consulting, in a statement. “Combining this ‘scaling gap’ with the increasing prevalence of these devices is certainly one of the factors behind industry shortages we see in these devices. Atomera’s MST-SP technology can significantly shrink the power transistors that routinely occupy 40% to 80% of the area in a PMIC, which enables manufacturers to get 20% more die per wafer — and with lower power consumption to boot.” BCD technologies face more difficult scaling challenges than their digital counterparts and as a result have not seen the process node advances that digital chips have. While some market leaders have introduced advanced BCD processes at the 40nm nodes, most BCD devices are produced at older-generation process nodes. MST-SP allows BCD PMIC manufacturers to get up to 20% more die per wafer, enabling manufacturers to improve the profitability of existing fabs and/or improve the return on their investments in new processes and capacity. Atomera’s chief technology officer Robert Mears said in an interview that power-related chips are proliferating in everything from battery-operated devices to those with universal serial bus (USB) connectors. And keeping the costs manageable while hitting performance and power targets isn’t easy with the current state of affairs. Above: There are a lot of power chips in iPhone 13 models. 
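As a rough illustration of where a figure like Hutter’s 20% comes from, here is a back-of-the-envelope yield calculation. All numbers (wafer size, die area, the 60% power-transistor fraction, and the 28% transistor shrink) are illustrative assumptions chosen for the example, not Atomera’s data:

```python
# Back-of-the-envelope yield arithmetic for a die-area shrink.
# All numbers are illustrative assumptions, not Atomera's data.
import math

def dies_per_wafer(wafer_area_mm2: float, die_area_mm2: float) -> float:
    """First-order estimate: ignores edge loss, scribe lines, and defects."""
    return wafer_area_mm2 / die_area_mm2

# A 300 mm wafer and a hypothetical 10 mm^2 PMIC die.
wafer_area = math.pi * (300 / 2) ** 2   # ~70,686 mm^2
die_area = 10.0

# Assume power transistors occupy 60% of the die and shrink by 28%.
power_fraction = 0.60
transistor_shrink = 0.28
new_die_area = die_area * (1 - power_fraction * transistor_shrink)  # 8.32 mm^2

gain = dies_per_wafer(wafer_area, new_die_area) / dies_per_wafer(wafer_area, die_area) - 1
print(f"Extra dies per wafer: {gain:.0%}")  # Extra dies per wafer: 20%
```

In practice, dies per wafer also depends on edge loss, scribe lines, and defect density, so this is only first-order arithmetic.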
“The difference here is you’ve got one technology that improves both flavors of transistor,” Mears said. “And it’s also complementary and additive. It has an inherent mobility benefit, depending a little bit on which node you’re looking at.” The quantum film layer gives chip creators a unique ability to control the doping, or placement of dopant chemicals in a particular spot on the chip, across a wide range of applications, Mears said. The challenge limiting the ability to transition BCD power devices to smaller manufacturing nodes has been sufficiently improving the on-resistance for a given breakdown voltage while ensuring reliability isn’t compromised. It is easy to lower the resistance of a device, but it will then burn out at a lower voltage; there is a tradeoff between on-resistance and the ability to withstand higher voltages. Typically, one targets an operating voltage and then lowers the on-resistance just to the point where the device still operates at that voltage, but no further. MST-SP provides two fundamental benefits. One is on-state mobility, meaning higher IDlin, the current a transistor passes in the linear regime (at a low voltage between source and drain). The second is an ability to control the doping with a degree of precision that is not possible with other approaches. The doping benefits also translate into improvements in the breakdown voltage, as well as the overall ability to scale the gate length lower without losing breakdown voltage. The net effect is a 20% improvement in IDlin for a given maximum VDS, a key reliability parameter for the lifetime of the device. (VDS is the source-drain voltage across the transistor.) The ability to scale the gate length while maintaining reliability also addresses the key challenge in moving BCD power devices to smaller nodes.
By enabling a gate-length shrink without compromising the reliability of the power device, manufacturers can take better advantage of a design-rule shrink to reduce the overall device pitch. For 5V PMICs, this enables up to 20% more dies per wafer. Above: The benefits of Atomera. MST-SP is available to license today for 5V power devices, which is the predominant operating voltage for BCD PMIC devices today. Atomera’s engineering team can customize MST-SP for both higher and lower voltages. Before the company went public, it raised $83.4 million. And as a public company, it raised $80 million. The solution is complementary and additive compared to other efforts chip makers are making, Lewis said. And in contrast to past licensors like Rambus, which got into legal troubles with chipmakers over licensing, Lewis said that the chip industry is free to license the technology or not. Regarding performance, Mears said, “We haven’t seen anybody coming close to these numbers, as well as the smallest area and cost across different processes.” "
3,640
2,021
"Complexity Stars blends celebrities and esports athletes to engage gamers | VentureBeat"
"https://venturebeat.com/esports/complexity-stars-blends-celebrities-and-esports-athletes-to-engage-gamers"
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Complexity Stars blends celebrities and esports athletes to engage gamers Share on Facebook Share on X Share on LinkedIn Complexity Stars blends esports stars and celebrities. Are you looking to showcase your brand in front of the brightest minds of the gaming industry? Consider getting a custom GamesBeat sponsorship. Learn more. Complexity and its GameSquare Esports parent firm are revealing Complexity Stars today as an initiative that will blend celebrities and esports pros to engage with gamers. The initiative will blend the pro athletes from a leading esports organization with celebrities from other professions in a bid to draw more attention to esports content and engage gaming fans. It brings together the buzz around gaming, esports, professional athletes, and celebrities. It could pay off in getting a lot more views for the content they stars are making, and that could generate more revenue. Complexity Stars will debut with a roster of star athletes from the NBA, WNBA, UFC, NFL, and MLB. 
“Ever since our GameSquare acquisition back in July, Complexity Stars was something that was kind of top of mind for how we find ourselves in this really unique intersection with esports and traditional sports,” said Kyle Bautista, chief operating officer of Complexity, in an interview with GamesBeat. “The question was how we bring these things together and use our ownership and relationship with Jerry Jones in the association with the Dallas Cowboys.” Toronto, Canada-based GameSquare Esports is a publicly traded esports and creator company. In July, GameSquare bought majority control of Complexity, which was started by Jason Lake and owned by Dallas Cowboys owner Jerry Jones and real estate mogul John Goff. Complexity will curate customized strategies for Complexity Stars athletes to amplify their brands in the gaming community and engage the next generation of sports fans. Through tournaments, Complexity will organize crossover competitions, esports tournaments, and other activations to create impactful brand-integration opportunities. It will also focus on content creation so athletes can easily livestream and create compelling content distributed across social channels to connect with gaming and sports fans. And Complexity will connect sponsors, create branded content, and develop additional opportunities for brands to connect with fans. Above: Complexity Gaming’s headquarters. Complexity Stars’ diverse roster of athletes includes former NBA guard J.R. Smith; Olympic gold medalist and WNBA Dallas Wings shooting guard Allisha Gray; UFC fighters Sean O’Malley, Max Holloway, and Megan Anderson; Tampa Bay Buccaneers running back Leonard Fournette and Baltimore Ravens offensive tackle Ronnie Stanley; and Los Angeles Dodgers infielder Edwin Rios.
“Complexity Stars is an incredible gaming division where sports, entertainment, and gaming come together,” said Duane “Zeno” Jackson, GameSquare head of talent and special projects, in a statement. “There are amazing opportunities to collaborate with global talent through compelling content and tournaments that appeal to brands seeking to connect with the large, affluent, and growing gaming audience.” Complexity Stars is supported by the GameSquare family of companies, which bring deep expertise in digital media, marketing, and influencer representation in the United States and Europe. Complexity Stars is co-located with the Dallas Cowboys, and so the thinking was to leverage the relationship more. Bautista said that many of the traditional athletes are video game fans who go back home at night to play their favorite games. And so Complexity Stars brings those passionate gamers on both the sports and esports fronts together. “These groups also share the same values as Complexity, as they want to use their platforms to do something more, give back to the community, and serve those who are underserved,” Bautista said. “So we came up with this really diverse list of athletes from all different walks of life, both men and women. It has been a really exciting project for us being able to talk with them about their passions.” “Monetization is definitely something that we are looking at. I think we see it as more on the events side. I think that being able to utilize these talent in pro/am events is something that we’re really excited about,” Bautista said. “These are superstar athletes and celebrities that all have their own individual following and fans. So being able to create IP utilizing their brands in combination with Complexity Stars gives us great options.
We’re going to be looking to monetize them a lot on the event side of things, but they are celebrities and influencers in their own right as well.” Complexity has more than 25 people working for it and more than 60 pro esports athletes. Complexity Stars will have some kickoff events soon that help galvanize Complexity’s 30 million fans. Physical esports events are starting to trickle back, Bautista said. He noted that during COVID there was a sharp downturn for physical esports and a big gain for online esports events, but he said he hopes that physical esports will return in the new year. “We’ve had a great year, as we picked up a Rocket League team in North America that qualified for the Rocket League major event in Stockholm. Our Apex teams continue to do well, as are our World of Warcraft and Valorant teams,” he said. “It’s been exciting.” "
3,641
2,021
"Wonder launches online networking platform where people can meet | VentureBeat"
"https://venturebeat.com/entrepreneur/wonder-launches-online-networking-platform-where-people-can-meet"
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Wonder launches online networking platform where people can meet Share on Facebook Share on X Share on LinkedIn Wonder is going after online events. Are you looking to showcase your brand in front of the brightest minds of the gaming industry? Consider getting a custom GamesBeat sponsorship. Learn more. Virtual networking startup Wonder announced today the launch of the next iteration of their video conferencing software. Berlin-based Wonder , previously known as YoTribe, has created a virtual space where groups can get together to create connections in a way that the company says feels natural and energizing. The company raised $11 million a year ago. Guests move around the space freely, choosing their conversation partners and actively shaping the event. This makes space for serendipity and fosters interactions and connections between participants. It’s another kind of replacement for our Zoom calls, and it’s attacking the problem of tech fatigue during the pandemic. 
Though virtual events and video conferencing have become the new standards, participants don’t feel like they are truly “connecting,” according to a new survey from the company. With 84% of respondents admitting to not paying full attention to video conferences, it’s no surprise so many of us feel drained and detached when on a video call, Wonder said. However, events on Wonder generate energy and real serendipitous human connection by leveraging simple yet sophisticated tools and a natural need to interact with others, the company said. “Social spaces are at the core of our lives,” said Wonder cofounder Stephane Roux in a statement. “Our desk at the office and the connections we made there, the hotel bar where we met a slew of new people at that last great conference, the courtyard in college where we met some of our now oldest friends. Those spaces are filled with meaning because of the shared experiences we made with others there, and we want to build those spaces online.” Above: Wonder lets you stage online events where people can network. Wonder is a browser-based 2D virtual space in which users move their avatars around freely. When they get close to other avatars, a video chat opens. The spaces contain up to 500 participants, and users can move between groups to meet others and experience the event together. Hosts can customize this space with backgrounds such as a corporate logo, solid color, abstract image, their own uploaded image, and more.
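The avatar-proximity mechanic described above can be sketched as a simple distance check. This is a hypothetical model, not Wonder’s actual implementation; the radius and coordinates are made-up values:

```python
# Minimal sketch of proximity-triggered chat grouping, assuming avatars
# on a 2D plane and a fixed "hearing" radius. Hypothetical model, not
# Wonder's actual implementation.
import math
from itertools import combinations

CHAT_RADIUS = 50.0  # assumed distance at which a video chat opens

def chat_pairs(avatars: dict[str, tuple[float, float]]) -> list[tuple[str, str]]:
    """Return pairs of avatars close enough to share a video chat."""
    pairs = []
    for (a, pa), (b, pb) in combinations(avatars.items(), 2):
        if math.dist(pa, pb) <= CHAT_RADIUS:
            pairs.append((a, b))
    return pairs

positions = {"ana": (0, 0), "ben": (30, 40), "cho": (300, 300)}
print(chat_pairs(positions))  # [('ana', 'ben')] — ana and ben are exactly 50 apart
```

A real system would also merge overlapping pairs into conversation groups (for example, with a union-find structure) and re-evaluate continuously as avatars move.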
Within that “space,” hosts can build “areas,” which give structure to the space and can represent different themes within an event or conference, such as “Breakout Session” or “Meet and Mingle.” More than 75% of Wonder’s current users, including organizations as diverse as NASA, Deloitte, Harvard, Vodafone, and PayPal, said the freedom of movement and ability to initiate conversations with others offers flexibility and a more engaged audience. “When COVID hit, we all did the best we could with what we had: conference calls became video calls, etc.,” said Roux. “But in the virtual space we don’t have to do things as we always have. We can reimagine how we connect and interact, and rather than trying to recreate the physical world, we can create a new, dynamic experience for our new normal.” Wonder is currently free and available on most web browsers. It has had more than 3.8 million participants and 200,000 hosts to date. Wonder has 45 employees, and it has raised $12 million to date. Regarding competition, Wonder said, “We think that the virtual events market is still in its early stages. The reason people get together is to build connections with others and to meet new people, and there still isn’t really any products out there where this feels natural and energizing. Wonder is designed for human interaction at its core. We’re less focused on event logistics and more on facilitating connections and conversations between people (we actually routinely have talks with companies like Hopin on how to integrate and partner).”
"
3,642
2,021
"ML-driven tech is the next breakthrough for advances in biology | VentureBeat"
"https://venturebeat.com/datadecisionmakers/ml-driven-tech-is-the-next-breakthrough-for-advances-in-digitized-biology"
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Community ML-driven tech is the next breakthrough for advances in biology Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here. This article was contributed by Luis Voloch, cofounder and chief technology officer at Immunai Digital biology is in the same stage (early, exciting, and transformative) of development as the internet was back in the 90s. At the time, the concept of IP addresses was new, and being “tech-savvy” meant you knew how to use the internet. Fast-forward three decades, and today we enjoy industrialized communication on the internet without having to know anything about how it works. The internet has a mature infrastructure that the entire world benefits from. We need to bring similar industrialization to biology. Fully tapping into its potential will help us fight devastating diseases like cancer. 
A16z has rephrased its famous motto of “Software is eating the world” to “Biology is eating the world.” Biology is not just a science; it’s also becoming an engineering discipline. We are getting closer to being able to ‘program biology’ for diagnostic and treatment purposes. Integrating advanced technology like machine learning into fields such as drug discovery will make it possible to accelerate the process of digitized biology. However, to get there, there are large challenges to overcome. Digitized biology: Swimming in oceans of data Not long ago, gigabytes of biological data were considered a lot; now we expect the biological data generated over the coming years to be counted in exabytes. Working with data at these scales is a massive challenge. To face this challenge, the industry has to develop and adopt modern data management and processing practices. The biotech industry does not yet have a mature culture of data management. Results of experiments are gathered and stored in different locations, in a variety of messy formats. This is a significant obstacle to preparing the data for machine learning training and doing analyses quickly. It can take months to prepare digitized data and biological datasets for analysis. Advancing biological data management practices will also require standards for describing digitized biology and biological data, similar to our standards for communication protocols. Indexing datasets in central data stores and following data management practices that have become mainstream in the software industry will make it much easier to prepare and use datasets at the scale we collectively need. For this to happen, biopharma companies will need C-suite support and widespread cultural and operational changes. Welcome to the world of simulation It can cost millions of dollars to run a single biological experiment.
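The dataset-indexing practice advocated in the data section above can be sketched in a few lines. The schema fields here (assay, organism, URI, tags) are assumptions chosen for the example, not an established metadata standard:

```python
# Sketch of a central dataset index with a standard metadata record,
# illustrating the indexing practice described in the text. The schema
# fields are assumptions for the example, not an established standard.
from dataclasses import dataclass, field

@dataclass
class DatasetRecord:
    name: str
    assay: str          # e.g. "scRNA-seq"
    organism: str
    uri: str            # where the raw files actually live
    tags: list[str] = field(default_factory=list)

class DatasetIndex:
    """In-memory stand-in for a central, queryable dataset catalog."""

    def __init__(self) -> None:
        self._records: dict[str, DatasetRecord] = {}

    def register(self, rec: DatasetRecord) -> None:
        self._records[rec.name] = rec

    def find(self, assay: str) -> list[DatasetRecord]:
        return [r for r in self._records.values() if r.assay == assay]

index = DatasetIndex()
index.register(DatasetRecord("tumor-a", "scRNA-seq", "human", "s3://lab-data/tumor-a"))
index.register(DatasetRecord("mouse-b", "bulk-RNA", "mouse", "s3://lab-data/mouse-b"))
print([r.name for r in index.find("scRNA-seq")])  # ['tumor-a']
```

The point of a record like this is that any pipeline can discover and fetch a dataset from one place, instead of chasing files scattered across locations and formats.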
Costs of this magnitude make it prohibitive to run experiments at the scale we would need, for example, to bring true personalization to healthcare — from drug discovery to treatment planning. The only way to address this challenge is to use simulation (in-silico experiments) to augment biological experiments. This means that we need to integrate machine learning (ML) workflows into biological research as a top priority. With the artificial intelligence industry booming and with the development of computer chips designed specifically for machine learning workloads, we will soon be able to run millions of in-silico experiments in a matter of days for the same cost that a single live experiment takes to run over a period of months. Of course, simulated experiments suffer from a lack of fidelity relative to biological experiments. One way to overcome this is to validate the most interesting in-silico results in vitro or in vivo. Integrating in-silico predictions with in vitro/in vivo experiments creates a feedback loop in which experimental results become training data for future predictions, leading to increased accuracy and reduced experimental costs in the long run. Several academic groups and companies are already using such approaches and have reduced costs 50-fold. This approach of using machine learning models to select experiments and to consistently feed experimental data back into ML training should become an industry standard. Masters of the universe As Steve Jobs once famously said, “The people who are crazy enough to think they can change the world are the ones who do.” The last two decades have brought epic technological advancements in genome sequencing, software development, and machine learning. All these advancements are immediately applicable to the field of biology. All of us have the chance to participate and to create products that can significantly improve conditions for humanity as a whole. 
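The experiment-selection feedback loop described earlier (an ML model proposes the most promising experiments, and the wet-lab results flow back into its training data) can be sketched as a small active-learning toy in Python. Everything here is hypothetical and illustrative: the "wet lab" is a hidden scoring function standing in for a real assay, and the surrogate "model" is a nearest-neighbor lookup standing in for a trained ML model.

```python
# Toy stand-in for an expensive wet-lab assay: the true (hidden) response
# of each candidate. The optimum sits at candidate 7.
def run_wet_lab(candidate):
    return -(candidate - 7) ** 2

candidates = range(20)
observed = {}  # wet-lab results gathered so far = the training data

def predict(candidate):
    # Trivial surrogate: score of the nearest already-tested candidate.
    # In practice this would be an ML model retrained on `observed` each round.
    nearest = min(observed, key=lambda seen: abs(seen - candidate))
    return observed[nearest]

# Seed the loop with a coarse initial screen.
for c in (0, 5, 10, 15):
    observed[c] = run_wet_lab(c)

# Closed loop: cheap in-silico scoring selects the next experiment, and the
# experimental result is fed back as training data for the next round.
for _ in range(5):
    untested = [c for c in candidates if c not in observed]
    pick = max(untested, key=lambda c: (predict(c), -c))  # deterministic tie-break
    observed[pick] = run_wet_lab(pick)

best = max(observed, key=observed.get)
print(best, observed[best])  # the loop homes in on candidate 7
```

Only 9 of the 20 candidates ever reach the expensive assay, which is the cost-reduction mechanism the in-silico/in-vitro loop is after.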
Biology needs more software engineers, more infrastructure engineers, and more machine learning engineers. Without their help, it will take decades to digitize biology. The main challenge is that biology as a domain is so complex that it intimidates people. In this sense, biology reminds me of computer science in the late 80s, when developers needed to know electrical engineering in order to develop software. For anyone in the software industry, perhaps I can suggest a different way of viewing this complexity: Think of the complexity of biology as an opportunity rather than an insurmountable challenge. Computing and software have become powerful enough to shift us into an entirely new gear of biological understanding. You are the first generation of programmers to have this opportunity. Grab it with both arms. Bring your skills, your intelligence, and your expertise to biology. Help biologists to scale the capacity of technologies like CRISPR, single-cell genomics, immunology, and cell engineering. Help discover new treatments for cancer, Alzheimer’s, and so many other conditions against which we have been powerless for millennia. Until now. Luis Voloch is cofounder and Chief Technology Officer at Immunai DataDecisionMakers Welcome to the VentureBeat community! DataDecisionMakers is where experts, including the technical people doing data work, can share data-related insights and innovation. If you want to read about cutting-edge ideas and up-to-date information, best practices, and the future of data and data tech, join us at DataDecisionMakers. You might even consider contributing an article of your own! 
DataDecisionMakers Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat. All rights reserved. "
3,643
2,021
"How to use the right data for growth marketing in a privacy-centric world | VentureBeat"
"https://venturebeat.com/datadecisionmakers/how-to-use-the-right-data-for-growth-marketing-in-a-privacy-centric-world"
"How to use the right data for growth marketing in a privacy-centric world This article was contributed by Jonathan Martinez Data has always been a critical component of the craft of growth marketing: from pinpointing the time of day when users convert best on Facebook (or as the company is now known, Meta), to testing the best email subject lines (and the ongoing debate of whether to use emojis), to hyper-analyzing each step of the product funnel to eliminate drop-off. This is all changing because of our shift to a privacy-centric world. Companies like Apple are now enforcing stricter user-level data sharing. States are increasingly looking towards privacy laws like the California Consumer Privacy Act (CCPA), which gives consumers more control over the personal information that businesses can collect. 
Due to this ever-changing landscape, data that growth marketers have traditionally used to make optimization decisions has become increasingly inaccessible. Solutions are needed now that we have far less data to work with. Without the critical data component of growth marketing, startups and corporations are equally impacted. So, how do we adapt to an environment which is far less data-rich than before? Data degradation First, we need to take a step back to a time before recent privacy changes to understand why this is a transitional time for growth marketing. We’ve largely lived in a world where access to user-level data has been abundant and rich. But analyzing conversions at this level of specificity (e.g., a 34-year-old male in San Francisco in the top 10% HHI) is swiftly coming to an end. Governments and companies are slowly but concretely making changes to the types of data that ad platforms can send and receive. In one of the latest examples of this industry shift, Apple forced app developers to display prompts asking users whether they’d like to share their data with third parties in iOS14. According to Statista, the opt-in rate for this specific prompt is 21% as of September 6, 2021, which is unsurprising. This means that 79% of users are unwilling to provide their data to ad platforms in this instance, and the platforms then can’t pass the data back to growth marketers. It’s a trickle-down effect for growth marketers, who have historically relied on this information to make data-driven decisions. Incrementality testing So, how can growth marketers solve for not having as much data at their disposal? A concept which I’ve always considered vital for growth marketing must now become ubiquitous for every business. The most accurate way to measure and understand marketing efforts is by conducting incrementality tests. 
The easiest way to explain incrementality testing is that tests show the true lift in conversions from marketing efforts. For example, how many users would have converted last week if a specific marketing campaign were turned off? This type of testing allows us to determine how our efforts are impacting consumer behavior. There are multiple ways to set these tests up, ranging from hacky to highly precise. For the sake of keeping this discussion relevant to startups without necessitating data science swarms, the following is a simplified method that will still get reliable results. Above: Example illustration of test tubes The test tubes in the illustration above provide a visual representation of an incrementality test. Both tubes show the number of conversions (i.e., purchases) that a company acquires when a Facebook campaign is turned on or off. The first test tube shows the Facebook campaign on, and the second test tube shows the Facebook campaign off. We can observe that the first tube had approximately 40% more conversions (or green fill), which would be the “incremental” lift in conversions. These are the conversions that resulted because the Facebook campaign was on. So, how do we go about setting up this incrementality test? Data for growth marketing: an incrementality test example Now that we know the basics and reasoning for incrementality, let’s dive into an example test. In this test, we’ll determine how incremental a Facebook campaign might be. I like to call the setup for this test “the light switch method”. It involves measuring conversions when the campaign is live, switching the campaign off, and then measuring the conversions again. During testing While the test is running, it’s imperative that no changes are made to the campaign or to other channels and initiatives which may be live. If you’re running this test and then launch something like a free one-month promotion, the conversions would likely spike and invalidate the data. 
This method relies on keeping everything consistent throughout the testing period and across your growth marketing efforts. Leveraging results Above: Simple incrementality example analysis. In the example above, the Facebook campaign was live between January 8 and January 14, which resulted in 150 sign-ups. The campaign was then switched off between January 15 and 24, and 50 sign-ups still occurred within this second period. Cost/sign up: $16.66 Cost/incremental sign up: $50.00 These results tell us our Facebook campaign is 200% incremental. It’s a simple example, but this test provides us with our incremental costs, which we can now apply and compare against other channels. Although there is less user-level data to optimize Facebook campaigns, incrementality tests are still a powerful way to understand the effectiveness of the marketing spend. The power of scalar utilization As times change and with politicians continuing to enact legislation forcing company action, I believe we’re moving towards an incremental and scalar-based attribution model in growth marketing. Last-click attribution will be a concept of the past, as it relies heavily on user-level data. A scalar, as defined by Encyclopedia Britannica, is “…a physical quantity which is completely described by its magnitude”. In growth marketing, the use of scalars helps increase or decrease a channel or growth medium’s metrics, based on tests or historical data. There is a myriad of uses for scalars, but let’s analyze a timely example. With the introduction of iOS14 and Apple’s SKAD timer effectively cutting all attribution data after a 24-hour window, app-first companies have scrambled for solutions to mitigate this loss of information. This is a perfect use case for implementing a scalar. How does one calculate, analyze and implement a scalar to get necessary data? Take the following example, using Google UAC efforts, which have been impacted because of Apple’s iOS14 SKAD timer. 
Pre-iOS14, the sign-up → purchase CVR was 32%. With the introduction of iOS14+, the CVR has now dropped to 8%. This is a huge drop, especially considering nothing else has changed in the app flow or channel tactics. Above: Example data for Google UAC campaign pre-scalar. The CACs increased from $125 to $526, which would make this channel seem inefficient, potentially leading to reduced budgets. But instead of outright pausing all iOS Google UAC campaigns, a scalar should be applied to account for data loss. Above: Example data for Google UAC campaign post-scalar. We can divide 80 by 19, the ratio of purchases pre- and post-iOS14. We land with a scalar of ~4.21, which we can then multiply our post-iOS14 purchases by: 19 x 4.21 = 79.99. Our sign-up → purchase CVR is now normalized back at 32%, which is what consumer behavior typically is on our app. There are other ways to implement scalars for growth marketing efforts, including leveraging historical data to inform scalars or using data sources that haven’t been impacted — e.g., Android campaigns. By making metrics more accurate with this method, we can prevent a lights-out scenario for seemingly low-performing channels. Looking ahead at the future of data for growth marketing Many see the degradation of data as a doomsday event. But I see it as an opportunity to become more innovative and to move ahead of the competition. Incrementality tests and scalars are just two of the data-focused strategies that can and should be employed to validate growth marketing spend. If anything, this increasingly privacy-centric era will force us to realize the true impact of our data and growth marketing efforts like never before. As we look towards 2022, policies will continue to be enforced by governments and corporations alike. Google has already made it known that third-party cookies will become obsolete on Chrome in 2022-2023. 
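Both calculations above (the light-switch lift and the iOS14 scalar) are mechanical enough to script. Below is a minimal Python sketch using the example numbers; the function names are ours, purely illustrative.

```python
def incremental_lift(conversions_on, conversions_off):
    """True lift attributable to a campaign under the light-switch method."""
    incremental = conversions_on - conversions_off
    return incremental, incremental / conversions_off * 100

def purchase_scalar(pre_purchases, post_purchases):
    """Ratio used to re-inflate under-reported post-iOS14 purchase counts."""
    return pre_purchases / post_purchases

# Light-switch example: 150 sign-ups with the campaign live, 50 with it off.
inc, pct = incremental_lift(150, 50)   # 100 incremental sign-ups, 200% lift

# Scalar example: 80 purchases pre-iOS14 vs. 19 reported post-iOS14.
scalar = purchase_scalar(80, 19)       # ~4.21
adjusted = 19 * scalar                 # normalized back to ~80 purchases
print(inc, pct, round(scalar, 2), round(adjusted))
```

The same two functions generalize to any channel: feed them on/off conversion counts for lift, or pre/post purchase counts from an unimpacted baseline for the scalar.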
They will likely also follow in the footsteps of Apple’s iOS enforcement. But as new policies are enforced, new strategies will be equally needed, and tools from mobile measurement partners or business intelligence platforms should be leveraged. Take this new era in growth marketing to get crafty, because those who do will eventually end up on top. Jonathan is a former YouTuber, UC Berkeley alum, and growth marketing nerd who’s helped scale Uber, Postmates, Chime, and various startups. "
3,644
2,021
"Collaborative financial planning and analysis platform Abacum raises $25M | VentureBeat"
"https://venturebeat.com/data-infrastructure/collaborative-financial-planning-and-analysis-platform-abacum-raises-25m"
"Collaborative financial planning and analysis platform Abacum raises $25M Abacum founders: Julio Martínez (CEO) and Jorge Lluch (COO) Financial planning and analysis (FP&A) is a long-established discipline that enables businesses to inspect their financial and operational data, benchmark where they’re at, and forecast what’s around the corner. But despite the advent of the cloud and countless software products designed to aid the FP&A process, companies still often find themselves in a spreadsheet death spiral, involving copying and pasting data across documents and performing error-prone manual data transformations. The problem, ultimately, is that data typically lives inside lots of different systems, while the myriad people inside a company who need access to FP&A-enabled insights work in their own microcosms. It’s a recipe for chaos. 
This is where Abacum is setting out to help, with a cloud-based FP&A platform built specifically with remote collaboration in mind. Founded in April 2020, YC alum Abacum wants to “simplify” FP&A for scale-up companies such as Typeform, with pre-built connectors for most of the major systems that finance teams work with, such as Netsuite, Gusto, Salesforce, Looker, and Google Sheets. Rather than replacing these various software systems, Abacum serves as a data integrator that sits on top, accessed via a standalone web-based interface. “Our goal is to provide teams with a smarter and more intuitive solution that can adapt and grow with a company as they continue to evolve,” Abacum cofounder and CEO Julio Martínez told VentureBeat. “Abacum was designed to be a fully collaborative tool that helps finance teams work with other stakeholders in the financial planning process.” To help bring its technology to more companies, Abacum today announced that it has raised $25 million in a series A round of funding led by European venture capital (VC) juggernaut Atomico. The inner workings of revamped financial planning and analysis With Abacum, users can manage and synchronize all their financial and operational data from a centralized location; generate insights and customized tables; and produce all manner of graphs, tables, and charts. Above: Abacum in action However, collaboration is where Abacum is looking to set itself apart, with built-in features to facilitate dialog between all the various parties involved in finance or operational data. Indeed, Abacum supports comments around specific pieces of data, with tagging that notifies others if their input is required and automatic reminders rather than having to chase them up. 
It’s also possible to create dedicated “spaces” so that specific teams or departments can discuss and collaborate around data that is relevant to them only. Above: Abacum: Spaces There has been a flurry of activity across the FP&A space this year, with young upstarts such as DataRails, Cube Software, and Pigment all raising sizable investments from the VC fray. Then there are the many legacy players to contend with too, such as Anaplan, Adaptive Insights (which Workday acquired for north of $1.5 billion in 2018), and Oracle-owned Hyperion. From Abacum’s perspective, the legacy players, in particular, are ripe for disruption, having “lost their competitive advantage” due to their rigidity and “overstuffed functionalities,” according to Martínez. “For the speed and scale of today’s tech companies, these legacy players usually can’t adapt quickly enough to meet their needs,” he said. That, in part at least, is why we’re starting to see a slew of newer companies enter the FP&A fraternity. However, a combination of external factors is also making companies realize the importance of making good use of their financial and operational data — at a time when there are so many uncertainties permeating industries. “The pandemic really opened the eyes of many companies, that forecasting exists for a reason,” Martínez explained. “Proper financial modeling and scenario planning needs to be at the forefront of all forecasting and business strategies. This really resonates with our target audience where, in the tech space, there is a lot of uncertainty in the market, and they need to always be prepared for any and all situations.” Disrupting the fintech space While many people may associate “fintech” with modern banking and payments solutions such as N26 or Stripe, Martínez thinks that “operational fintech” has been a little slower to gain traction, which is why many financial professionals have so far persevered with traditional spreadsheet-led approaches. 
“These [legacy approaches] are great, but none of them answer the needs and pains of scaling finance teams,” he said. “And so, this space is ripe for disruption, and many founders have seen that FP&A has so much potential to be a value driver.” Prior to now, Abacum had raised $7 million, and besides lead backer Atomico, the company’s series A round included investments from Creandum, FJ Labs, and S16VC, among others. Since its seed funding back in April, Abacum — which has offices in Barcelona and New York — reports that it has grown its team from 15 to 70, and also counts hires from companies such as Microsoft, Amazon Web Services (AWS), and Mixpanel. VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings. "
3,645
2,021
"Amazon unveils new security features for AWS Lake Formation | VentureBeat"
"https://venturebeat.com/data-infrastructure/amazon-unveils-new-security-features-for-aws-lake-formation"
"Amazon unveils new security features for AWS Lake Formation Amazon Web Services (AWS) today announced new features for providing secure access to sensitive data in the AWS Lake Formation data lake service, with the introduction of row- and cell-level security capabilities. AWS Lake Formation enables collection and cataloging of data from databases and object storage, but it’s up to users to determine the best way to secure access to different slices of data. To make that easier, row- and cell-level security capabilities for Lake Formation are now generally available, AWS CEO Adam Selipsky said today during a keynote at the AWS re:Invent 2021 conference. To get customized access to slices of data, users have previously had to create and manage multiple copies of the data, keep all the copies in sync, and manage “complex” data pipelines, Selipsky said. 
Reducing complexity of data lakes Users of AWS Lake Formation had been asking for a more direct way to govern access to data lakes, while eliminating the “heavy lifting” associated with providing secure access, he said. With the new updates announced today, “now you can enforce access controls for individual rows and cells,” Selipsky said. “Lake Formation automatically filters data and reveals only the data permitted by your policy to authorized users.” For securing sales data, for instance, rather than creating multiple tables for each sales team and country, “you just define a set of policies that provide access to specific rows for specific users—without having to duplicate data or build data pipelines,” he said. “It puts the right data in the hands of the right people—and only the right people.” In a blog post, Danilo Poccia, a chief evangelist at AWS, said that access can be controlled to certain rows and columns both in query results and within AWS Glue ETL jobs. “In this way, you don’t have to create (and keep updated) subsets of your data for different roles and legislations,” Poccia said. This works both for governed and traditional tables in S3, he said in the post. Cloud security challenges The security updates from AWS come as enterprises continue their accelerated shift to the cloud, even as security processes have struggled to keep up. A recent survey of cloud engineering professionals found that 36% of organizations suffered a serious cloud security data leak or a breach in the past 12 months. On Monday, AWS announced several new features for improving and automating the management of vulnerabilities on its platform, in response to evolving cloud security requirements. 
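As a rough sketch of how the row- and cell-level policies described above look in practice: Lake Formation models them as "data cells filters," and the AWS SDK for Python (boto3) exposes a create_data_cells_filter call on the lakeformation client. The snippet below only constructs the request payload and makes no AWS call; the database, table, and filter expression are hypothetical examples, not taken from the article.

```python
def build_row_filter(catalog_id, database, table, name, filter_expr, columns):
    # Payload shape for lakeformation.create_data_cells_filter(TableData=...).
    # Principals granted this filter see only rows matching the filter
    # expression, and only the listed columns.
    return {
        "TableCatalogId": catalog_id,
        "DatabaseName": database,
        "TableName": table,
        "Name": name,
        "RowFilter": {"FilterExpression": filter_expr},
        "ColumnNames": columns,
    }

# Hypothetical example: an analyst may only see German rows, and never
# the customer email column (it is simply omitted from ColumnNames).
payload = build_row_filter(
    "123456789012", "sales", "orders", "de-sales-only",
    "country = 'DE'", ["order_id", "country", "amount"],
)
# In a real setup, roughly:
# boto3.client("lakeformation").create_data_cells_filter(TableData=payload)
```

This is the "define a set of policies" step Selipsky describes: one filter per audience, with no duplicated tables or extra pipelines.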
Newly added capabilities for the Amazon Inspector service will meet the “critical need to detect and remediate at speed” in order to secure cloud workloads, according to AWS. The capabilities include assessment scans that are continual and automated — taking the place of manual scans that occur only periodically — along with automated resource discovery. AWS re:Invent 2021 takes place through Friday, both in-person in Las Vegas and online. "
3,646
2,021
"Myty maker Off raises $3.5M to create social avatars for 'microverses' | VentureBeat"
"https://venturebeat.com/commerce/myty-maker-off-raises-3-5m-to-create-social-avatars-for-microverses"
"Myty maker Off raises $3.5M to create social avatars for ‘microverses’ Off is the maker of Myty. Off, which is creating a social avatar “microverse” platform dubbed Myty, has raised $3.5 million in seed funding. The co-leaders of the round were Hashed (a crypto investor in South Korea) and Collab+Currency. Other investors included Bitkraft Ventures, Electric Capital, Coinbase Ventures, and SamsungNext. The investment will be used towards the development of its platform, supporting more nonfungible token (NFT) avatars and adding more social utilities to avatars. The Singapore and South Korea-based company is competing in a crowded space with rivals such as celebrity social avatar maker Genies. 
Off’s goal is to create an open ecosystem for avatar-based “microverses,” or small versions of the metaverse, the universe of virtual worlds that are all interconnected, as in novels such as Snow Crash and Ready Player One. The company said that microverses are a kind of mod or space where users of NFT avatars can interact with each other in different social contexts. For instance, there could be spaces such as karaoke places where people can sing and dance, stand-up comedy clubs, and conference rooms. These will provide different visual and aural environments as well as the context – purpose, goals, rules, competition, collaboration. “Our big vision of the metaverse is for people to break free from physical reality so they can express themselves honestly and authentically,” said Jinwoo Park, team lead at Off and a cofounder of Hashed, in a statement. “We think avatars will be a key component for more people to enter the metaverse to represent their second persona and will enable another NFT utility.” The company recently launched Myty, a decentralized social platform where people can escape the limitations of the physical world using NFT avatars, Off said. The Myty Camera is a free application that converts CryptoPunks NFTs into avatars. The Myty Camera enables users to show up as the visuals of their own NFT avatars in Discord, Zoom, and any other webcam-based applications. “Off brings NFTs to real life. [The founders] are crypto veterans with a decade of experience in scaling consumer products for mass adoption,” said Baek Kim, principal at Hashed, in a statement. “The team is building the framework and toolkits to create NFTs that truly represent ourselves in the metaverse. 
Their first product The Myty Camera is adding a fun utility to iconic NFTs by transforming them into avatars that can mimic the owner’s motion and facial expressions to deliver emotion over Zoom calls.” Off’s next project is “Ghosts Project,” a collection of NFT avatars designed by the Off team and MrMisang, the top artist on SuperRare. Ghosts Project, which is a prequel to MrMisang’s original series, Modern Life is Rubbish, is optimized for face and motion tracking and is designed to show the full capacity of Myty avatars’ full-body tracking, including nodding, raising hands, and various facial expressions. Ghosts Project is currently building Twitter and Discord communities and planning an NFT avatar sale. Ghosts will be supported by Myty Camera after the sale. “While many types of NFT exist today, the Web 3 industry is still constrained in the ways it can interact and engage with these NFTs on a social level,” said Derek Edws, partner at Collab+Currency, in a statement. “PFPs, or profile picture projects, are emergent, global, social clubs sitting at the intersection of on-chain identity, community, and culture. The Myty team is focused squarely at this intersection. Our team at Collab+Currency couldn’t be more excited to back the Myty team on their NFT journey.” Off’s goal is to provide an SDK, an API, and an incentive mechanism to third-party developers so that they can create countless interoperable microverses for NFT avatars and generate profits. Off started its closed beta of Myty Camera with a handful of CryptoPunks holders a month ago. The Ghosts Project (the first Myty Original pfp project) also launched its website and Discord server (https://discord.gg/myty) in the last month. Now there are 20,000 Discord members who have joined the community and actively participate in the ticket distribution event called the Pioneers Program. Besides Park, the founders are Kyungjin “Kenny” Kim and Hyun Youn “Dan” Lee. 
GamesBeat's creed when covering the game industry is "where passion meets business." What does this mean? We want to tell you how the news matters to you -- not just as a decision-maker at a game studio, but also as a fan of games. Whether you read our articles, listen to our podcasts, or watch our videos, GamesBeat will help you learn about the industry and enjoy engaging with it. Discover our Briefings. The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat. All rights reserved. "
3,647
2,021
"Visus Therapeutics Announces Positive Topline Clinical Data from Phase 2 VIVID Study of BRIMOCHOL for the Treatment of Presbyopia | VentureBeat"
"https://venturebeat.com/business/visus-therapeutics-announces-positive-topline-clinical-data-from-phase-2-vivid-study-of-brimochol-for-the-treatment-of-presbyopia"
"Press Release Visus Therapeutics Announces Positive Topline Clinical Data from Phase 2 VIVID Study of BRIMOCHOL for the Treatment of Presbyopia Three Proprietary Formulations Achieved a Three-Line Improvement in Binocular Near Visual Acuity, without Losing One Line in Distance Visual Acuity with a Minimum Responder Rate of 83% at One Hour; a Minimum of 35% of Subjects Met the Same Endpoint at 9 Hours Positive Outcomes of Phase 2 VIVID Study Clear Path to Advance to Phase 3 Pivotal Studies SEATTLE–(BUSINESS WIRE)–November 30, 2021– Visus Therapeutics Inc., a clinical-stage pharmaceutical company focused on developing innovative ophthalmic therapies to improve vision for people around the world, today reported positive topline results from VIVID, the company’s Phase 2 study of three novel topical ophthalmic formulations under investigation for the treatment of presbyopia. 
All three investigational candidates studied in VIVID achieved the endpoint of three lines of improvement in binocular near visual acuity without losing one line of distance vision with a minimum responder rate of 83% at one hour. A minimum of 35% of subjects in the study met this same endpoint at nine hours in all three formulations. Additionally, all three formulations were well-tolerated and exhibited favorable safety profiles. Based on these positive outcomes, the company plans to commence Phase 3 pivotal trials shortly. “We are very encouraged by the VIVID study topline clinical data, which demonstrate that our clinical development program has delivered drug candidates which provide a durable improvement in visual acuity with favorable tolerability and safety profiles,” said Rhett Schiffman, M.D., M.S., M.H.S.A., co-founder, chief medical officer and head of research and development at Visus Therapeutics. “We know that presbyopia has a profound impact on quality of life for patients. These positive results provide further confidence that we are well positioned to bring to market the longest-lasting eye drop in the presbyopia category, which would be a meaningful breakthrough treatment for these individuals. We are excited to initiate our Phase 3 pivotal trials.” VIVID Phase 2 Trial Topline Highlights: In the per protocol population, a minimum of 83% of subjects treated with BRIMOCHOL, BRIMOCHOL F or Carbachol F achieved the endpoint of 3 lines of improvement in binocular near visual acuity under mesopic conditions without losing 1 line of distance vision at 1 hour. A minimum of 82%, 52% and 35% of subjects met this same endpoint at 3, 7 and 9 hours, respectively. In key secondary endpoints, BRIMOCHOL and BRIMOCHOL F achieved a mean improvement in binocular near visual acuity of a minimum of 18 ETDRS letters, almost 4 lines, as early as 30 minutes and a minimum of 12 letters at 9 hours. 
BRIMOCHOL, BRIMOCHOL F and Carbachol F were well-tolerated with no unexpected adverse events. Adverse events exceeding 5% included temporary burning and stinging upon instillation, headache and brow ache. No serious adverse events were reported. “The successful completion of the VIVID study marks an important milestone for Visus Therapeutics,” said Ben Bergo, co-founder and chief executive officer at Visus Therapeutics. “In light of the recent FDA approval of Allergan’s VUITY™, the first pharmacologic approved for the treatment of presbyopia, it is truly exciting to see this category come to fruition. We are pleased the VIVID study data exceeded our expectations, demonstrating a clear opportunity to commercialize long-acting miotic formulations.” The VIVID clinical trial (NCT04774237) was a double-masked, randomized, dose-ranging, multi-center, three-arm crossover study designed to evaluate the safety and efficacy of fixed-dose combinations of carbachol and brimonidine tartrate (BRIMOCHOL and BRIMOCHOL F) compared to a similarly formulated preservative-free carbachol. The trial enrolled 85 subjects, ages 45 to 80, with emmetropic phakic and pseudophakic presbyopia at three U.S. sites. The trial was not designed to achieve statistical significance in any of these study populations. Full results from the VIVID study will be presented at future medical meetings. “These are very encouraging preliminary results for the millions of people living with presbyopia and the eye care professionals who treat them,” said Eric D. Donnenfeld, M.D., founding partner of Ophthalmic Consultants of Long Island and Clinical Professor of Ophthalmology at NYU. “Eye drops are emerging as an important new treatment option for correcting age-related loss of near vision. I see tremendous clinical value in a long-acting, preservative-free eye drop that can improve near vision throughout the workday while avoiding the potential toxicity of preservatives. 
What’s really exciting is that these study results demonstrate the potential of drug candidates that can be studied in a larger treatment population.” “Currently, drugstore reading glasses are the most common treatment option for people as they lose their near vision with age. This means that many people aren’t getting regular comprehensive eye exams, and as such, more serious eye conditions are going undiagnosed,” said David Evans, O.D., an optometrist at Total Eye Care in Memphis, Tenn., and a clinical investigator in the VIVID trial. “The availability of a once-daily, preservative-free eye drop such as those evaluated in the VIVID study could attract millions of people into eye care practices nationwide, creating opportunities to not only improve visual performance but also advance eye health overall for this patient population.” About Presbyopia Presbyopia is the loss of near vision associated with aging, making it difficult to perform tasks like reading fine print. It typically begins when adults are in their 40s and becomes almost universal by age 50. 1 Presbyopia impacts billions of people globally with approximately 128 million adults affected in the U.S. alone. 2,3 Reading glasses are the most common solution for near-vision correction. However, many people find glasses inconvenient or prefer not to wear them for aesthetic reasons. About Visus Therapeutics With offices in Seattle, Wash., and Irvine, Calif., Visus Therapeutics is a clinical-stage pharmaceutical company focused on developing innovative ophthalmic therapies to improve vision for people around the world. The company is developing novel miotic formulations for a once-daily eye drop to correct the loss of near vision associated with presbyopia. In parallel, Visus Therapeutics is focused on advancing its pipeline of early-stage ophthalmic product candidates with applications in ocular surface disease, glaucoma and age-related macular degeneration. 
Additionally, Visus Therapeutics has entered into a worldwide exclusive licensing agreement with DelSiTech Ltd, a leader in advanced, biodegradable, silica-based, controlled-release materials, to develop novel drug delivery technology that can help optimize the clinical benefit of ophthalmic therapies. For more information, visit: www.visustx.com and follow us on LinkedIn, Twitter (@VisusTx) and Instagram. 1 U.S. Census Bureau. Retrieved September 7, 2019 from http://www.census.gov. 2 Zebardast et al. The Prevalence and Demographic Associations of Presenting Near-Vision Impairment Among Adults Living in the United States. Am J Ophthalmol. 2017;174:134-144. 3 U.S. Census Bureau. Table 9. Washington: Population Division. 2014. View source version on businesswire.com: https://www.businesswire.com/news/home/20211130005414/en/ Media Business & Biotech Press Doug Hochstedler [email protected] (317) 645-8665 Investor Relations Paul Sagan [email protected] (617) 865-0041 Eye Care Trade Press Michele Gray [email protected] (917) 449-9250 VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings. "
3,648
2,021
"Quinyx lands $50M to help companies find gig workers | VentureBeat"
"https://venturebeat.com/business/quinyx-lands-50m-to-help-companies-find-gig-workers"
"Quinyx lands $50M to help companies find gig workers Quinyx, a platform for gig and hourly work scheduling and time reporting, today announced that it raised $50 million in funding led by Battery Ventures at a $550 million valuation. The tranche brings the Stockholm, Sweden-based startup’s total capital to more than $89 million, which Quinyx says it’ll put toward hiring as well as product research and development. In a 2016 Adobe survey, 56% of office workers predicted that everyone will do some gig work in the future. Companies are on board, with a 2018 Korn Ferry report showing that 60% of HR professionals are hiring more contract workers and 42% plan to hire even more in the future. However, challenges abound with contracting, which can offer less predictability than the traditional hiring cycle. 
For example, gig roles tend to involve limited knowledge transfer, which might in the long run prove more costly than hiring salaried employees. Quinyx aims to address some of the challenges in gig and hourly work by simplifying scheduling, time reporting, communication, task management, budgeting, and forecasting. CEO Erik Fjellborg founded the company in 2006 while working at McDonald’s, where he saw how difficult it was for managers to sort shifts manually. Fjellborg built Quinyx’s initial software and piloted it at a McDonald’s restaurant in Örebro, Sweden, after which he expanded the pilot to McDonald’s restaurants in Sweden, Denmark, and the U.K. “Quinyx has experienced colossal growth over the past year,” Fjellborg told VentureBeat via email. “This success reflects the strenuous demand for organizations hiring an hourly workforce to do more with less as a labor shortage puts a squeeze on businesses’ operations.” Matching workers with shifts Quinyx — which claims to have millions of users — delivers scheduling, budget forecasting, and shift planning and swapping. Through its Webpunch product, a dashboard for time clocking, employees can record their time worked as an alternative to traditional time clocks. Quinyx also offers data processing and digital self-monitoring features as well as Forecast, a product designed to help customers anticipate their workforce needs. Other capabilities span demand forecasting, strategic planning, and labor optimization in addition to worker surveys that “give employers the ability to recognize and reward their peers with virtual badges.” “Accurate and effective forecasting can mean the difference between turning a profit and making a loss. Our technology does the heavy number crunching and will give you the total labor costs per region and location,” Fjellborg said. 
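The automated shift matching described above can be sketched with a simple greedy pass: fill each shift from the pool of available workers, spreading assignments evenly. This is an illustrative toy under stated assumptions, not Quinyx's actual algorithm; all names and data are hypothetical.

```python
# Toy shift scheduler (not Quinyx's real implementation): greedily fill
# each shift with available workers, preferring those with the fewest
# assignments so hours spread evenly across the team.

def build_schedule(shifts, workers):
    """shifts: list of (shift_name, headcount_needed).
    workers: {name: set of shift_names that worker can cover}.
    Returns {shift_name: [assigned worker names]}."""
    assigned_count = {name: 0 for name in workers}
    schedule = {}
    for shift, needed in shifts:
        available = [n for n, can_work in workers.items() if shift in can_work]
        # Prefer workers with the fewest assignments so far.
        available.sort(key=lambda n: assigned_count[n])
        picked = available[:needed]
        for n in picked:
            assigned_count[n] += 1
        schedule[shift] = picked
    return schedule

schedule = build_schedule(
    shifts=[("mon_am", 2), ("mon_pm", 1)],
    workers={"ana": {"mon_am", "mon_pm"},
             "ben": {"mon_am"},
             "cho": {"mon_pm"}},
)
print(schedule)  # ana and ben cover the morning; cho covers the evening
```

A production scheduler layers demand forecasts, labor-law constraints, and cost targets on top of this basic matching step, which is where the "heavy number crunching" comes in.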
“[With] Quinyx, you only have to choose your business objectives — our technology does the rest. [Customers can make] automated schedules in minutes and minimize manual changes mid-operations.” Expanding market Quinyx, which has over 300 employees and offices across the U.S., U.K., Sweden, Finland, Germany, Norway, Denmark, and the Netherlands, is benefiting from a shift management market that’s anticipated to grow steeply as the pandemic continues. Markets and Markets forecasts that the workforce management segment will grow from $6.0 billion in 2020 to $9.3 billion by 2025. Quinyx competes with When I Work, Spur, Blue Yonder, and Deputy, among others. But Quinyx has managed to carve out a slice of the expanding market, with over 1,000 customers including Oatly, Sysco, Virgin Atlantic, Palace Entertainment, IHG, Swarovski, and DHL. “We are excited about the funding which will enable us to accelerate innovation and expansion — supporting our mission to better the lives of millions of hourly workers,” Fjellborg said in a statement. “Today’s announcement is a game-changer for the workforce management industry and businesses at large.” "
3,649
2,021
"Microsoft launches fully managed Azure Load Testing service | VentureBeat"
"https://venturebeat.com/business/microsoft-launches-fully-managed-azure-load-testing-service"
"Microsoft launches fully managed Azure Load Testing service View of a Microsoft logo on March 10, 2021, in New York. Microsoft is rolling out a new fully managed load testing service for Azure, helping quality assurance (QA) testers and developers optimize their app’s performance and scalability. Load testing is a software testing technique that involves deliberately putting high demands on a website, database, network, or application to see how it responds — it essentially helps determine how well a system copes with a lot of concurrent users. Companies can configure different load tests to replicate different scenarios, for example to see what happens if X number of people tried to checkout from their online store simultaneously. 
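That concurrent-checkout scenario can be sketched in a few lines: spin up N simulated users, run them at once, and summarize the latencies. This is an illustrative stand-in rather than how Azure Load Testing or JMeter works internally; `fake_checkout` is a hypothetical placeholder for a real HTTP request against the system under test.

```python
# Minimal load-test sketch: fire N concurrent "checkout" calls and
# report per-request latency. In practice a tool like Apache JMeter
# drives real traffic; fake_checkout just simulates server work.
import time
from concurrent.futures import ThreadPoolExecutor

def fake_checkout(order_id: int) -> float:
    """Simulated checkout request; returns elapsed seconds."""
    start = time.perf_counter()
    time.sleep(0.01)  # stand-in for network round trip + server work
    return time.perf_counter() - start

def run_load_test(concurrent_users: int) -> dict:
    # All requests run simultaneously, like users checking out at once.
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        latencies = sorted(pool.map(fake_checkout, range(concurrent_users)))
    return {
        "requests": len(latencies),
        "p50_ms": latencies[len(latencies) // 2] * 1000,
        "max_ms": latencies[-1] * 1000,
    }

report = run_load_test(concurrent_users=20)
print(report)
```

A real test plan would ramp users up gradually, hit multiple endpoints, and flag the point where the median or tail latency starts to degrade — which is exactly the signal a managed service surfaces in its dashboards.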
Assuring quality Load testing fits into the broader software performance testing and QA spheres, which might include everything from cross-platform web testing to continuous profiling for cutting cloud bills — it’s all about ensuring that an application is robust and optimized for every potential scenario, minimizing outages and downtime for software in production environments. Companies have a number of options at their disposal for conducting load testing on their systems, including open source tools such as Apache JMeter, or LoadRunner from Micro Focus. And earlier this year, observability giant Grafana Labs acquired the creators of an open source load testing tool called K6. But as its name suggests, Azure Load Testing is very much designed with Azure customers in mind. This includes integrated Azure resource management and billing, and integrations with related products such as Azure Monitor, Microsoft’s monitoring tool for applications, infrastructure, and networks. Above: Performance insights in the Azure Load Testing dashboard The open source factor It’s worth noting that Microsoft previously offered the Azure Devops load testing service, which it deprecated in March last year. At the time, Microsoft cited limited adoption for the service while acknowledging that there were better alternatives on the market. So, what has changed in the intervening 20 months? “The deprecation did not mean Microsoft thought load testing was not important — we believe our previous offering needed a reset to better meet the needs of teams building reliable, modern cloud apps,” Mandy Whaley, Microsoft’s partner director of product for Azure dev tools, told VentureBeat. And so Microsoft took what it learned from its previous load testing endeavors and started from the beginning. 
For its initial “preview” launch, Azure Load Testing will support companies using Apache JMeter, which is in keeping with Microsoft’s broader open source adoption over the past seven years or so. “Azure Load Testing is designed from the ground up, embraces open source, and has a specific focus on Azure customers and delivering Azure-optimized capabilities,” Whaley added. “We believe this is where we can provide differentiation in the market and where we can provide the greatest value for our customers.” This also helps to highlight one of the key benefits of open source software — it enables companies to build on top of existing technologies that have traction and are widely understood by the software development fraternity. And by packaging a managed service on top of JMeter, this saves companies from having to manage all the infrastructure side of load testing themselves. “Since Apache JMeter is a highly extensible, popular open source tool with strong community backing, Azure Load Testing public preview supports JMeter to ensure users will be able to reuse their existing skills and scripts,” Whaley said. "
3,650
2,021
"LabGenius Appoints Dr. Gino Van Heeke as Chief Scientific Officer | VentureBeat"
"https://venturebeat.com/business/labgenius-appoints-dr-gino-van-heeke-as-chief-scientific-officer"
"Press Release LabGenius Appoints Dr. Gino Van Heeke as Chief Scientific Officer With over 20 years of industry experience, the former Senior Director, Discovery and Early Development at Ablynx, will lead LabGenius’ drug discovery and pre-clinical development activities. LONDON–(BUSINESS WIRE)–November 30, 2021– LabGenius, the machine learning (ML)-driven protein engineering company, today announced the appointment of Gino Van Heeke Ph.D. as Chief Scientific Officer (CSO). In this role, Gino will leverage his wealth of drug discovery and development experience to direct LabGenius’ scientific strategy. This press release features multimedia. View the full release here: https://www.businesswire.com/news/home/20211129005436/en/ Dr. Gino Van Heeke Appointed as LabGenius’ CSO (Photo: Business Wire) “I’m extremely pleased to welcome Gino to the company as CSO,” said Dr. James Field, Founder and CEO of LabGenius. 
“Gino is an accomplished scientific leader with an impressive track record of successfully advancing drug candidates into pre-clinical and clinical development.” Gino joins LabGenius from Engitix Therapeutics, where, as CSO, he established a portfolio of projects and initiated the company’s first drug discovery campaign. Before this, Gino was Senior Director, Discovery and Early Development at Ablynx where he led the discovery of NANOBODY® drug candidates and their transition to clinical development. Prior to his move into biotech, Gino held several senior positions at Novartis Institutes for Biomedical Research over a 22-year tenure, including Executive Director for Biologics. Speaking about his appointment, Dr. Gino Van Heeke said, “The ML-driven protein engineering platform that the team at LabGenius has expertly built presents a tremendous opportunity to revolutionize the way in which drugs are discovered. With solid foundations in place, I am delighted to join the company and apply my drug discovery knowledge and experience to ensure we use our unique platform in the most effective way possible.” Dr. Edwin Moses, LabGenius Chairman, commented, “Gino’s expertise in translational research, coupled with LabGenius’ ML-driven drug discovery platform, is an excellent combination which I believe will further accelerate the company’s successful development.” – end – About LabGenius LabGenius is a leading ML-driven protein engineering company, accelerating the discovery of novel therapeutics through the co-optimization of therapeutically valuable properties. The company’s discovery platform, EVA™, integrates several cutting-edge technologies drawn from the fields of computer science, robotic automation and synthetic biology. Headquartered in London, UK, the LabGenius team includes experts in protein engineering, synthetic biology, software engineering, data science and robotic automation. 
For more information, please visit www.labgeni.us, or connect on Twitter, Facebook, and LinkedIn. View source version on businesswire.com: https://www.businesswire.com/news/home/20211129005436/en/ For more information, please contact: Media enquiries: Communications Lead Lucy Shaw [email protected] Business development: Chief Operating Officer Didier Landais [email protected] "
3,651
2,021
"Groundswell Announces A Total of $15M in Seed Funding To Democratize Philanthropy | VentureBeat"
"https://venturebeat.com/business/groundswell-announces-a-total-of-15m-in-seed-funding-to-democratize-philanthropy"
"Press Release Groundswell Announces A Total of $15M in Seed Funding To Democratize Philanthropy The platform is combining HR, Fintech, and philanthropy to spearhead a new “Philanthropy-as-a-Service” sector LOS ANGELES–(BUSINESS WIRE)–November 30, 2021– Groundswell, the Philanthropy-as-a-Service platform that’s empowering employees with their own personal donor-advised funds (DAFs), today announced that they have raised a total of $15M in seed investment funding. The round was led by GV (formerly Google Ventures), with participation from Human Ventures, Moonshots Capital, Felicis Ventures, and Core Innovation Capital. This press release features multimedia. View the full release here: https://www.businesswire.com/news/home/20211130005365/en/ https://groundswell.io/ (Graphic: Business Wire) Groundswell is revolutionizing how companies think about employee compensation by decentralizing corporate philanthropy and using it to fund personal donor-advised funds for each employee. 
Like the 401k did for retirement, Groundswell puts employees in the driver’s seat of corporate philanthropy and empowers them to create worldwide impact. Employee users are also provided with tools historically reserved for the ultra-rich, including tax-free investment opportunities, customized giving opportunities, frictionless donation options, and centralized impact reporting. The platform aims to become a ubiquitous component of employee benefits packages and charitable giving. As the war for talent heats up, Groundswell will help companies authentically unlock corporate purpose alongside profit by channeling corporate social responsibility efforts through employees. By directly gifting or matching contributions into employee DAFs, companies are empowering their workforce to find the solutions they want to see to the problems they feel are most pressing – a powerful acknowledgement of the diversity of employees’ backgrounds. Additionally, as the shift to remote work accelerates, Groundswell allows workers to be the change in their community – regardless of where that community is. Considering that more Americans give to charity each year than contribute to a 401(k), Groundswell’s approach will provide a financial boost to a significant portion of the workforce. “We want to create a new category of service that combines the best aspects of HR tech, fintech, and philanthropy,” says Jake Wood, Co-Founder & CEO, who previously led the highly regarded nonprofit Team Rubicon. “Our Philanthropy-as-a-Service approach is democratizing philanthropy for the masses, by enabling them to give better, smarter, easier, and more intentionally. We’re excited to have found investors who understand this vision, and want to work alongside us as we make it a reality.” Groundswell’s platform was formally announced in September 2021 following six months of behind the scenes development in partnership with Human Ventures. 
Numerous corporations are lined up to launch Groundswell in Q1 of 2022. Groundswell will use the financing to add additional engineers and product staff while laying the groundwork for an aggressive go-to-market strategy. “Groundswell emerges at a time when a rising generation of workers want their employers both to care about social responsibility and enable action on causes that matter to them,” said M.G. Siegler, General Partner at GV. “CEO Jake Wood is a top-class entrepreneur with a strong background in execution and logistics needed to make the Groundswell vision a reality. We’re thrilled to partner with Jake and the entire team as they reimagine philanthropic giving.” For more information about Groundswell, please visit https://www.groundswell.io/. About Groundswell Groundswell democratizes philanthropic giving. It makes charity an employee benefit that unlocks a company’s purpose by decentralizing corporate philanthropy, giving employees the power of a personal foundation in the palm of their hand. The company’s vision is to build a platform that gets every solution funded and every problem solved. View source version on businesswire.com: https://www.businesswire.com/news/home/20211130005365/en/ Meg Vandervort [email protected] "
3,652
2,022
"Deloitte predicts 2022 chip shortage, AI regulation, entertainment churn, and sustainable push | VentureBeat"
"https://venturebeat.com/business/deloitte-predicts-2022-chip-shortage-ai-regulation-entertainment-churn-and-sustainable-push"
"Deloitte predicts 2022 chip shortage, AI regulation, entertainment churn, and sustainable push Deloitte has its 2022 tech predictions ready. The new year will see a continued chip shortage, an increase in regulation for AI, a push for sustainability in smartphones, and more churn in entertainment subscriptions, according to the annual tech predictions from accounting and consulting firm Deloitte. Deloitte Global said in its annual report that chips will remain in short supply next year, and some component lead times will stretch into 2023. That’s consistent with reports from chipmakers such as Intel and Nvidia. The company also expects increasing discussion around regulating artificial intelligence (AI) more systematically, with several proposals being made. Deloitte also said that 320 million consumer health and wearable wellness devices will ship worldwide in 2022. 
And more Wi-Fi 6 devices will ship in 2022 than 5G devices, to the tune of at least 2.5 billion Wi-Fi 6 devices versus roughly 1.5 billion 5G devices. And at least 150 million paid streaming video-on-demand (SVOD) subscriptions will be canceled worldwide. As games consoles reach their 50th anniversary, the console market will generate a record $81 billion in 2022, up 10% from 2021. The Technology, Media & Telecommunications (TMT) 2022 Predictions report underscores how many of these trends are being driven by the global pandemic’s economic and societal shifts, resulting in an increasingly connected and multi-device world, fueling the world’s need for more chips, growth in connectivity, and entertainment options. “The pandemic increased the need to maintain connections, improve productivity and experience entertainment, with accelerated adoption from both consumers and businesses alike,” said Kevin Westcott, Deloitte’s U.S. TMT and global Telecommunications, Media and Entertainment (TME) practice leader, in a statement. “In 2022, we foresee these behaviors continuing to grow, but amid a backdrop of challenges. Supply chain woes, increasing regulatory issues, and changing media habits will be at the forefront of business leaders’ minds as these challenges impact their ability to meet market demands.” Surge in chip demand but supply crunch continues Deloitte predicts that many types of chips will still be in short supply during 2022, but it will be less severe than it was for most of 2021, and it will not affect all chips. The continuation of the chip shortage and its staying power boils down to a significant surge in demand, driven by digital transformation and accelerated by the pandemic, Deloitte said. Unsurprisingly, venture capital investment in semiconductors is taking off to fill that demand for new kinds of chips. 
Deloitte predicts that venture capital (VC) firms globally will invest more than $6 billion in semiconductor startup companies in 2022. That may only be 2% of the more than $300 billion overall VC investments expected for 2022, but it’s more than three times larger than it was every year between 2000 and 2016. Wi-Fi 6 outselling 5G devices Many countries have adopted 5G over the past two years, but Wi-Fi 6 devices are now quietly outselling 5G devices by a large margin and will likely continue to do so for the next few years. Deloitte predicts that more Wi-Fi 6 devices will ship in 2022 than 5G devices, to the tune of at least 2.5 billion Wi-Fi 6 devices versus roughly 1.5 billion 5G devices. The reason: Wi-Fi 6, just as much as 5G, has a significant role to play in the future of wireless connectivity—not just for consumers, but also for the enterprise. Meanwhile, smartphones will hit an installed base of 4.5 billion units in 2022, making them by far the world’s most popular consumer electronics device. But those phones will generate 146 million tons of carbon dioxide equivalent (CO2e) emissions in 2022. While this is less than half a percent of the 34 gigatons of total CO2e emitted globally in 2021, it is still worth trying to reduce. There is clear evidence the industry is making smartphones more sustainable by reducing the need for unplanned replacements and offering software support for longer, extending phone lifetimes and ultimately helping to reduce the environmental impact of smartphones. AI and managing sensitive data Deloitte predicts that 2022 will see a great deal of discussion around regulating AI more systematically, with several proposals being made—although enacting them into fully enforced regulations will not likely happen until 2023 or beyond. Some jurisdictions may even try to ban whole subfields of AI — such as facial recognition in public spaces, social scoring, and subliminal techniques — entirely. 
In addition, driven by the increasing urgency of safeguarding data used in AI applications, emerging privacy-enhancing technologies such as homomorphic encryption (HE) and federated learning (FL) will also experience dramatic growth. Already in use by leading technology companies today, the combined market for HE and FL will grow at double-digit rates in 2022 to more than $250 million, and by 2025, this market is expected to top $500 million. “AI has tremendous promise, but we’re likely to see more scrutiny in 2022 as regulators look to better understand the privacy and data security implications of emerging AI applications, and implement strategies to protect consumers,” said Paul Silverglate, Deloitte’s U.S. technology sector leader, in a statement. “Tech companies find themselves at a convergence point where they can no longer leave ethical issues like this to fate. What’s needed is a holistic approach to address ethical responsibility; companies that take this approach, especially in newer areas like AI, can expect greater acceptance, more trust, and increased revenue.” As the world churns: The streaming wars go global As leading streaming providers expand globally, while national media companies spin up their own domestic streaming services, the amplified competition is creating abundant consumer choice — and accelerating churn. In 2022, Deloitte predicts that at least 150 million paid streaming video-on-demand (SVOD) subscriptions will be canceled worldwide, with churn rates of up to 30% per market. That’s the bad news. The better news is that, overall, more subscriptions will be added than canceled, the average number of subscriptions per person will rise, and, in markets with the highest churn, many of those canceling may resubscribe to a service that they had previously left. These are all signs of a competitive and maturing SVOD market. 
As SVOD matures, growth across global regions that may have different cost sensitivities will likely require different business model innovation and pathways to profitability. “One thing we’ve learned during the past year is that consumers want entertainment choices, in content, in cost and in their ability to connect socially through their experiences,” said Jana Arbanas, Deloitte’s U.S. Telecom, Media and Entertainment (TM&E) sector leader, in a statement. “As more global players enter these already competitive markets around the world, entertainment companies will be challenged to constantly innovate, be nimble in their actions and respond quickly to market changes in order to capture the minds and wallets of increasingly savvy consumers.” On another subject, Deloitte predicted continued focus on diversity, equity, and inclusion. Although the largest players in the technology industry are closing the gender gap and will reach 33% overall female representation in the workforce in 2022, women in technical roles continue to lag by 8%. The pandemic has also caused increased churn: 57% of women in TMT expect to change employers within two years, and a startling 22% are considering leaving the workforce, citing workload increases that impact wellbeing. And sports nonfungible tokens (NFTs) have kicked sports memorabilia into the digital age. Deloitte predicts that blockchain-based NFTs for sports media will generate more than $2 billion in transactions in 2022, about double the figure for 2021. By the end of 2022, we expect that 4 to 5 million sports fans globally will have purchased or been gifted an NFT sports collectible. Interest in sports NFTs is likely to be spurred by activity in the wider NFT market, including digital art, the top five most valuable sales of which had generated over $100 million by August 2021. 
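The federated learning (FL) concept cited in Deloitte's privacy-tech prediction earlier in this piece is easy to picture: clients train locally and share only model parameters, never raw data, and a server averages those parameters into a global model. The sketch below is a toy federated-averaging step in plain Python; the weight vectors and client count are invented for illustration and don't reflect any particular FL framework.

```python
# Toy federated averaging: each client computes model weights locally
# and only those weights (never the raw training data) are sent to a
# server, which averages them into a global model. Illustrative only;
# real FL systems add client sampling, weighting, and secure aggregation.
def federated_average(client_weights):
    """Element-wise average of per-client parameter vectors."""
    n = len(client_weights)
    dim = len(client_weights[0])
    return [sum(w[i] for w in client_weights) / n for i in range(dim)]

# Three hypothetical clients report locally trained weight vectors.
clients = [
    [2.0, 10.0],
    [4.0, 8.0],
    [6.0, 12.0],
]
global_model = federated_average(clients)
print(global_model)  # [4.0, 10.0]
```

The privacy benefit comes from the fact that only the averaged vector leaves each device; the server never sees the underlying data that produced it.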
"
3,653
2,021
"CIBC Innovation Banking Provides $35 Million Debt Facility to Intradiem to Accelerate Growth | VentureBeat"
"https://venturebeat.com/business/cibc-innovation-banking-provides-35-million-debt-facility-to-intradiem-to-accelerate-growth"
"Press Release CIBC Innovation Banking Provides $35 Million Debt Facility to Intradiem to Accelerate Growth ATLANTA–(BUSINESS WIRE)–November 30, 2021– CIBC Innovation Banking is pleased to announce a debt facility of $35 million for Atlanta-based Intradiem, a leading provider of automation technology solutions for customer service teams. Intradiem will use the capital to expand its market growth and accelerate the launch of strategic products to deliver on its next generation platform and machine learning initiatives. “The CIBC Innovation Banking team has provided incredible support to our team at Intradiem,” said Bryan Jones, Intradiem’s Chief Financial Officer. “We’re grateful for their assistance which will enable our immediate market growth plans and accelerated product initiatives.” Intradiem’s AI-powered Intelligent Automation platform enables contact center and back-office teams to process and leverage millions of real-time data points to boost operational efficiency, agent engagement, and customer experiences. 
“Intradiem’s platform brings a unique, real-time data processing capability that is transforming customer service operations,” said Andy Kirk, Managing Director in CIBC Innovation Banking’s Atlanta office. “We’re excited to support their continued growth and global expansion with this funding.” Intradiem is also backed by MK Capital and JMI Equity. About CIBC Innovation Banking CIBC Innovation Banking delivers strategic advice, cash management and funding to innovation companies across North America, the UK, and select European countries at each stage of their business cycle, from start up to IPO and beyond. With offices in Atlanta, Austin, Boston, Chicago, Denver, London, Menlo Park, Montreal, New York, Reston, Toronto and Vancouver, the team has extensive experience and a strong, collaborative approach that extends across CIBC’s commercial banking and capital markets businesses in the U.S. and Canada. About Intradiem Intradiem provides Intelligent Automation solutions that help customer service teams boost productivity, enhance employee engagement, and improve the end-customer experience. Patented AI-powered technology processes the massive quantity of data generated by contact centers and back offices and takes immediate action to support both in-center and remote teams. Customers can count on a guaranteed return on their investment, with a 2X payback typical in the first year and 3-5X in subsequent years. Each year Intradiem powers more than one billion automated actions and helps customers save more than $100 million. View source version on businesswire.com: https://www.businesswire.com/news/home/20211130005092/en/ Josh Burleton, [email protected] , 416-304-2712 Melissa Spies, [email protected] , 678-356-3500 
"
3,654
2,021
"AWS re:Invent: Faster chips, smarter AI, and developer tools grab the spotlight | VentureBeat"
"https://venturebeat.com/business/aws-reinvent-faster-chips-smarter-ai-and-developer-tools-grab-the-spotlight"
"AWS re:Invent: Faster chips, smarter AI, and developer tools grab the spotlight This week, Amazon Web Services (AWS) kicked off its tenth re:Invent conference, an event where it typically announces the biggest changes in the cloud computing industry’s dominant platform. This year’s news includes faster chips, more aggressive artificial intelligence, more developer-friendly tools, and even a bit of quantum computing for those who want to explore its ever-growing potential. Amazon is working to lower costs by boosting the performance of its hardware. Its new generation of machines powered by the third generation of AMD’s EPYC processors, the M6a, is touted as offering a 35% boost in price/performance over the previous generation of M5a machines built with the second generation of the EPYC chips. They’ll be available in sizes that range from two virtual CPUs with 8GB of RAM (m6a.large) up to 192 virtual CPUs and 768GB of RAM (m6a.48xlarge). 
AWS also notes that the chips will boast “always-on memory encryption” and rely on custom circuitry for faster encryption and decryption. The feature is a nod to users who worry about sharing hardware in the cloud and, perhaps, exposing their data. Graviton processors meet GPUs for the cloud The company is also rolling out the second generation of its ARM-based Graviton processors and marrying them with a fast GPU, the NVIDIA T4G Tensor Core. These new machines, known as the G5g, also promise to offer lower prices for better performance. AWS estimates that some game streaming loads, for instance, will be 30% cheaper on these new chips, a better price point that may encourage more game developers to move their computation to the cloud. The GPU on the chips could also be attractive to machine learning scientists training models, who will also value the better performance. This price sensitivity is driving the development of tools that optimize hardware configuration and performance. A number of companies are marketing services that manage cloud instances and watch for over-provisioned machines. Amazon expanded its own Compute Optimizer tool to include more extensive metrics that can flag resources that aren’t being used efficiently. They’re also extending the historical record to three months so that peaks that may appear at the end of months or quarters will be detectable. Automating routine developer tasks In addition to addressing price-performance ratios, Amazon is looking to please developers by simplifying the process of building and running more complex websites. A number of the announcements focus on enhancing tools that automate many of the small tasks that take up developer resources. 
For instance, the new version of EventBridge, the service used to knit together websites by passing messages connected to events, Amazon says, is “directly wired” to the S3 data storage layer so changes to the data or some of the metadata associated with it will automatically trigger events. The new version also offers more enhanced filtering, which is designed to make it simpler to spin up smarter code. Developers who base their workloads on containers will find things a bit faster because AWS is building a pull-through cache for the public containers in the Elastic Container Registry. This will simplify and speed up the work of deploying code built on top of these public containers. Amazon also anticipates that it could improve security by providing a more trustworthy path for the code. AI spots security flaws There is also a greater emphasis on helping developers find the best way to use AWS. Code reviews, for instance, can now rely upon AIs trained to spot security leaks triggered when developers inadvertently include passwords or other secrets in publicly accessible locations. This new part of the AWS tool CodeGuru will catch some of the most embarrassing security lapses that have bedeviled companies using AWS in the past. The tool works with AWS’s own repository, CodeCommit, as well as other popular version-tracking locations like BitBucket or GitHub. AWS is also opening up its model version of a modern AWS app, the so-called “Well-Architected Framework.” Now, development teams will be able to add their own custom requirements as “lenses.” This will make it simpler for development teams to extend the AWS model to conform to their internal best practices. Eying IoT and quantum computing Finally, AWS is offering a chance to hit the fast-forward button and experiment with the next generation of technology. Their RoboRunner, first launched in 2018, lets users create simulations of robots working and exploring. 
Companies adding autonomy to their assembly lines and factories can test algorithms. At the conference, Amazon opened a new set of features that simulate not just single robots but fleets cooperating as they work together to finish a job. This new layer, called IoT RoboRunner, relies upon the TaskManager to organize the workflow that can be specified as Lambda functions. For those with an eye toward the deepest part of the future where quantum computers may dominate, AWS is expanding and simplifying its cloud quantum offering called Braket. Users can write quantum algorithms and rent time on quantum processors without long-term commitment. This week, AWS announced that this Braket service can now run quantum algorithms as hybrid jobs. After the software is created using a local simulator, it can be handed off to AWS, which will allocate time on a quantum processor and store the results in an S3 bucket. For now, there’s no integration with the cost-saving tools like Compute Optimizer, but if quantum computing grows more successful it’s certain to be announced at a future version of re:Invent. "
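As an aside on the EventBridge-to-S3 wiring described earlier in this article, the sketch below shows the general shape of an EventBridge event pattern that matches S3 object-created events. The bucket name is a placeholder, and the pattern is a plausible sketch of the documented event-pattern format rather than anything taken from the announcement itself.

```python
import json

# Sketch of an EventBridge event pattern matching S3 "Object Created"
# events for a single (hypothetical) bucket. With S3's EventBridge
# integration enabled, a rule built from this pattern fires whenever a
# new object lands in the bucket, with no custom notification plumbing.
event_pattern = {
    "source": ["aws.s3"],
    "detail-type": ["Object Created"],
    "detail": {
        "bucket": {"name": ["example-bucket"]},
    },
}

# A rule would be registered with the pattern serialized as JSON,
# e.g. passing EventPattern=json.dumps(event_pattern) when creating it.
print(json.dumps(event_pattern, indent=2))
```

The filtering mentioned in the article happens in the same place: narrowing the `detail` block (for example, by key prefix) would restrict which object events trigger downstream code.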
3,655
2,021
"3D Glass Solutions Closes $20 Million Series B1 Funding Round to Accelerate Its Revenue Growth Strategy | VentureBeat"
"https://venturebeat.com/business/3d-glass-solutions-closes-20-million-series-b1-funding-round-to-accelerate-its-revenue-growth-strategy"
"Press Release 3D Glass Solutions Closes $20 Million Series B1 Funding Round to Accelerate Its Revenue Growth Strategy Investment Round Led by Intel Capital Will Support Operations for High Volume Production ALBUQUERQUE, N.M.–(BUSINESS WIRE)–November 30, 2021– 3D Glass Solutions Inc. (3DGS), a leading innovator of glass-based three-dimensional passive radio frequency (RF) devices, today announced it has secured $20 million in Series B1 funding led by Intel Capital. CerraCap Ventures, Lockheed Martin Ventures and Nagase & Co. Ltd. also participated. This round of funding adds Intel Capital’s David Flanagan, Lockheed Martin’s Jeff Cunningham and Nagase’s Yoriyuki Yamashiro to 3DGS’ board of directors. “3DGS’ patented APEX® glass-ceramic technology is designed for use in 5G, autonomous vehicles, military and defense, and big data applications. 
Intel Capital’s commitment to investments in companies enabling the edge, 5G and autonomy technology space underscores the impact that 3DGS can have on these markets,” said Mark Popovich, CEO and president of 3DGS. “Closing Series B1 is a monumental growth inflection point for 3DGS. The company has an aggressive revenue growth strategy planned out and this round will play an integral role in fueling it. We are continuing our investment in transitioning operations to higher volume production, in response to our customers’ demands.” The 3DGS process produces low RF material loss to minimize power consumption and high-performance integration for advanced electronics systems, resulting in passive components with optimized electrical performance and ultra-low transmission loss. In addition, it boasts an intermediate coefficient of thermal expansion to minimize in-process and final product warp, precise 3D structuring of Through Glass Vias for input and output signals, and a smooth surface that enables fine-line metallization to achieve high interconnect density. “The introduction and adoption of new communications standards like 5G have driven incredible levels of complexity and product demands into the RF front end,” said David Flanagan, vice president and senior managing director at Intel Capital. “3DGS’ advanced ceramic materials and manufacturing capabilities are ideally suited to address these high-performance requirements for RF components. We are excited to see the adoption of this important technology in 5G systems and devices starting in 2022.” “3DGS’ glass ceramic technology delivers a multitude of size and performance advantages that can address a variety of applications in the aerospace sector,” added Jeff Cunningham, investment manager at Lockheed Martin Ventures. 
“Lockheed Martin Ventures is excited to make this investment and to continue our relationship with 3DGS.” 3DGS will use the Series B1 funds to purchase high volume manufacturing equipment and expand its manufacturing cleanrooms. The company will also increase headcount in its production, internal sales management and engineering teams to support rapidly growing demand from its customers. About 3D Glass Solutions 3D Glass Solutions (3DGS) is a world-class expert on the fabrication of electronic packages and devices using photo-definable glass-ceramics. The company manufactures a wide variety of glass-based, system-in-package (SiP) devices and components using its patented low-loss photosensitive APEX® glass ceramic technology for applications in RF electronics and photonics used in automotive radar, IC electronics, medical, aerospace, defense, wireless infrastructure, mobile handset and IoT industries. 3DGS offers high-precision products with exceptional high-frequency and low-loss properties. 3DGS glass ceramic-based RF products can be combined with any number of designs or devices to create incredibly unique and valuable SiP products. The company has created foundational patent positions related to all photosensitive glass-ceramic materials and devices and owns the fundamental intellectual property for all four positions (materials, design, systems and manufacturing) related to glass-ceramic devices for the electronics packaging industry. 3DGS leverages its unique product solutions to provide device manufacturing and systems integration services for several standard and custom products. To learn more about 3DGS, visit www.3DGSinc.com. APEX® is a registered trademark of 3D Glass Solutions Inc. 
View source version on businesswire.com: https://www.businesswire.com/news/home/20211130005042/en/ Olivia Metcalfe Townsend Team [email protected] "
3,656
2,021
"Amazon debuts IoT TwinMaker and FleetWise | VentureBeat"
"https://venturebeat.com/apps/amazon-debuts-iot-twinmaker-and-fleetwise"
"Amazon debuts IoT TwinMaker and FleetWise AWS CTO Werner Vogels onstage November 29, 2018 at re:Invent. Amazon today announced the new Amazon Web Services (AWS) IoT TwinMaker, a service designed to make it easier for developers to create digital twins of real-time systems like buildings, factories, industrial equipment, and product lines. Alongside this, the company debuted AWS IoT FleetWise, an offering that makes it ostensibly easier and more cost-effective for automakers to collect, transform, and transfer vehicle data in the cloud in near-real-time. “Digital twin” approaches to simulation have gained currency in other domains. For instance, London-based SenSat helps clients in construction, mining, energy, and other industries create models of locations for projects they’re working on. 
GE offers technology that allows companies to model digital twins of actual machines and closely track performance. And Microsoft provides Azure Digital Twins and Project Bonsai, which model the relationships and interactions between people, places, and devices in simulated environments. With IoT TwinMaker, Amazon says that customers can leverage prebuilt connectors to data sources like equipment, sensors, video feeds, and business applications to automatically build knowledge graphs and 3D visualizations. IoT TwinMaker supplies dashboards to help visualize operational states and updates in real time, mapping out the relationships between data sources. To help developers create web-based apps for end-users, the IoT TwinMaker comes with a plugin for Amazon Managed Grafana, Amazon’s fully managed service for the visualization platform from Grafana Labs. Grafana’s apps can enable users to observe and interact with digital twins created using IoT TwinMaker. IoT FleetWise As for IoT FleetWise, it enables AWS customers to collect and standardize data across fleets of upwards of millions of vehicles. IoT FleetWise can apply intelligent filtering to extract only what’s needed from connected vehicles to reduce the volume of data being transferred. Moreover, it features tools that allow automakers to perform remote diagnostics, analyze fleet health, prevent safety issues, and improve autonomous driving systems. As Amazon explains in a press release: “Automakers start in the AWS management console by defining and modeling vehicle attributes (e.g., a two-door coupe) and the sensors associated with the car’s model, trim, and options (e.g., engine temperature, front-impact warning, etc.) for individual vehicle types or multiple vehicle types across their entire fleet. 
After vehicle modeling, automakers install the IoT FleetWise application on the vehicle gateway (an in-vehicle communications hub that monitors and collects data), so it can read, decode, and transmit information to and from AWS.” “The cloud is fundamentally changing [the automobile] industry, including how vehicles are designed and manufactured, the features they offer, and how we drive,” AWS CEO Adam Selipsky said onstage at Amazon’s re:Invent 2021 conference. “[Automakers] are designing vehicles that are fused with software, connected by sensors, and systems generating [enormous] amounts of data.” Vehicle telematics — a method of monitoring and harvesting data from any moving asset, including cars and trucks — could be a boon for automakers in the coming years, not to mention service providers like Amazon. Monetizing onboard services could create $1.5 trillion, or 30% more, in additional revenue potential by 2030, according to McKinsey. One analysis found that even during the height of the pandemic, the demand for fleet management and telematics software has continued to grow at rates of 10.6% and 9.9%, respectively. As Sudip Saha noted in Automotive World, the current health crisis has proven to be an opportunity to showcase the benefits of effective fleet management systems — especially in the context of the ecommerce boom. Businesses that delivered when contactless and remote tracking of consignments was the need of the hour have largely fared better than their competitors. VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings. The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles!
VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat. All rights reserved. "
3,657
2,021
"LinkedIn and Intel tech leaders on the state of AI | VentureBeat"
"https://venturebeat.com/ai/linkedin-and-intel-tech-leaders-on-the-state-of-ai"
"LinkedIn and Intel tech leaders on the state of AI Disclosure: The author is the managing director of Connected Data World. AI is on a roll. Adoption is increasing across the board, and organizations are already seeing tangible benefits. However, the definition of what AI is and what it can do is up for grabs, and the investment required to make it work isn’t always easy to justify. Despite AI’s newfound practicality, there’s still a long way to go. Let’s take a tour through the past, present, and future of AI, and learn from leaders and innovators from LinkedIn, Intel Labs, and cutting-edge research institutes. Connecting data with duct tape at LinkedIn Mike Dillinger is the technical lead for Taxonomies and Ontologies at LinkedIn’s AI Division. He has a diverse background, ranging from academic research to consulting on translation technologies for Fortune 500 companies.
For the last several years, he has been working with taxonomies at LinkedIn. LinkedIn relies heavily on taxonomies. As the de facto social network for professionals, launching a skill-building platform is a central piece in its strategy. Following CEO Ryan Roslansky’s statement, LinkedIn Learning Hub was recently announced, powered by the LinkedIn Skills Graph, dubbed “the world’s most comprehensive skills taxonomy.” The Skills Graph includes more than 36,000 skills, more than 14 million job postings, and the largest professional network with more than 740 million members. It empowers LinkedIn users with richer skill development insights, personalized content, and community-based learning. For Dillinger, however, taxonomies may be overrated. In his upcoming keynote at Connected Data World 2021, Dillinger is expected to refer to taxonomies as the duct tape of connecting data. This alludes to Perl, the programming language that was often referred to as the duct tape of the internet. “Duct tape is good because it’s flexible and easy to use, but it tends to hide problems rather than fix them,” Dillinger said. A lot of effort goes into building taxonomies, making them correct and coherent, then getting sign-off from key stakeholders. But this is when problems start appearing. Key stakeholders such as product managers, taxonomists, users, and managers take turns punching holes in what was carefully constructed. They point out issues of coverage, accuracy, scalability, and communication. And they’re all right from their own point of view, Dillinger concedes. So the question is — what gives? Dillinger’s key thesis is that taxonomies are simply not very good as a tool for knowledge organization. That may sound surprising at first, but coming from someone like Dillinger, it carries significant weight.
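Dillinger's complaint about taxonomies can be made concrete with a toy sketch. The skill names and relation types below are hypothetical, not LinkedIn's actual Skills Graph schema: a strict taxonomy can file each concept in only one place, while a graph of explicit, typed relations has no such restriction.

```python
# Toy sketch (hypothetical skill names; not LinkedIn's actual schema):
# a strict taxonomy files each concept under exactly one parent,
# while a graph of explicit relations has no such restriction.

class Taxonomy:
    """Single-parent hierarchy: every concept gets exactly one place."""
    def __init__(self):
        self.parent = {}

    def add(self, child, parent):
        if child in self.parent:
            # The tree cannot say "ML belongs to CS *and* Statistics".
            raise ValueError(f"{child!r} is already filed under {self.parent[child]!r}")
        self.parent[child] = parent

tax = Taxonomy()
tax.add("Machine Learning", "Computer Science")
try:
    tax.add("Machine Learning", "Statistics")  # equally true, but inexpressible
except ValueError as err:
    print(err)

# Explicit, typed relations (a tiny knowledge graph) capture both facts:
triples = {
    ("Machine Learning", "subfield_of", "Computer Science"),
    ("Machine Learning", "subfield_of", "Statistics"),
    ("Machine Learning", "requires_skill", "Linear Algebra"),
}
parents = {o for s, p, o in triples if s == "Machine Learning" and p == "subfield_of"}
print(sorted(parents))
```

The coverage and accuracy complaints stakeholders raise often trace back to exactly this forced single-parent choice.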
Dillinger goes a long way to elaborate on the issues with taxonomies, but perhaps more interestingly, he also provides hints for a way to alleviate those issues: “The good news is that we can do much better than taxonomies. In fact, we have to do much better. We’re building the foundations for a new generation of semantic technologies and artificial intelligence. We have to get it right,” says Dillinger. Dillinger goes on to talk about more reliable building blocks than taxonomies for AI. He cites concept catalogs, concept models, explicit relation concepts, more realistic epistemological assumptions, and next-generation knowledge graphs. It’s the next generation, Dillinger says, because today’s knowledge graphs do not always use concepts with explicit human-readable semantics. These have many advantages over taxonomies, and we need to work on people, processes, and tools levels to be able to get there. Thrill-K: Rethinking higher machine cognition The issue of knowledge organization is a central one for Gadi Singer as well. Singer is VP and director of Emergent AI at Intel Labs. With one technology after another, he has been pushing the leading edge of computing for the past four decades and has made key contributions to Intel’s computer architectures, hardware and software development, AI technologies, and more. Singer said he believes that the last decade has been phenomenal for AI, mostly because of deep learning, but there’s a next wave that is coming: a “third wave” of AI that is more cognitive, has a better understanding of the world, and higher intelligence. This is going to come about through a combination of components: “It’s going to have neural networks in it. It’s going to have symbolic representation and symbolic reasoning in it. And, of course, it’s going to be based on deep knowledge. 
And when we have it, the value that is provided to individuals and businesses will be redefined and much enhanced compared to even the great things that we can do today,” Singer says. In his upcoming keynote for Connected Data World 2021, Singer will elaborate on Thrill-K, his architecture for rethinking knowledge layering and construction for higher machine cognition. Singer distinguishes recognition, as in the type of pattern-matching operation using shallow data and deep compute at which neural networks excel, from cognition. Cognition, Singer argues, requires understanding the very deep structure of knowledge. To be able to process even seemingly simple questions requires organizing an internal view of the world, comprehending the meaning of words in context, and reasoning on knowledge. And that’s precisely why even the more elaborate deep learning models we have currently, namely language models, are not a good match for deep knowledge. Language models contain statistical information, factual knowledge, and even some common sense knowledge. However, they were never designed to serve as a tool for knowledge organization. Singer believes there are some basic limitations in language models that make them good, but not great, for the task. Singer said that what makes for a great knowledge model is the capability to perform well across five areas: scalability, fidelity, adaptability, richness, and explainability. He adds that sometimes there’s so much information learned in language models that we can extract it and enhance dedicated knowledge models. To translate the principles of having a great knowledge model to an actual architecture that can support the next wave of AI, Singer proposes an architecture for knowledge and information organized at three levels, which he calls Thrill-K. The first level is for the most immediate knowledge, which Singer calls the Giga scale, and believes should sit in a neural network.
The next level of knowledge is the deep knowledge base, such as a knowledge graph. This is where intelligible, structured, explicit knowledge is stored at the Terascale, available on demand for the neural network. And, finally, there’s the world information and world knowledge level, where data is stored at the Zetta scale. Knowledge, Singer argues, is the basis for making reasoned, intelligent decisions. It can adapt to new circumstances and new tasks. That’s because the data and the knowledge are not structured for a particular task, but are there with all their richness and expressivity. It will take concerted effort to get there, and Intel Labs on its part is looking into aspects of NLP, multi-modality, common sense reasoning, and neuromorphic computing. Systems that learn and reason If knowledge organization is something that both Dillinger and Singer value as a key component in an overarching framework for AI, for Frank van Harmelen it’s the centerpiece of his entire career. Van Harmelen leads the Knowledge Representation & Reasoning Group in the Computer Science Department of the VU University Amsterdam. He is also principal investigator of the Hybrid Intelligence Centre, a $22.7 million (€20 million), ten-year collaboration between researchers at six Dutch universities into AI that collaborates with people instead of replacing them. Van Harmelen notes that after the breakthroughs of machine learning (deep learning or otherwise) in the past decade, the shortcomings of machine learning are also becoming increasingly clear: unexplainable results, data hunger, and limited generalizability are all becoming bottlenecks. In his upcoming keynote at Connected Data World 2021, Van Harmelen will look at how the combination with symbolic AI in the form of very large knowledge graphs can give us a way forward: toward machine learning systems that can explain their results, that need less data, and that generalize better outside their training set.
The emphasis in modern AI is less on replacing people with AI systems and more on AI systems that collaborate with people and support them. For Van Harmelen, however, it’s clear that current AI systems lack background knowledge, contextual knowledge, and the capability to explain themselves, which makes them not very human-centered: “They can’t support people and they can’t be competent partners. So what’s holding AI back? Why are we in this situation? For a long time, AI researchers have locked themselves into one of two towers. In the case of AI, we could call these the symbolic AI tower and the statistical AI tower.” If you’re in the statistical AI camp, you build your neural networks and machine learning programs. If you’re in the symbolic AI camp, you build knowledge bases and knowledge graphs and you do inference over them. Either way, you don’t need to talk to people in the other camp, because they’re wrong anyway. What’s actually wrong, argues Van Harmelen, is this division. Our brains work in both ways, so there’s no reason why approximating them with AI should rely exclusively on either approach. In fact, those approaches complement each other very well in terms of strengths and weaknesses. Symbolic AI, most famously knowledge graphs, is expensive to build and maintain as it requires manual effort. Statistical AI, most famously deep learning, requires lots of data, plus oftentimes also lots of effort. Both suffer from the “performance cliff” issue (i.e., their performance drops under certain circumstances, but the circumstances and the way it drops differ). Van Harmelen provides many examples of practical ways in which symbolic and statistical AI can complement each other. Machine learning can help build and maintain knowledge graphs, and knowledge graphs can provide context to improve machine learning: “It is no longer true that symbolic knowledge is expensive and we cannot obtain it all.
Very large knowledge graphs are witness to the fact that this symbolic knowledge is very well available, so it is no longer necessary to learn what we already know. We can inject what we already know into our machine learning systems, and by combining these two types of systems produce more robust, more efficient, and more explainable systems,” says Van Harmelen. The pendulum has been swinging back and forth between symbolic and statistical AI for decades now. Perhaps it’s a good time for the two camps to reconcile and start a conversation. To build AI for the real world, we’ll have to connect more than data. We’ll also have to connect people and ideas. "
3,658
2,021
"How Hiya taps AI to kill phone spam | VentureBeat"
"https://venturebeat.com/ai/how-hiya-taps-ai-to-kill-phone-spam"
"How Hiya taps AI to kill phone spam Have you noticed that you’re getting more calls correctly identified as spam on your phones? Well, Hiya probably has something to do with that. The Seattle, Washington-based startup, with major clients in telecoms, is using artificial intelligence to detect 20% more illegal and unwanted calls than existing technologies currently do, CEO and founder Alex Algard told VentureBeat. The company last week introduced what it calls adaptive AI as an addition to its Hiya Protect product, which is used by wireless carriers, smartphone makers, and app developers as part of its service packages. It’s available in services such as AT&T Call Protect, Samsung Smart Call, and the Hiya app. Algard said the new technology is informed by live data streams from carriers, devices, and apps.
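Hiya has not published the internals of its adaptive AI, but the behavior it describes (blocking by live traffic patterns rather than by a static list of known numbers) can be illustrated with a toy sketch; the thresholds, features, and phone numbers below are invented for illustration only.

```python
from collections import defaultdict, deque

# Hypothetical sketch: flag callers by live traffic pattern, not by a static
# number blocklist, so a spammer who rotates numbers is caught again quickly.
WINDOW_SECONDS = 3600
VOLUME_THRESHOLD = 100       # calls per window before a number is scrutinized
ANSWER_RATE_THRESHOLD = 0.1  # spam campaigns are rarely answered

class PatternScorer:
    def __init__(self):
        self.events = defaultdict(deque)  # number -> deque of (ts, answered)

    def observe(self, number, ts, answered):
        q = self.events[number]
        q.append((ts, answered))
        while q and ts - q[0][0] > WINDOW_SECONDS:
            q.popleft()  # drop events that fell out of the sliding window

    def is_suspect(self, number):
        q = self.events[number]
        if len(q) < VOLUME_THRESHOLD:
            return False
        answered = sum(1 for _, a in q if a)
        return answered / len(q) < ANSWER_RATE_THRESHOLD

scorer = PatternScorer()
for i in range(200):                               # burst of unanswered calls
    scorer.observe("+15550100", ts=i, answered=False)
scorer.observe("+15550199", ts=0, answered=True)   # an ordinary caller
print(scorer.is_suspect("+15550100"), scorer.is_suspect("+15550199"))
```

Because the score depends only on recent behavior, a spammer who moves to a fresh number is re-flagged as soon as the new number exhibits the same pattern, with no blocklist update needed.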
“Adaptive AI observes the patterns left by spammers in the network traffic and adapts in real time to block them without the need for human retraining or historical data,” he said. The company claims its new capability is much more effective than conventional tactics that only react to known phone numbers used by spammers. The AI adaptivity comes into play when spammers change numbers or carriers, which Algard said happens constantly. How much phone spam is there? To quantify the scale of phone spam, Hiya, which has roughly 200 million active users through its carrier clients, offered these statistics: More than 50 billion spam calls are made to Americans each year (16 per month per user) Hiya analyzes more than 13 billion calls per month 94% of unidentified calls go unanswered About one-third of Americans lose money to phone scams each year. On average, each victim lost $182 to phone scams last year. This means Americans collectively lost about $14 billion to scam calls in 2020. The most common ways scammers make money are stealing personal information, selling fake products or services, and gaining access to financial accounts. An increasing number of spammers are deploying illegal tactics to generate business leads for legitimate or illegitimate businesses, such as car or computer warranty calls. Algard said he started Hiya in 2016 as a spin-out from the previous company he founded, WhitePages.com. “WhitePages is a directory service site. We identified some potential use cases that we thought we could build an incubator business around — basically, a caller ID service on the old landlines,” Algard said. “We thought it was odd that on mobile devices, there was no caller ID.
So we figured that with the advent of mobile apps, we could actually solve that use case with an automated caller ID service for people who just download the app that we provided. And that turned out to get a lot of consumer interest; tons of people downloaded the app.” How Hiya puts AI to work Alex Algard shared the following additional insights in an interview with VentureBeat regarding how technologists, data architects, and software developers can use adaptive AI. VentureBeat: What AI and ML tools are you using specifically? Algard: Hiya has unique needs in developing models that can handle the challenges that the scale and volume of voice networks pose. The primary workload is the call analysis load, which must run in real time on live data streams with very low latency and high throughput, fast enough to analyze calls as they are being made, and must scale to analyze over 1 billion API calls per day. This primary workflow is supported by our proprietary Hiya MLOps system that we’ve fine-tuned to our problem. It includes internal ML-model lifecycle management and an ensemble-based prediction system to capture the many telecom scammer scenarios and geographies that we deal with to provide global call protection. For other workloads, we pull from numerous ML platforms as needed. For example, we use SageMaker to create, train, and deploy systems that look at a robocall’s network characteristics and analyze recordings. VentureBeat: Are you using models and algorithms out of the box — for example, from DataRobot or other sources? Algard: Because of the unique challenges of live data streams and the scale of the networks we run on, we are building and maintaining our own custom frameworks. Out-of-the-box or auto-ML solutions haven’t proven viable for the size and scale of the issues we’re tackling. VentureBeat: What cloud service are you using mainly? Algard: We use AWS and are expanding to support Microsoft Azure.
VentureBeat: Are you using a lot of the AI workflow tools that come with that cloud? Algard: We use underlying AWS services such as EC2 and DynamoDB for computing, data storage, and global synchronization. And for data post-processing and data prep, we use tools from multiple sources: AWS Glue, Apache Airflow, Zeppelin, Jupyter, etc. VentureBeat: How much do you do yourselves? Algard: Quite a lot. Scammers and illegal callers are sophisticated and constantly changing tactics to avoid detection. We’ve invested in a dedicated team of data scientists that focuses on the illegal caller industry and is constantly iterating and adjusting our AI model engine to keep pace with them. Many of the models we employ are on their fifth or sixth generation as we refine them to take on specific scammer tactics. We are active in the AI/ML community and make use of the latest technologies and approaches when we can, but often we have to develop new approaches on our own. Adaptive AI is an example of an approach that we’ve had to develop in-house. VentureBeat: How are you labeling data for the ML and AI workflows? Algard: Data labeling is the most important aspect of what we do that makes Hiya so effective at defeating illegal callers globally. We’ve made the investment to do this in-house because of its impact on our accuracy. We use data from several sources, including call event data from the Hiya network, scam traps, user reports, federal compliance data, STIR/SHAKEN, and custom data sources from our carrier and distribution partners. VentureBeat: Can you give us a ballpark estimate on how much data you are processing? Algard: Hiya deals with an incredible amount of data: 200M users worldwide, 450,000 ML model recalculations per second, and 20GB/hour of ML model changes pushed to our edge service. Our model recalculation requires the biggest AWS EC2 instance available.
"
3,659
2,021
"Google and Qualcomm collaborate to accelerate AI development | VentureBeat"
"https://venturebeat.com/ai/google-and-qualcomm-collaborate-to-accelerate-ai-development"
"Google and Qualcomm collaborate to accelerate AI development Qualcomm today at its Snapdragon Summit 2021 announced a collaboration with Google Cloud to bring the latter’s Neural Architecture Search to Qualcomm platforms. The move is designed to speed up the development of AI models at the edge. Qualcomm claims the announcement will make it the first system-on-a-chip (SoC) customer to offer the Google Cloud Vertex AI Neural Architecture Search service. It will first be available on the Snapdragon 8 Gen 1 Mobile Platform, followed by the Snapdragon portfolio across mobile, IoT, automotive, and XR platforms. As AI/ML hardware has become more widespread, attention has turned to the software stack, which often consists of point solutions.
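The core idea behind neural architecture search can be shown with a toy random-search sketch. The search space, scoring function, and numbers below are invented for illustration and bear no relation to how Vertex AI NAS actually searches; in a real system, each candidate would be trained and scored on validation accuracy and on-device latency, which is the expensive part.

```python
import random

# Toy sketch of neural architecture search (NAS): instead of hand-designing a
# network, search over a space of candidate architectures and keep the best.
random.seed(0)

SEARCH_SPACE = {
    "num_layers": [2, 4, 8],
    "width": [64, 128, 256],
    "kernel": [3, 5, 7],
}

def sample_architecture():
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def evaluate(arch):
    # Stand-in for "train the candidate and measure validation accuracy,
    # penalized by latency" -- a made-up score, purely for illustration.
    accuracy_proxy = arch["num_layers"] * arch["width"]
    latency_penalty = arch["num_layers"] * arch["kernel"] ** 2
    return accuracy_proxy - latency_penalty

best = max((sample_architecture() for _ in range(50)), key=evaluate)
print(best)
```

Production NAS systems replace this blind random sampling with guided search (reinforcement learning, evolutionary methods, or differentiable relaxations), which is what makes searching enormous spaces tractable.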
With this collaboration, Qualcomm aims to optimize MLOps workflows for AI and speed up the development of AI models for Snapdragon at the edge. Google Cloud announced Vertex AI in May as a unified platform for developing, deploying, and maintaining AI models. According to Google, training models with Vertex AI required almost 80% fewer lines of code compared to other platforms. Google claims it’s the same toolkit that is used internally to power Google, ranging from computer vision to language and structured data. Vertex AI consists of various tools, but Qualcomm specifically called out the Neural Architecture Search. As the name implies, it seeks to optimize AI models. Vertex AI NAS will be integrated into the Qualcomm Neural Processing SDK and will run on the Qualcomm AI Engine. “With this collaboration, Qualcomm Technologies will now be able to build and optimize new AI models in weeks rather than months, and we’re thrilled at the impact this will have on people using Snapdragon-powered devices,” June Yang, vice president of Cloud AI and Industry Solutions at Google Cloud, said in a statement. "
3,660
2,021
"Amazon launches SageMaker Canvas for no-code AI model development | VentureBeat"
"https://venturebeat.com/ai/amazon-launches-sagemaker-canvas-for-no-code-ai-model-development"
"Amazon launches SageMaker Canvas for no-code AI model development The Amazon logo is seen at the Young Entrepreneurs fair in Paris During a keynote address today at its re:Invent 2021 conference, Amazon announced SageMaker Canvas, which enables users to create machine learning models without having to write any code. Using SageMaker Canvas, Amazon Web Services (AWS) customers can run a machine learning workflow with a point-and-click user interface to generate predictions and publish the results. Low- and no-code platforms allow developers and non-developers alike to create software through visual dashboards instead of traditional programming. Adoption is on the rise, with a recent OutSystems report showing that 41% of organizations were using a low- or no-code tool in 2019/2020, up from 34% in 2018/2019.
“Now, business users and analysts can use Canvas to generate highly accurate predictions using an intuitive, easy-to-use interface,” AWS CEO Adam Selipsky said onstage. “Canvas uses terminology and visualizations already familiar to [users] and complements the data analysis tools that [people are] already using.” AI without code With Canvas, Selipsky says that customers can browse and access petabytes of data from both cloud and on-premises data sources, such as Amazon S3 and Redshift databases, as well as local files. Canvas uses automated machine learning technology to create models, and once the models are created, users can explain and interpret them and share them with each other to collaborate and enrich insights. “With Canvas, we’re making it even easier to prepare and gather data for machine learning to train models faster and expand machine learning to an even broader audience,” Selipsky added. “It’s really going to enable a whole new group of users to leverage their data and to use machine learning to create new business insights.” Canvas follows on the heels of SageMaker improvements released earlier in the year, including Data Wrangler, Feature Store, and Pipelines. Data Wrangler recommends transformations based on data in a target dataset and applies these transformations to features. Feature Store acts as a storage component for features and can access features in either batches or subsets. As for Pipelines, it allows users to define, share, and reuse each step of an end-to-end machine learning workflow with preconfigured customizable workflow templates while logging each step in SageMaker Experiments. With upwards of 82% of firms saying that custom app development outside of IT is important, Gartner predicts that 65% of all apps will be created using low- and no-code platforms like Canvas by 2024.
Another study reports that 85% of 500 engineering leaders think that low- and no-code will be commonplace within their organizations as early as 2021. If the current trend holds, the market for low- and no-code could climb to between $13.3 billion and $17.7 billion in 2021 and between $58.8 billion and $125.4 billion in 2027. "
3,661
2,021
"Amazon announces Graviton3 processors for AI inferencing | VentureBeat"
"https://venturebeat.com/ai/amazon-announces-graviton3-processors-for-ai-inferencing"
"At its re:Invent 2021 conference today, Amazon announced Graviton3, the next generation of its custom ARM-based chip for AI inferencing applications. Soon to be available in Amazon Web Services (AWS) C7g instances, the company says that the processors are optimized for workloads including high-performance compute, batch processing, media encoding, scientific modeling, ad serving, and distributed analytics. Alongside Graviton3, Amazon unveiled Trn1, a new instance for training deep learning models in the cloud — including models for apps like image recognition, natural language processing, fraud detection, and forecasting. It’s powered by Trainium, an Amazon-designed chip that the company last year claimed would offer the most teraflops of any machine learning instance in the cloud. 
(A teraflop translates to a chip being able to process 1 trillion calculations per second.) As companies face pandemic headwinds including worker shortages and supply chain disruptions, they’re increasingly turning to AI for efficiency gains. According to a recent Algorithmia survey, 50% of enterprises plan to spend more on AI and machine learning in 2021, with 20% saying they will be “significantly” increasing their budgets for AI and ML. AI adoption is, in turn, driving cloud growth — a trend of which Amazon is acutely aware, hence the continued investments in technologies like Graviton3 and Trn1. Graviton3 AWS CEO Adam Selipsky says that Graviton3 is up to 25% faster for general-compute workloads and provides two times faster floating-point performance for scientific workloads, two times faster performance for cryptographic workloads, and three times faster performance for machine learning workloads versus Graviton2. Moreover, Graviton3 uses up to 60% less energy for the same performance compared with the previous generation, Selipsky claims. Graviton3 also includes a new pointer authentication feature that’s designed to improve overall security. Before return addresses are pushed onto the stack, they’re first signed with a secret key and additional context information, including the current value of the stack pointer. When the signed addresses are popped off the stack, they’re validated before being used. An exception is raised if the address isn’t valid, blocking attacks that work by overwriting the stack contents with the address of harmful code. As with previous generations, Graviton3 processors include dedicated cores and caches for each virtual CPU, along with cloud-based security features. 
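The pointer authentication scheme described above can be sketched in a few lines of Python. This is a software analogy only: on Graviton3 the signing and checking happen in hardware, with a key that software cannot read, and the function names below are invented for the illustration.

```python
import hmac
import hashlib

SECRET_KEY = b"per-process-secret"  # in hardware, this key is not software-visible


def sign_address(return_addr, stack_pointer):
    """MAC over the return address plus context (the current stack pointer)."""
    msg = return_addr.to_bytes(8, "little") + stack_pointer.to_bytes(8, "little")
    return hmac.new(SECRET_KEY, msg, hashlib.sha256).digest()[:8]


def push(stack, return_addr, stack_pointer):
    """Push a return address along with its authentication tag."""
    stack.append((return_addr, sign_address(return_addr, stack_pointer)))


def pop(stack, stack_pointer):
    """Pop a return address, validating its tag before it may be used."""
    return_addr, tag = stack.pop()
    if not hmac.compare_digest(tag, sign_address(return_addr, stack_pointer)):
        raise RuntimeError("pointer authentication failed: stack was tampered with")
    return return_addr


stack = []
push(stack, 0x4000_1234, 0x7FFF_0000)

# An attacker overwrites the saved address but cannot forge a matching tag
stack[-1] = (0xDEAD_BEEF, stack[-1][1])
try:
    pop(stack, 0x7FFF_0000)
except RuntimeError as err:
    print(err)  # the overwrite is detected before the address is ever used
```

Because the tag also covers the stack pointer, even replaying a validly signed address at a different stack depth fails validation.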
C7g instances will be available in multiple sizes, including bare metal, and Amazon claims that they’re the first in the cloud industry to be equipped with DDR5 memory, up to 30Gbps of network bandwidth, and Elastic Fabric Adapter support. Trn1 According to Selipsky, Trn1, Amazon’s instance for machine learning training, delivers up to 800Gbps of networking bandwidth, making it well-suited for large-scale, multi-node distributed training use cases. Customers can leverage clusters of up to tens of thousands of Trn1 instances for training models containing upwards of trillions of parameters. Trn1 supports popular frameworks including Google’s TensorFlow, Facebook’s PyTorch, and MXNet and uses the same Neuron SDK as Inferentia, the company’s cloud-hosted chip for machine learning inference. Amazon is quoting 30% higher throughput and 45% lower cost-per-inference compared with the standard AWS GPU instances. "
3,662
2,021
"Zendesk acquires Momentive to boost its customer analytics offerings | VentureBeat"
"https://venturebeat.com/uncategorized/zendesk-acquires-momentive-to-boost-its-customer-analytics-offerings"
"Zendesk cofounder and CEO Mikkel Svane at his company's Relate customer conference in San Francisco, Calif., on May 11, 2016. Zendesk today announced that it’s entered into an agreement to acquire Momentive, including the latter’s popular SurveyMonkey platform. According to Zendesk CEO Mikkel Svane, the purchase will enable Zendesk customers to “build more meaningful relationships” by providing opportunities for Momentive and Zendesk to cross-sell and co-develop existing and future products, potentially driving Zendesk’s 2024 revenue to $3.5 billion. “The SurveyMonkey brand is iconic, and we’ve admired their business from afar since the inception of Zendesk. They truly democratized an industry — almost everyone in the world has responded to one of their surveys at some point,” Svane said in a statement. 
“We’re very excited to have them join the Zendesk mission along with Momentive’s market research and insights products and together create a powerful new customer intelligence company. We will deliver a rich, colorful picture of every customer, so businesses really understand their customers and can build more authentic relationships.” Launched in 1999, Momentive — formerly SurveyMonkey — offers cloud-based software to support service solutions across brand and market insights; product, employee, and customer experiences; online survey development; and a suite of paid backend programs. Founded by Ryan Finley and Chris Finley, the company sold a majority interest to Spectrum Equity and Bain Capital in 2009. Beginning in 2013, Momentive significantly expanded its operations, announcing HIPAA-compliant features for premium subscription holders and SurveyMonkey Genius, which estimates survey performance and makes suggestions to increase a survey’s effectiveness. 2017 marked the debut of another new product, SurveyMonkey CX, aimed at assisting organizations in managing their customer experience programs. After acquiring a number of startup businesses — Precision Polling, Wufoo, Zoomerang, Fluidware, TechValidate, Usabilla, GetFeedback, and a 49.9% stake in the UK-based Clicktools — Momentive went public under the name SurveyMonkey in 2018. In 2021, it announced its rebranding to Momentive to encompass its SurveyMonkey, GetFeedback, and Momentive Insights services. As of October 2018, Momentive’s revenue stood at $114.8 million. The company, which has 20 million users across 345,000 paying companies, has an estimated market cap of $3.61 billion. 
“We look forward to combining with Zendesk to advance our mission and accelerate our long-term growth strategy,” Momentive CEO Zander Lurie, who will continue to lead Momentive following the acquisition, said in a press release. “This is a testament to the strength of our agile products and talented team. Zendesk and Momentive share a culture centered around our people, our communities, and the customers we serve. The synergies between our companies are proximate and compelling. We are uniquely positioned to make customer intelligence a reality while delivering significant value for our shareholders.” Customer experience research Zendesk, which was founded in Copenhagen, Denmark, in 2007, went public in 2014 after raising about $86 million in venture capital investments. In recent years, it has leaned heavily into automation, acquiring customer service automation startup Cleverly.ai and introducing a chatbot that has conversations with customers and attempts to help them find useful information, with algorithms that better predict answers. In its vision for the Momentive acquisition, Zendesk says that the combined platforms will collect “critical information about customer needs, experiences, and expectations” while helping companies “bring a customer into focus” by “combining transactional data with market research and insights.” It’ll also enable teams to “take action with the full breadth of data about their customers,” Zendesk says, as well as “feedback and market insights.” For Zendesk, which offers a platform that connects brands with customers over voice, chat, email, messaging, and social channels, the purchase appears to be a play for a larger slice of the expanding customer analytics market. According to Verified Market Research, the global customer analytics market was valued at $5.24 billion in 2020 and is projected to reach $20.82 billion by 2028, growing at a compound annual growth rate of 19.30% from 2021 to 2028. 
At its core, customer analytics is the process by which data from customer behavior is used to help make key business decisions via market segmentation and predictive analytics. Enterprises use this information for direct marketing, site selection, and customer relationship management. A recent McKinsey survey found that, among companies adopting customer analytics solutions, 50% are likely to have sales “well above” their competitors’ — versus only 22% of the laggards. “A third of all survey participants rated customer analytics as extremely important for business success, positioning it among the top five drivers of their marketing,” the coauthors of the McKinsey survey wrote. “They consider it as important as price and product management, only a few percentage points below service and actions to enhance customer experience, and far ahead of the management of advertising campaigns — which only 20% view as a key driver of success.” Zendesk notes that the Momentive transaction, which is anticipated to close in the first half of 2022, is subject to regulatory review and approval by Zendesk and Momentive stockholders. The boards of directors of both companies approved the deal, which will give Zendesk shareholders approximately 78% of the combined company and Momentive stockholders 22% (a ratio valuing Momentive stock at $28 per outstanding share). "
3,663
2,021
"Microsoft acquires AI-powered moderation platform Two Hat | VentureBeat"
"https://venturebeat.com/uncategorized/microsoft-acquires-ai-powered-moderation-platform-two-hat"
"View of a Microsoft logo on March 10, 2021, in New York. Microsoft today announced that it acquired Two Hat, an AI-powered content moderation platform, for an undisclosed amount. According to Xbox product services CVP Dave McCarthy, the purchase will combine the technology, research capabilities, teams, and cloud infrastructure of both companies to serve Two Hat’s existing and new customers and “multiple product and service experiences” at Microsoft. “Working with the diverse and experienced team at Two Hat over the years, it has become clear that we are fully aligned with the core values inspired by the vision of founder Chris Priebe to deliver a holistic approach for positive and thriving online communities,” McCarthy said in a blog post. 
“For the past few years, Microsoft and Two Hat have worked together to implement proactive moderation technology into gaming and non-gaming experiences to detect and remove harmful content before it ever reaches members of our communities.” Moderation According to the Pew Research Center, 4 in 10 Americans have personally experienced some form of online harassment. Moreover, 37% of U.S.-based internet users say they’ve been the target of severe attacks — including sexual harassment and stalking — based on their sexual orientation, religion, race, ethnicity, gender identity, or disability. Children, in particular, are frequent targets of online abuse, with one survey finding a 70% increase in cyberbullying on social media and gaming platforms during the pandemic. Priebe founded Two Hat in 2012 when he left his position as a senior app security specialist at Disney Interactive, Disney’s game development division. A former lead developer on the safety and security team for Club Penguin, Priebe was driven by a desire to tackle the issues of cyberbullying and harassment on the social web. Today, Two Hat claims its content moderation platform — which combines AI, linguistics, and “industry-leading management best practices” — classifies, filters, and escalates more than a trillion human interactions a month, including messages, usernames, images, and videos. The company also works with Canadian law enforcement to train AI to detect new child exploitative material, such as content likely to be pornographic. “With an emphasis on surfacing online harms including cyberbullying, abuse, hate speech, violent threats, and child exploitation, we enable clients across a variety of social networks across the globe to foster safe and healthy user experiences for all ages,” Two Hat writes on its website. 
Microsoft partnership Several years ago, Two Hat partnered with Microsoft’s Xbox team to apply its moderation technology to communities in Xbox, Minecraft, and MSN. Two Hat’s platform allows users to decide the content they’re comfortable seeing — and what they aren’t — which Priebe believes is a key differentiator compared with AI-powered moderation solutions like Sentropy and Jigsaw’s Perspective API. “We created one of the most adaptive, responsive, comprehensive community management solutions available and found exciting ways to combine the best technology with unique insights,” Priebe said in a press release. “As a result, we’re now entrusted with aiding online interactions for many of the world’s largest communities.” It’s worth noting that semi-automated moderation remains an unsolved challenge. Last year, researchers showed that Perspective, a tool developed by Google and its subsidiary Jigsaw, often classified online comments written in the African American vernacular as toxic. A separate study revealed that bad grammar and awkward spelling — like “Ihateyou love” instead of “I hate you” — make toxic content far more difficult for AI and machine detectors to spot. As evidenced by competitions like the Fake News Challenge and Facebook’s Hateful Memes Challenge, machine learning algorithms also still struggle to gain a holistic understanding of words in context. Revealingly, Facebook admitted that it hasn’t been able to train a model to find new instances of a specific category of disinformation: misleading news about COVID-19. And Instagram’s automated moderation system once disabled Black members 50% more often than white users. But McCarthy expressed confidence in the power of Two Hat’s product, which includes a user reputation system, supports 20 languages, and can automatically suspend, ban, and mute potentially abusive members of communities. 
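The finding about misspellings is easy to reproduce with a toy keyword filter. This is a deliberately naive illustration of the weakness, not how Two Hat or Perspective actually work:

```python
def naive_toxicity_filter(message, blocklist=("i hate you",)):
    """Flag a message only if it contains a blocklisted phrase verbatim."""
    text = message.lower()
    return any(phrase in text for phrase in blocklist)


print(naive_toxicity_filter("I hate you"))     # True: the exact phrase is caught
print(naive_toxicity_filter("Ihateyou love"))  # False: dropped spaces slip through
```

Production systems counter this with normalization and character-level models, but as the studies above show, adversarial spelling still degrades detection accuracy.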
“We understand the complex challenges organizations face today when striving to effectively moderate online communities. In our ever-changing digital world, there is an urgent need for moderation solutions that can manage online content in an effective and scalable way,” he said. “We’ve witnessed the impact they’ve had within Xbox, and we are thrilled that this acquisition will further accelerate our first-party content moderation solutions across gaming, within a broad range of Microsoft consumer services, and to build greater opportunity for our third-party partners and Two Hat’s existing clients’ use of these solutions.” "
3,664
2,021
"Texas tech workers targeted by Bay Area firms over abortion ban | VentureBeat"
"https://venturebeat.com/social/texas-tech-workers-targeted-by-bay-area-firms-over-abortion-ban"
"After the 2020 mass exodus of tech workers from pricey California to states with a lower cost of living, a new state is seeing a brain drain. This time, it’s not the rent that’s pushing them out. That sound you’re hearing is the potential vacuum as Texas tech workers rush to abandon Texas-based companies after the state’s controversial abortion ban. Throughout the month of November, ten California-based public relations agencies will host a virtual job fair for professionals hoping to relocate from Texas. San Francisco-based public relations agency Bospar hopes to lure workers back to California after the pandemic pushed many toward greener, cheaper pastures. 
As the tech industry deals with a severe shortage of data management and engineering talent, leaders are realizing that staff are driven by more than simply financial motivation. Quality of life, personal autonomy, climate, and health and wellness are top reasons individuals look to change employers and geographical locations alike. Additionally, younger workers are far less likely to stay put and tolerate less-than-desirable working conditions. As tech tackles its own diversity and inclusivity issues, championing a subject like reproductive rights signals a definite tide shift toward those goals. Texas tech workers wooed out of state “Texas has shown a tragic disregard for the health, safety, and constitutional rights of women,” said Carol Carrubba, principal of Highwire PR, one of the participating agencies in the virtual job fair. “We are determined to offer new options and a safe haven for those professionals who want to leave the state.” Carrubba’s company cites clients such as Akamai, IBM, Udemy, AppDynamics, Twilio, Oath, Cloudera, and a host of other venerable Silicon Valley leaders. While anyone who has tried to park in San Francisco may view the incoming throng of out-of-state transplants with dismay, in truth, the Bay Area has been losing residents at an alarming pace. In 2020, California saw its first population decline in the entire history of its statehood, primarily decreasing in the coastal areas where the cost of living and population density are highest. San Francisco alone was losing residents to the tune of about 7,000 households a month during the height of the pandemic. The decline in Silicon Valley has slowed considerably as workers are coaxed out of their home offices; most had relocated out of the Bay Area without actually leaving California. 
Taking back tech brains While the pandemic isn’t the only reason that workers chose to leave the Golden State, some find that their own feelings about human and health rights are more valuable than generous housing availability. The Texas abortion ban is galvanizing Bay Area professionals to make a call for action. “As an openly gay PR agency CEO with a majority of female staff, I feel an obligation to join my fellow agency owners in this effort to protect the hard-earned rights for women to control their own bodies,” said Fred Bateman, founder and CEO of Bateman Agency, whose list of past and current clients includes Atlassian, Unbounce, Heroku, Snowflake, Sumologic, and more. Bospar, the organizing agency, primarily serves tech companies such as PayPal, Polycomm, Logitech, Unisys, Cambium Networks, and Yellowbricks. The ten agencies serving tech companies joining forces around this political conversation include Bateman Agency, BOCA, Bospar, EvolveMKD, Highwire PR, Karbo Communications, Manhattan Strategies, Redwood Climate Communications, Strange Brew Strategies, and Trier and Company. “The Republican-controlled government in Texas will not be satisfied with just restricting access to reproductive services. This is just the first battle, which must be stopped, or the war on progress will be lost,” Bateman said. Is this a moment or a movement? A 2021 report from Capgemini revealed that there is a stark disparity between how executives and staff view the inclusion and diversity support of their organizations. While an overwhelming majority (85%) of executives believed their organizations offered equitable opportunities to every employee, only 18% of employees agreed. This comes on the heels of compelling research showing that female tech workers, particularly women of color, frequently feel invisible at work, and only 16% of women feel well represented on their tech teams. 
The virtual job fair targeting Texas tech workers will run for one month, beginning Thursday, November 4. "
3,665
2,021
"Security AI is the next big thing | VentureBeat"
"https://venturebeat.com/security/security-ai-is-the-next-big-thing"
"Jask's ASOC cybersecurity user interface takes inspiration from video game mini maps. In the world of cybersecurity, speed kills. In less than 20 minutes, a skilled adversary can break into an organization’s network and start exfiltrating critical data assets, and as the volume of data modern companies produce increases, it’s becoming ever more difficult for human analysts to spot malicious activity until it’s too late. This is where cybersecurity AI can come to the rescue. This hostile threat landscape has led organizations such as Microsoft to use AI as part of their internal and external cybersecurity strategy. “We’re seeing this incredible increase in the volume of attacks, from human-operated ransomware through all different kinds of zero-day attacks,” said Ann Johnson, corporate vice president of security, compliance, and identity at Microsoft. 
Given the complexity of modern attacks, “there is absolutely no way that human defenders can keep up with it, so we must have artificial intelligence capabilities in the technologies and solutions we’re providing,” Johnson said. For modern organizations, AI is now vital for keeping up with the fast-moving threat landscape and offers a variety of use cases that enterprises can leverage to improve their security posture. Shutting down attacks early with IR Perhaps the most compelling use case for AI in cybersecurity is incident response. AI enables organizations to automatically detect anomalous behavior within their environments and conduct automated responses to contain intrusions as quickly as possible. One of the most high-profile uses of AI this year occurred at the Olympic Games in Tokyo, when Darktrace AI identified a malicious Raspberry Pi IoT device that an intruder had planted in the office of a national sporting body directly involved in the Olympics. The solution detected the device port scanning nearby devices, blocked the connections, and supplied human analysts with insights into the scanning activity so they could investigate further. “Darktrace was able to weed out that there was something new in the environment that was displaying interesting behavior,” Darktrace’s chief information security officer (CISO) Mike Beck said. Beck noted there was a distinct change in behavior in terms of the communication profiles that exist inside that environment. When considering the amount of data the national body was processing in the run-up to the Olympics, it would have been impossible for a human analyst to spot such an attack at the same speed as the AI, Beck said. “In 2021, and going forward, there is too much digital data. That is the raw reality,” Beck said. 
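The behavioral signal in the Raspberry Pi incident, a single device suddenly probing many ports on its neighbors, can be approximated with a distinct-port counter. This is a toy heuristic for illustration only; Darktrace's actual models are far more sophisticated than a fixed threshold:

```python
from collections import defaultdict


def flag_port_scanners(events, port_threshold=20):
    """Flag sources that touch an unusual number of distinct destination ports.

    `events` is an iterable of (source, dest_port) connection attempts seen in
    one time window; probing many distinct ports is a classic scan signature.
    """
    ports_seen = defaultdict(set)
    for source, dest_port in events:
        ports_seen[source].add(dest_port)
    return {src for src, ports in ports_seen.items()
            if len(ports) >= port_threshold}


normal_traffic = [("laptop-7", 443), ("laptop-7", 80), ("printer-2", 9100)]
scan_traffic = [("rogue-pi", port) for port in range(1, 101)]  # 100 distinct ports
print(flag_port_scanners(normal_traffic + scan_traffic))  # {'rogue-pi'}
```

Even this crude version shows why such detection must be automated: the counting is trivial for a machine but infeasible for an analyst watching raw traffic.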
“You have to be using intelligent AI to find these attacks, and if you don’t, there’s going to be a long period of dwell time, and those attackers are going to have free rein.” Charting and labeling protected data Keeping up with the latest threats isn’t the only compelling use case that AI has within cybersecurity. AI also offers the ability to automatically process and categorize protected data so that organizations can have complete transparency over how they process this data; it also ensures that they remain compliant with data privacy regulations within an ever-more-complex regulatory landscape. “Our regulatory department tells me we evaluate 250 new regulations daily across the world to see what we need to be in compliance, so then take all of that and think about all the different laws that are being passed in different countries around data; you need machine-learning capabilities,” Johnson said. In practice, Johnson said, that means “using a lot of artificial intelligence and machine learning to understand what the data actually is and to make sure we have the commonality of labeling, to make sure we understand where the data is transiting,” a task too monumental for even the largest team of security analysts. “It’s up to AI to decide: Is this a U.S. Social Security number, or just [nine] characters that are something else?” Johnson said. By categorizing and labeling sensitive data, AI makes it easier for an organization to take inventory of what protected information is transiting where, so admins can accurately report to regulators on how that data is handled and prevent exposure to unauthorized individuals. Building zero-trust architectures At the same time, the ability to build automated zero-trust architectures and to ensure that only authorized users and devices have access to privileged information is emerging as one of the most novel use cases of AI. 
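The risk scoring at the heart of zero-trust authentication, combining time, location, device, and behavior signals into a grant-or-deny decision, can be sketched as a weighted sum. All signal names, weights, and the threshold below are invented for illustration; real systems learn these from data:

```python
def risk_score(signals):
    """Combine authentication signals into a 0-1 risk score (toy weights)."""
    weights = {
        "unusual_hour": 0.25,      # login outside the user's normal hours
        "new_location": 0.35,      # geolocation never seen for this account
        "new_device": 0.25,        # device fingerprint not previously verified
        "behavior_anomaly": 0.15,  # usage pattern deviates from the baseline
    }
    return sum(w for name, w in weights.items() if signals.get(name))


def access_decision(signals, threshold=0.5):
    """Grant or deny access based on the combined risk score."""
    if risk_score(signals) >= threshold:
        return "deny"  # or step up: require re-authentication
    return "grant"


print(access_decision({"new_device": True}))                          # grant
print(access_decision({"new_location": True, "unusual_hour": True}))  # deny
```

In practice the "deny" branch usually triggers step-up authentication rather than a hard block, which is what makes continuous verification tolerable for legitimate users.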
AI-driven authentication can ensure that nobody except authorized users has access to sensitive information. As Ann Cleaveland, executive director of the Center for Long-Term Cybersecurity at UC Berkeley, explained, “One of the most powerful emerging use cases is the implementation of so-called zero-trust architectures and continuous or just-in-time authentication of users on the system and verification of devices.” Zero-trust AI systems leverage a range of data points to accurately identify and authenticate authorized users at machine speed. “These systems are underpinned by machine-learning models that take time, location, behavior data, and other factors to assign a risk score that is used to grant or deny access,” Cleaveland said. When utilized correctly, these solutions can detect when an unauthorized individual attempts to access privileged information and block the connection. Cleaveland said that these capabilities are becoming more important following the mass shift to remote or hybrid work environments that has taken place throughout the COVID-19 pandemic. Bridging the skills gap with automation One of the main drivers of adoption for some organizations is AI’s ability to bridge the IT skills gap by enabling in-house security teams to do more with less through the use of automation. AI can automatically complete tedious manual tasks, such as processing false-positive alerts, so that analysts have a more manageable workload and additional time to focus on more productive and rewarding high-level tasks. “We’ve been able to automate 97% of routine tasks that occupied a defender’s time just a few years ago, and we can help them respond 50 percent faster,” Johnson said. “And the reason is that we can do a lot of automated threat hunting across all of the platforms in a much quicker way than a human could actually do them.” “This isn’t a takeover by AI,” Beck said. “AI is there to be a force multiplier for security teams.
It’s doing a whole load of digital work behind the scenes now to present to human teams genuine decisions that they have to make so that we have a point where those human teams can decide how to take action.” Ultimately, humans retain control, choosing which tasks are automated and how AI solutions are used. While AI is essential to cybersecurity for modern organizations, so are human analysts, and guess what? They’re not going away anytime soon. VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings. The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat. All rights reserved. "
2021
"Report: 83% of companies say 24-hour shutdown causes incapacitating damage | VentureBeat"
"https://venturebeat.com/security/report-83-of-companies-say-24-hour-shutdown-causes-incapacitating-damage"
"Report: 83% of companies say 24-hour shutdown causes incapacitating damage New research from Netenrich has found that 83% of companies suffer incapacitating business damage if they are down for 24 hours or more. Recent surges of ransomware and other attacks are creating tremendous business risk, yet security resources remain modest at around 30% of IT budgets. The disconnect between business risk and resources continues, as most security teams’ resources have received increases of less than 10% since employees began to work from home regardless of the growing attack surface and threat vectors. When security professionals are asked how they are trying to improve their company’s security posture, the top answer is upgrading tools (67%), an effort which they also report is being thwarted by integration difficulties, lack of expertise, and an overwhelming surplus of available tools.
However, only 35% plan to hire more experienced staff to bring in expertise and grow the team. This low resource rate is compounding the reliance on tools and disproportionately consuming key personnel’s time with their maintenance. Given the new threats, it’s surprising that a majority of security teams are trapped doing the same thing they have been doing for years: adding even more tools and needing more resources to manage them. However, when asked what security professionals actually want to do, the top answer is risk management, followed by incident analysis and threat modeling. This indicates a philosophical shift from reactive tools to a proactive risk-based approach. This report finds 68% of companies prioritize threats according to potential cost to the business, and the impact they fear most is loss of data and negatively affecting customer relationships. Security professionals state that threat modeling specifically enables a proactive approach: evaluating business risk by estimating the likelihood of attack success and mapping that potential breach to actual business cost. This risk-based approach prioritizes security defenses around the most likely, most impactful attack vectors. Unfortunately, this research finds that less than 40% of companies perform threat modeling today, and only 30% practice external attack surface management. With security team resources growing slowly and consumed by patching, updates, and tool upgrades, combined with a lack of expertise, it’s not surprising that 47% of companies utilize managed service providers (MSPs) today. But with extra resources available, it’s disappointing to find that only 17% of the MSPs are being employed to perform threat modeling.
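The risk-based prioritization that threat modeling enables can be sketched in a few lines: score each attack vector by expected loss (likelihood of attack success times estimated business cost) and rank defenses accordingly. The vectors and numbers below are hypothetical, chosen only to show the arithmetic.

```python
# A toy version of risk-based prioritization: rank attack vectors by
# expected loss (likelihood of success x estimated business cost).
vectors = [
    {"name": "phishing",          "likelihood": 0.30, "cost": 500_000},
    {"name": "unpatched VPN",     "likelihood": 0.10, "cost": 2_000_000},
    {"name": "exposed S3 bucket", "likelihood": 0.05, "cost": 5_000_000},
]

for v in vectors:
    v["expected_loss"] = v["likelihood"] * v["cost"]

# Highest expected loss first: that is where defenses go first.
prioritized = sorted(vectors, key=lambda v: v["expected_loss"], reverse=True)
for v in prioritized:
    print(f'{v["name"]}: ${v["expected_loss"]:,.0f}')
```

Note how the ranking differs from a naive "most likely first" ordering: the low-probability, high-cost exposure comes out on top once business cost is factored in.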
Security professionals know that they are being reactive and acknowledge that repeating the same security methods will not secure their company from growing and evolving attack risks. However, they cannot escape numerous mundane and low-value tasks siphoning their time. Looking to MSPs is a solid strategy that can free teams up to be proactive, focus more on risk management and threat modeling, and initiate the change to a proactive risk-based security approach. Read the full report by Netenrich. "
2021
"Report: 79% of IT teams have seen increase in endpoint security breaches | VentureBeat"
"https://venturebeat.com/security/report-79-of-it-teams-have-seen-increase-in-endpoint-security-breaches"
"Report: 79% of IT teams have seen increase in endpoint security breaches According to a new report by HP Wolf Security, 79% of IT teams have seen an increase in rebuild rates, indicating that hackers are becoming more successful at breaching the endpoint and compromising organizations’ devices and data. This sudden increase in rebuild rates is particularly affecting enterprises with 1,000 employees or more — organizations of this kind have the highest average number of rebuilds per month at 67.3. The study also highlights that employees are clicking on more malicious emails. Whether this is because people are less vigilant working from home or because they find it harder to determine what is safe to open, the rising number of rebuilds suggests that hackers have become more successful at breaching the endpoint through malicious links.
Overall, the figures indicate that trying to detect and prevent malicious activity in real time is unsustainable in today’s hybrid, hyper-digital world. As working from home becomes more common, users can’t continue to be the single point of security failure, and the compromise of one device should not lead to the compromise of all company assets. IT teams are in dire need of better endpoint security that equips them with greater visibility without imposing restrictions on users. To do this, organizations should look to leverage zero-trust principles and provide users with devices that have security built into the hardware. This will eradicate the need for frequent rebuilds and reduce the security burden on both employees and IT teams. The report is a comprehensive global study highlighting how the rise of hybrid work is changing user behavior and creating new cybersecurity challenges for IT departments. It’s based on global data (the U.S., the U.K., Mexico, Germany, Australia, Canada, and Japan) from a Toluna study of 1,100 global IT decision-makers and a YouGov study of 8,443 adults who used to be office workers and worked from home the same amount or more than before the pandemic. Read the full report by HP Wolf Security. "
2021
"Report: 64% of security leaders identify obstacles to internal cyber threat intelligence | VentureBeat"
"https://venturebeat.com/security/report-64-of-security-leaders-identify-obstacles-to-internal-cyber-threat-intelligence"
"Report: 64% of security leaders identify obstacles to internal cyber threat intelligence According to a new Forrester study commissioned by Cyware, 64% of respondents note that sharing cyber threat intelligence between their organizations’ security operations center (SOC), incident response, and threat intelligence teams is limited. Chief information security officers (CISOs) must better understand the technology and data access challenges preventing their SOCs from enabling the holistic defense required to secure modern organizations. Organizations cite several data silos and data access issues that hinder their ability to achieve collective defense.
Seventy-one percent of security leaders report that their teams need access to threat intelligence, security operations data, incident response data, and vulnerability data, yet 65% of respondents find it very challenging to provide security teams with cohesive data access. Top obstacles to unifying technologies include cross-team collaboration (55%), data silos within security teams (47%), discovering and accessing data (45%), and functional silos within security (45%). These common hurdles shine a spotlight on the need for organizations to better unify their security teams, processes, and technologies to bolster defenses and more proactively defend their assets. Those who acknowledge the consequence of not unifying are turning to security tools and functions, such as security orchestration, automation, and response (SOAR) technologies, to support efforts to reach collective defense. Due to difficulties unifying data access, security teams, and security technologies, firms report several consequences tied to hazardous defense issues, including slow threat response (60%), avoidable data breaches (57%), and avoidable human error (53%). In addition, there are financial impacts experienced because of a lack of security unification and automation, such as high mitigation costs and increased cybersecurity spending (51%), as well as fines and compliance issues (45%). The continuously evolving, dynamic threat landscape and cross-team collaboration challenges are motivating leaders to evaluate their firms’ existing security approaches and move toward adopting unified collective defense foundations to remain viable. The study surveyed 339 cross-industry global security leaders to better understand the top challenges preventing organizations from achieving true collective defense.
The report demonstrates common data access challenges in the modern SOC and the impact of siloed security operations on threat response efficacy. Read the full report by Cyware. "
2021
"Report: 37% of IT admins fear software vulnerabilities more than cyber threats | VentureBeat"
"https://venturebeat.com/security/report-37-of-it-admins-fear-software-vulnerabilities-more-than-cyber-threats"
"Report: 37% of IT admins fear software vulnerabilities more than cyber threats According to a new survey from cloud directory service JumpCloud, security is the top concern among IT admins, with 37% fearing software vulnerability the most. Many people in enterprises today worry about security issues, starting with executive management, CISOs, and security teams. IT administrators still remain on the front lines, however, tasked with managing user devices, identities, and access to all IT resources. When asked to rank their biggest concerns on a sliding scale, IT admins told JumpCloud that security breaches, hacker attacks, and ransomware are the top three. These include vulnerability exploits (37%), ransomware (35%), use of unsecured networks (33%), and use of the same password across different applications (30%).
Respondents also gave real-world examples, such as an employee who clicked on a bad link that cost the company $680,000. Another respondent found ransomware on the company president’s machine. In addition, the great resignation sweeping the global economy has hit IT teams hard. One respondent reported losing two IT admins in the same month, and, worst of all, was unable to find qualified replacements. Finally, remote work remains a major adjustment for organizations. Managing devices, user access, identity management, systems, networks, applications, and more is a significant source of stress. While respondents had to adjust within days due to COVID-19 last year, challenges such as lack of hybrid security remain. One respondent said, “An internal server went down, which caused everyone at home having to come into the office to connect to the Wi-Fi to enable their connections to be restarted.” The JumpCloud survey asked 509 U.S.-based and 503 U.K.-based IT professionals about their biggest fears and their scariest IT experiences over the past year. Read the full report by JumpCloud. "
2021
"The DeanBeat: Facebook's ambitions to be the metaverse | VentureBeat"
"https://venturebeat.com/games/the-deanbeat-facebooks-ambitions-to-be-the-metaverse"
"The DeanBeat: Facebook’s ambitions to be the metaverse Mark Zuckerberg speaks at Facebook Connect. Facebook CEO Mark Zuckerberg’s decision to change his corporation’s name to Meta yesterday at the Facebook Connect online event was a historic moment. The decision faced heavy criticism and turned into a meme. Seamus Blackley, co-creator of the Xbox, tweeted that Zuckerberg’s new Meta logo should come with the caption, “Here’s a schematic representation of my testicles.” Many more unkind and funny jokes mocked Zuckerberg’s grand plans. People said the move was tone deaf, coming at a time when Facebook has caught so much flak for putting profits above caring for people or political peace. While many people made fun of Zuckerberg, I saw some shrewd moves. Lots of people have mocked the metaverse, the universe of virtual worlds that are all interconnected. But I’ve been thinking about it ever since reading novels such as Snow Crash and Ready Player One.
It has been decades in the making, and while it’s here in some small forms like Second Life and Grand Theft Auto Online, and Roblox, it isn’t really here yet. As futurist Matthew Ball said, the metaverse is something that you should feel you’re inside of. And as Zuckerberg said, you should feel a sense of presence, or that feeling you have been transported somewhere else. I have been mocked for believing in the metaverse. Against the advice of some, I created a metaverse event in January 2021, and it had 30 panels on the metaverse and brought together many of gaming’s leaders. (We’ve got another conference, GamesBeat Summit Next on November 9-10 where we’re talking about the metaverse again, and our second annual metaverse event will come in January.) While people laughed at me, I stayed the course. And I never expected Facebook, a company with enough money to do the job, to embrace the metaverse like it did yesterday. Zuckerberg had some awareness he would be criticized. In a press event ahead of the announcement, he said (given the controversies), “I just want to acknowledge that I get that probably there are a bunch of people will say that this isn’t a time to focus on the future. And I want to acknowledge that there are clearly important issues to work on in the present. And we’re committed to continuing to do that and continuing the massive industry-leading effort that we have on that. At the same time, I think there will always be issues in the present. So I think it’s still important to push forward and continue trying to create what the future is.” All of us science fiction geeks have been laughed at. The smart pundits at places like CNN are mocking Zuckerberg for being such a geek. But Jensen Huang, the CEO of Nvidia, is famous for saying “we’re living in science fiction.” He meant that AI struggled for so long and finally began working around six years ago. 
Now AI is huge, with 8,500 startups working on so many new technologies, and it is leading to advances in other areas. And yes, AI is one of the necessary breakthroughs that we need to build the metaverse. Because we succeeded with AI, we have a chance to succeed with the metaverse. I think a lot of smart people realize that, and they’re dreaming about it too alongside Zuckerberg. I know we are a long way from the real metaverse that we all want to be amazingly immersive and instantaneous and ubiquitous regardless of our locations in the world. What we are hoping for is just a long way from reality. But Zuckerberg’s cool images and videos got the feeling right. If he can deliver on his animated visions with the real thing, then that would be really something. If I were to bring up my favorite adage again — follow the money — I would conclude that so much money is going into the metaverse that it is going to happen. You don’t orchestrate something so huge, something on the scale of the Manhattan Project, and then come out of it on the other side without an atomic bomb. The metaverse will happen because capital is betting that it will happen, and I’ll grant that Zuckerberg has some wisdom in seeing this. The war between Facebook and Apple Above: Facebook’s demo of the metaverse. But he’s not the only one who sees it. I see the war brewing between Facebook and Apple. They both want to control the future of computing. Apple won in one respect with mobile devices, snaring more than 600 million people who use more than a billion Apple devices. Facebook failed to get anywhere with its own smartphone efforts. But Facebook has more than 2.9 billion monthly active users who use its ad-based services such as Facebook, Instagram, and WhatsApp for free. Apple made $20.6 billion in net income on $83 billion in revenue, while Facebook made $9.2 billion in net income on $29 billion in revenue in the most recent September 30 quarter.
Apple CEO Tim Cook doesn’t really like this state of affairs as he views Facebook as a kind of parasite. The business models of these companies are opposites. Apple sells devices, while Facebook sells ads. Apple tries to charge high premiums for its products, far more than rivals such as Microsoft’s Windows or Google’s Android products. It has the highest quality, but it is often inaccessible to the masses. Facebook, by contrast, gives away its products for free, or it sells its Oculus Quest 2 virtual reality headsets for low prices ($300 or more). Its advertising model generates enough revenue to subsidize the free products. In return, Facebook asks for a lot of personal data so that it can better target ads and generate more value with each ad. Above: Facebook said the metaverse will make you feel presence. Cook considers this to be a violation of privacy, and, while it didn’t say it, it tried to hobble Facebook by changing the rules for its Identifier for Advertisers this year. The IDFA data can no longer be used unless users explicitly give their permission to be tracked for ad purposes. Not many people are doing that, based on the very direct wording of the permission prompts. This move has hurt businesses such as Facebook. Suffice to say, they’re at war. And while Zuckerberg said that Facebook is investing (or losing) $10 billion a year in its Facebook Reality Labs division, it’s fair to say that Apple has been investing a huge amount of money in its own VR/AR efforts. Some folks laughed when Magic Leap didn’t succeed after raising more than $2 billion for its mixed reality glasses. But that is chump change compared to what the likes of Apple, Facebook, Microsoft, and Google are spending. No one wants to lose this war for the next generation of computing. Zuckerberg showed he was fully aware of what is at stake here. 
“We basically think of the metaverse as the successor of the mobile internet, in the sense that the mobile Internet didn’t completely replace everything that came before it,” Zuckerberg said. “It’s not that the metaverse is going to completely replace something that comes before it. But it’s the next platform. In that sense, it’s not a thing that a company builds. It is a broader platform that I think we’re all going to contribute towards building in a way that is open and interoperable.” That’s why Zuckerberg made his moves so early. He bought Oculus for around $2 billion in 2014. He’s been regularly investing in it every year, refining headsets and stoking the VR game and app ecosystem. And yesterday, he threw down the gauntlet by changing his company’s name to Meta and even changed the stock symbol to the metaverse-like MVRS. Zuckerberg said a high-end standalone VR system, dubbed Project Cambria, is being developed, as is the Nazare augmented reality glasses. Those projects are aimed at heading off anything Apple has coming to the market. The right message Above: Facebook is headed toward the metaverse. While many made fun of this as a waste of money, Zuckerberg aligned himself with allies. He said that the metaverse should be open and not built by just one company. By doing this, he could deflect critics. Rather than have people point fingers at Facebook’s walled garden, Zuckerberg could rally some friends together and point fingers at some party that was even less open: Apple. The message was, “We’re open. We’re all in this together.” The subtext was, “They’re not.” He said that gaming will be the way that people step into the metaverse for the first time, as gaming has the infrastructure for economies through virtual goods and engagement with fans. That may be an attempt to drive a wedge, as Apple has pretty much said it doesn’t care about game developers, thanks to the IDFA moves, as those game companies have been hurt by Apple’s moves. 
Zuckerberg said that privacy and safety have to be built into the metaverse from the start. That is a move to get kids and parents on board. In his announcements, he showed that — through products like Horizon Workroom , Horizon Home, and Horizon Worlds — he plans the metaverse to be a place where we will live, work, exercise, and play. He also said that VR wasn’t the only way to log in to the metaverse. You could also do so with AR glasses, or use a PC or console or smartphone or any other device you want. It isn’t going to be something for only the elite (like Apple customers). It is going to be something for everybody. This is a populist message and one that matches the views of many open metaverse advocates. The poisoned name Above: You can log in on a work account in Facebook’s idea of VR work. However, since Facebook has faced a lot of controversies related to alleged privacy invasions, algorithmic bias, favoring profits over the mental health of its users, and antitrust issues, Zuckerberg doesn’t have the high road. His credibility has been under attack, thanks to leaks by the whistleblower Frances Haugen, who exposed toxic business practices. Facebook’s own cred is also in a tough state. Kent Bye, a VR podcaster, noted in a Clubhouse room that Oculus hasn’t behaved in the most open ways compared to its rival SteamVR. If you want to get cross-platform capability, it has been a lot easier on SteamVR than Oculus. And so a lot of people looked right through Facebook’s intentions in renaming itself. It seemed like renaming was convenient. It shifted attention to the metaverse ambition, but it also deflected criticism of the Facebook brand. Facebook has a PR problem now, and it faces deeply skeptical observers. It will be hard to win allies, win over lost users, and convince regulators it can be trusted. The long road Above: Working in a Facebook Horizon Workroom. These were actually real people talking to me. 
And so the course for Zuckerberg should be to not try to take the high road. He should take the long road. That is, he should really live up to his ambitions and knock the metaverse out of the park. If he delivers the metaverse and Apple doesn’t, it will be free. It will be accessible, and a lot of people around the world who don’t have access to the finest technology will be able to use it. That is, Zuckerberg should push his business model into a full-scale war with Apple. If he gets the users into the metaverse, the brands will come, and they will give him advertising revenues like he has never seen before. And that will enable him to subsidize his devices and bring them out at prices that everybody on the planet could afford. And I would go one step further. He talked about the creator economy, about people like streamers who are making a living by being influencers that are courted by brands. He said the goal was not only to create a good business, but to create a full economy for creators and developers, so that they get to share the benefits of the metaverse. Play-to-earn Above: Facebook’s metaverse vision. Zuckerberg should not only give away his products and services for free, as he has done in the past, but he should also pay us. He should pay us for giving his social media and his devices our attention. You’ve heard of universal basic income. Zuckerberg’s company is one of those that could make it happen, paying us to use his devices so that we can make a living in his ecosystem. If Zuckerberg does this and raises the tide for all boats, then we would be happy to give him what he wants in exchange. So far, we’ve been giving away our personal information for far too cheap. We should control it, and sell it back to him. That’s something that Apple is never going to do. It’s against its business model or its religion. It wants to respect our privacy, but it also wants to command the highest brand status and the highest profits. 
But Apple is at risk of falling further behind Facebook when it comes to reaching the most people with the next generation of computing. If Zuckerberg takes the long road, spends his money wisely, and comes up with great devices, he could reclaim his cred and lift many in the world out of poverty. Facebook, or Meta, has a natural chance to be the good guy. It should not drop this ball. Other forces are certainly at work here, like Google, Microsoft, Amazon, and even Netflix. It’s a complicated war, and not just one involving Apple and Facebook. There are also the game companies like Epic Games and others who favor decentralization, or taking power away from big tech. There are small companies now that are espousing paying us to play games, under the “ play-to-earn ” banner. And some nation-states may put their feet down in the name of holding onto power. You can argue with Zuckerberg’s ethics and business tactics. But don’t underestimate him or Meta. The first shot has been fired in the big war. GamesBeat's creed when covering the game industry is "where passion meets business." What does this mean? We want to tell you how the news matters to you -- not just as a decision-maker at a game studio, but also as a fan of games. Whether you read our articles, listen to our podcasts, or watch our videos, GamesBeat will help you learn about the industry and enjoy engaging with it. Discover our Briefings. Join the GamesBeat community! Enjoy access to special events, private newsletters and more. Games Beat Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat. All rights reserved. "
3,671
2,021
"Sony investment pushes video startup Kiswe beyond $46M in funding | VentureBeat"
"https://venturebeat.com/entrepreneur/sony-investment-pushes-video-startup-kiswe-beyond-46m-in-funding"
"Sony investment pushes video startup Kiswe beyond $46M in funding Kiswe has a video 2.0 solution. Sony has made an investment in video startup Kiswe. While the amount wasn’t disclosed, Kiswe has now raised $46 million since its inception in 2013. The Sony Innovation Fund has invested in New Jersey-based Kiswe, which is scaling to meet the demand for its cloud-based interactive video solutions for both content creators and rights holders. Other investors include Revolution’s Rise of the Rest; Hybe Corp. (owner and manager of the popular K-Pop boy band BTS); Ted Leonsis, founder and CEO of Monumental Sports and Entertainment (MSE), and New Enterprise Associates (NEA). 
The company will use the money to expand its sales efforts for Kiswe’s Cloud Video Engine for content rights holders around the world who aim to deliver interactive video experiences for their consumers and build their own audience data for marketing and learning. “When you combine sports, live music, and ecommerce, our addressable market represents a multi-billion dollar opportunity which we are uniquely positioned to address with our cloud-based production, live-streaming, fan engagement, and video commerce products,” said Kiswe CEO Mike Schabel, in a statement. “We are thrilled that Sony Innovation Fund by IGV recognizes and backs our vision.” At-home consumers have made permanent changes to online behavior, from consuming video content to shopping. The investment is a strong endorsement of Kiswe as the global leader in interactive video solutions that not only deliver extraordinarily high-quality video at record-breaking scale to consumers across the world, but also authentically recognize and engage these at-home consumers by bringing their cheering and video to the stage and to the broadcast. The home audience is huge and is served today with lean-back made-for-TV content that assumes a monolithic audience. Kiswe knows that actual audiences encompass a wide spectrum of demographic interests. They want to participate, they want to be recognized for their ‘fandomship’, and they want to matter. Above: Streaming concerts to laptops. “We think that can happen for all live content, whether that is music, sports, news, talk shows, etc.,” the company said in an email to VentureBeat. “Our technology has been in commercial use for a few years and based on data and insights, we’ve been able to create a formula that makes fans matter. 
If we can help bring that vision of video to the industry, then we’ll have accomplished what we set out to do.” In total, Kiswe helps rights holders monetize their content directly with consumers, helps consumers feel like a true part of the event, and creates untold opportunities for advertising and data insights. “Kiswe’s impressive video production and service offering is used by some of the largest rights holders and associations on some of the largest global stages and events in sports and entertainment,” said Gen Tsuchikawa, chief investment manager for Sony Innovation Fund and CEO of Innovation Growth Ventures. “Their proprietary technology solutions and knowledge put them in a strong position to shape and lead a new era of video capturing the performance of renowned stars as well as aspiring ones.” The Kiswe Cloud Video Engine is the only end-to-end SaaS solution for Video 2.0, where live video intentionally includes the live audience. Kiswe’s turnkey solutions are sold to content rights holders and creators to help them cost-effectively create and deliver content and engage audiences around the world and at scale. With its world-class Cloud Video Engine, Kiswe has delivered the world’s largest digital pay-per-view events and serves the largest sports, media, and entertainment companies with its production, content distribution, and direct audience data solutions. The company has 65 employees. “Kiswe has unique streaming technology which provides a differentiated experience at scale. The keyword here is interactive – we are engaging audiences virtually and enabling them to contribute to events as though they were in-person. One thing we have mastered is the delivery of real-time multi-language closed captioning to serve international audiences,” the company said. “What we learned, however, is that just sending video to a consumer is half the experience. 
If a fan is watching a live event at home and cheering wildly, and nobody knows, did they make an impact?” The company added, “Part of being a fan is that your contribution matters and impacts the event. Our platform includes a huge portfolio of tools that allow fans to cheer – by hitting buttons or by contributing video, which we use to produce a huge virtual fan camera that we include in the event. We also have multiview technology with fast-switching that empowers fans to curate their own storylines. When fans contribute and see that their contribution makes a difference in the event, the experience becomes exceptional. Then they become part of the story.” VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings. The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat. All rights reserved. "
3,672
2,021
"Report: Climate tech investments surge fivefold since Paris Agreement | VentureBeat"
"https://venturebeat.com/entrepreneur/report-climate-tech-investments-surge-fivefold-since-paris-agreement"
"Report: Climate tech investments surge fivefold since Paris Agreement Ever since the Paris Agreement came into effect, climate tech investments have surged around the world, growing nearly five times from $6.6 billion in 2016 to over $32 billion in 2021. According to a study conducted by promotional agency London & Partners and Amsterdam-based database management company Dealroom, the U.S. has been at the forefront of this shift, with climate tech companies in the country drawing a total of $48 billion during the five-year period. China ($18.6B), Sweden ($5.8B), the United Kingdom ($4.3B), and France ($3.7B) continue to follow loosely behind. The figures clearly show how important companies that are applying technologies to reduce greenhouse gas emissions or impacts of climate change have become in the global economy and in the eyes of investors. 
In the U.S. and Canada, climate tech investments grew nearly six times, from $2.9 billion in 2016 to $17 billion in 2021. Meanwhile, Europe and Asia have seen nearly sevenfold and twofold growth, respectively. U.S.: The climate tech hub London & Partners and Dealroom examined over 5,100 global startups to understand the growth of the climate technology segment since the Paris Agreement — aimed at limiting global temperature rise to 1.5°C by 2050. The firms found that American climate tech startups have been more successful not only in drawing venture capital but also at scaling. As many as 43 unicorns currently operate out of Greater Los Angeles, Greater Boston, New York, and the Bay Area, which cumulatively drew close to $31 billion over five years. Plus, 61 more are on track to hit unicorn status in the near future. No other part of the world is close to having this many climate tech unicorns or this much investment. “This demonstrates both the growing demand for more sustainable technologies from consumers in the U.S. and how the U.S. is leading the way for producing solutions to tackle climate change,” Stephen Feline, director of North America at London & Partners, told VentureBeat in an email. “The U.S. has a more mature climate tech ecosystem than anywhere else in the world, and its scale-ups are producing game-changing companies in this space. We anticipate investment levels to continue to rise as a focus on climate change dominates the political landscape, business community, and the public at large — thus attracting the increasing attention of global investors,” he added. Energy, transportation remain focus Globally, the report notes, about 80% of the climate tech investment has been directed to the energy and transportation sectors, which cover companies such as Tesla and Sunnova. 
The remaining 20% is unevenly distributed among food, enterprise software, and circular economy startups. The latter includes companies working in the fashion and home living industry, as well as those operating in secondhand, refurbished product segments. Meanwhile, in London, most of the focus has been on the energy sector. The city continues to be the U.K.’s climate tech hub, with more than 400 active startups and more than sixfold investment growth over the last five years. “Home to the largest cluster of climate tech companies in Europe and a thriving ecosystem of dedicated VC funds and accelerators alongside world-class researchers and talent, London’s tech sector is coming together to tackle the global climate crisis,” Laura Citron, CEO at London & Partners, said in a statement. “London and U.S. cities like the Bay Area, Los Angeles, and Boston are amongst the leading hubs for climate tech, and this creates lots of opportunities for us to share ideas, talent, and innovation,” she added. "
3,673
2,021
"Eclipse Foundation launches open source collaboration around software-defined vehicles | VentureBeat"
"https://venturebeat.com/dev/eclipse-foundation-launches-open-source-collaboration-around-software-defined-vehicles"
"Eclipse Foundation launches open source collaboration around software-defined vehicles One of the world’s largest open source software (OSS) foundations, the Eclipse Foundation, this week announced an invitation to leaders in the technology sector to join in the commencement of a new working group initiative specifically focused on developing a new grade of open source software-defined vehicles. Alongside the Eclipse Foundation are several top industry players that are joining the foundation’s open source collaborative effort including Microsoft, Red Hat, Bosch, and others. “With digital technologies unlocking the future of accessible, sustainable, and safe transportation experiences, mobility services providers are increasingly looking to differentiate through software innovation,” said Ulrich Homann, corporate vice president and distinguished architect at Microsoft. 
“By standardizing the development, deployment, and management of software-defined vehicles through collaboration in the open-source space, businesses can bring tailored mobility solutions to their customers faster and can focus on innovations.” The Eclipse Foundation’s initiative aims to provide “individuals and organizations with a mature, scalable, and business-friendly environment for open source software collaboration and innovation,” according to the foundation’s press release. Benefits for mobility The new working group will focus entirely on building next-generation vehicles based on open source. By open-sourcing this project, the foundation is hoping to pull solutions and innovation from the best and brightest enterprises and individuals across the globe — and doing so with an eye toward creating a strong foundation for software-defined vehicles and future mobility. “The software-defined vehicle will play a key role in the future of mobility,” Christoph Hartung, president and chairman of embedded systems maker ETAS, said in a press release. “The explosive increase in complexity can only be mastered by working closely together as we do in this initiative.” The foundation is focused on fostering an environment from which to pave the way for software-defined vehicles, but it doesn’t stop there. Eclipse is also looking at how both its new working group and the innovation of software-defined vehicles can be used to create robust accessibility options for people with various disabilities and physical needs. “The transfer of personalized functionality across vehicles and brands will be eased — assume a rental car,” Sven Kappel, vice president and head of project for Bosch, told VentureBeat. 
“So, in the given hardware restraints, the needs of [an] impaired car user could be far faster retrieved and be met by a large developer base with lower implementation cost than classical vehicle architecture and software developing paradigms.” A software-defined future Software-defined vehicles have captured the attention of industry leaders, academics, and the public alike. Next-gen vehicle developers are increasingly looking to provide advanced mobility options to serve the global community, just as smart city technologies and initiatives are similarly on the rise. The benefits from this open-sourced working group can extend beyond vehicles into other industries as well, including cloud computing and manufacturing. A similar open source-focused working initiative in another industry sector could create benefits ranging from collaborative interdisciplinary solutions to ensuring thoughtful inclusion of anticipated consumer needs early on. As the automotive industry, like other sectors, continually pivots toward a software-defined future, interdisciplinary collaboration with open source technology will further enable innovation. Manufacturers and suppliers will be better equipped to leverage standards that make innovations available to more people — for the software-defined vehicle space, this means being able to bring customizable features to drivers and passengers at an accelerated rate, Homann explained to VentureBeat via email. “A global open source community can leverage a wide variety of voices, which can lead to greater participation, such as contributing tools and development principles that can enhance diversity and inclusion,” Homann said. 
By building and utilizing a strong, open foundation, vehicle manufacturers worldwide will be able to zero in on key differentiators for customers, like mobility services and end-user experience improvements, at the same time that they are saving both time and cost on the non-differentiating elements, such as operating systems, middleware, and communication protocols, Eclipse’s press release claims. “Although we have extensive roots with the automotive community, a project of this scope and scale has never been attempted before,” said Mike Milinkovich, executive director of the Eclipse Foundation. “This initiative enables participants to get in at the ‘ground level’ and ensure they each have an equal voice in this project.” The future of software-defined vehicles The Eclipse Foundation — which has reportedly fostered more than 400 open source projects to date — is eyeing the future as it attempts an open source project unmatched by any of its previous 400. It aims to create an environment that it anticipates will become “an open ecosystem for deploying, configuring, and monitoring vehicle software in a secure and safe way,” one that will assist with achieving a significant transformation for the industry at large scale. “The end goal of this project is a completely new type of automobile defined in free, open-to-anyone software that can be downloaded into an off-the-shelf chassis. Adding new features to your car will simply require a software update. An enormous first step in a new era of vehicle development,” a press release from Eclipse stated. A transportation and logistics report released in August by the market data firm Statista projects that electronic systems will account for nearly 50% of the total price of a new car by 2030. Additionally, the report claims that even before then, by 2025 about 33% of new cars sold will be operated by an electric battery. 
In fact, the report predicts that within the next decade, the rise of mobility services and autonomous vehicles will launch a revolution throughout the entire auto sector. In addition, another recent report, titled “Software-defined vehicle Research Report 2021: Architecture Trends and Industry Panorama,” points out that in order to keep up with the Joneses of the automotive industry, original equipment manufacturers (OEMs) must “open up vehicle programming to all enterprises by simplifying the development of vehicle software and increasing the frequency of updates, so as to master the ecological resources of developers.” This further underscores the Eclipse Foundation’s ultimate goal of inviting industry leaders to collaboratively build next-generation vehicles based on open source. According to the press release, Eclipse plans to create a space fueled by “transparency, vendor-neutrality, and a shared voice” in order to ensure all participants in the open source-driven project have the opportunity to shape the future of the working group — and the very future of vehicle development itself. "
3,674
2,021
"Sharpen your skills as a web developer with SitePoint's technology hub | VentureBeat"
"https://venturebeat.com/commerce/sharpen-your-skills-as-a-web-developer-with-sitepoints-technology-hub"
"Sharpen your skills as a web developer with SitePoint’s technology hub A successful web developer never stops learning. The quest for more knowledge includes remaining up-to-date on the latest trends in the technology arena. But even seasoned web developers can become overwhelmed when deciding which course or program to learn from. SitePoint’s web development hub gives users a perfectly curated technology space that includes more than 600 courses and titles on web development, design and UX, DevOps, and machine learning. Furthermore, purchasers of this deal receive at minimum a three-year membership, ensuring they stay current and on top of the latest ideas and information. Topics include JavaScript, HTML & CSS, Python, WordPress, and Workflow, among others. 
More than 100,000 members are part of the SitePoint community and are there to assist with fast feedback and discussion of new tech. There’s also a robust SitePoint Discord community filled with like-minded technology professionals eager and willing to chat about tech news. G2, the world’s largest tech marketplace, gives this product 4.5 stars, as it’s proven beneficial to many designers, programmers, product creators, and entrepreneurs. New content is added every week, so you’d be hard-pressed to ever find yourself short of content or materials. A few of the titles available to you include Your First Week with React, JavaScript Novice to Ninja, A Beginner’s Guide to Deep Learning with Keras, and CSS Master. All that’s needed to access the platform is a mobile or desktop device that meets basic specifications. For a limited time, a three-year subscription to SitePoint Web Development Hub Premium is available for $52.99, a savings of 72 percent off its MSRP. Should you prefer a lifetime membership, you can purchase one for $159.99, which is a savings of 77 percent. Seize these low-priced deals today and further add to your command of web development while connecting with others in the profession. VentureBeat Deals is a partnership between VentureBeat and StackCommerce. This post does not constitute editorial endorsement. If you have any questions about the products you see here or previous purchases, please contact StackCommerce support here. Prices subject to change. 
"
3,675
2,021
"Report: 85% of consumers rethink purchases from companies that lack focus on climate and diversity | VentureBeat"
"https://venturebeat.com/commerce/report-85-of-consumers-rethink-purchases-from-companies-that-lack-focus-on-climate-and-diversity"
"Report: 85% of consumers rethink purchases from companies that lack focus on climate and diversity According to a new global report by Exasol, 85% of consumers have changed their minds about purchasing from a company because they felt it did not do enough to properly address climate change. Fifty-four percent of corporate social responsibility (CSR) decision-makers share a similar mentality, believing companies that fail to act on “going green” and other sustainability initiatives will no longer exist in ten years. The report indicates that consumers no longer instinctively trust the words of companies from which they have previously purchased goods or services. 
Instead, businesses need to demonstrate efforts towards key initiatives before consumers reach for their wallets. A substantial majority (68%) of consumers will consider demanding data-backed evidence to prove that companies are taking meaningful steps towards addressing global warming, diversity and inclusion (DEI), as well as ethical and sustainable business practices in the next 36 months. These initiatives are also becoming influential for consumers’ decision-making, as over 86% of respondents have indicated that they will decide whether or not to do business with a company based on its credentials with climate change, DEI initiatives, and ethical and sustainable practices. The latter is also cited by 88% of consumers as a key factor when making purchases. Furthermore, there appears to be a hard deadline as to when businesses can improve their practices and credentials: 66% revealed they would cease buying from a company that didn’t have definitive plans to work on these initiatives within the next three years. Despite consumers’ increasing demand for visible corporate efforts to fight against climate change and a lack of workplace diversity, there appears to be a startling minority of businesses that have either enacted plans or hope to enact plans soon to address these issues. Only 42% of corporations, for example, have a “fully-formed roadmap” in place to ensure climate-friendly business practices are initiated within the next three years, while 31% of those with no current plans to address these issues had no strategy to develop any within the next year. VB Event The AI Impact Tour Connect with the enterprise AI community at VentureBeat’s AI Impact Tour coming to a city near you! However, there is growing acknowledgment within corporations that something must be done before customers stray, as well as a growing desire for data in order to inform corporate decisions regarding these crucial initiatives. 
Eighty-two percent of CSR decision-makers agree that better choices could be made to improve businesses’ climate change, DEI, and ethical and sustainable practices if corporations were given greater access to data-led insights, even while only 22% of CSR respondents appear to be using all of the data available to them. With these results, it is clear that data is becoming a critical resource for businesses and consumers alike as consumer culture pivots in support of various societal issues. Increased accessibility to data should become a basic requirement for many businesses in the next three years. The consumer survey was conducted among 8,056 employees of companies with over 500 employees that have CSR, ESG, or DEI programs in the U.S., U.K., Germany, China, South Africa, and Australia. The CSR decision-maker survey was conducted among 716 CSR decision-makers in the same regions and in companies of a similar size. Read the full report by Exasol. VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings. The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat. All rights reserved. "
3,676
2,021
"What the metaverse means for brand experiences | VentureBeat"
"https://venturebeat.com/business/what-the-metaverse-means-for-brand-experiences"
"What the metaverse means for brand experiences With the cloud and the edge in place and 5G networking just around the corner, innovative minds are turning to the next question: What, exactly, are we going to do with this infrastructure? Increasingly, the answer is the metaverse. Almost pulled right from the pages of science fiction, the metaverse is a real-world version of the Matrix: a fully immersive environment in which users leverage virtual reality, cryptocurrency, livestreaming, and a host of other technologies to navigate a digital world that, some say, will become as important to daily life as the real world. A work in progress At the moment, this is more of a concept than an actual platform, so the end result will likely differ significantly from what is currently on the drawing board. 
But with supporters like Facebook, Microsoft, and some of the world’s leading game platforms, the metaverse is starting to draw the interest of both the technological community and, perhaps more importantly, venture capitalists. The metaverse is best described as a 3D World Wide Web or a digital facsimile of the physical world. In this realm, users can move about, converse with other users, make purchases, hold meetings, and engage in all manner of other activities. In the metaverse, all seats at live performances are front and center, sporting events are right behind home plate or center court, and of course, all avatars remain young and beautiful — if that’s what you desire — forever. As you might imagine, this is a marketer’s dream. Anheuser-Busch InBev global head of technology and innovation Lindsey McInerney explained to Built In recently that marketing is all about getting to where the people are, and a fully immersive environment is ripe with all manner of possibilities, from targeted marketing and advertising opportunities to fully virtualized brand experiences. Already, companies like AB InBev are experimenting with metaverse-type marketing opportunities, such as virtual horse racing featuring branded NFTs. Elsewhere, Epic Games recently streamed a virtual concert featuring Ariana Grande on its Fortnite channel, while Hyundai is offering glimpses of what future mobility might look like on Roblox. At the moment, Facebook seems to be the leading advocate of the metaverse. CNN recently noted that CEO Mark Zuckerberg brought it up no less than a dozen times on a recent conference call with analysts. Zuckerberg, in fact, has gone so far as to say Facebook will transition from a mere social media company to a metaverse company, adding that he has been thinking along these lines since middle school. 
The company is said to have committed $10 billion to its Reality Labs division in order to turn the metaverse into a viable product, and it plans to hire some 10,000 engineers to pick up the pace of development. Metaversal components To get there, of course, Facebook and other proponents will have to integrate a diverse set of technologies and do so in a way that supports universal interaction across platforms. David Brebner, CEO of software development environment Umajin, lists a number of building blocks needed to establish a workable metaverse, starting with a comfortable and low-cost virtual headset, preferably one that also supports augmented reality by overlaying data and images on the physical world. Equally important is for the system to have enough juice to power realistic and even elaborate avatars. Roblox-style block figures and stilted movement won’t cut it; neither will an infinite number of Zoom-like rectangles. And data-sharing will have to happen concurrently among multiple users, something akin to a digital whiteboard room. But given all the controversy surrounding data usage and security on today’s social media platforms, is there any reason to think those problems won’t be amplified in the metaverse? Former FCC Commissioner Tom Wheeler argues that the heightened interconnectedness of the metaverse will increase threats to personal privacy and market competition while making it easier to spread misinformation. Undoubtedly, rules and regulations will have to be put in place, but if recent history is any guide, they will only come about after the metaverse is up and running, and companies like Facebook and Microsoft will more than likely play a large role in creating them. And make no mistake about it: The amount of data that can be collected from the metaverse is astronomical, and whoever controls it will be able to exploit not just individual users but entire markets. 
No matter how it evolves, however, the metaverse will likely prove to be a vibrant new marketing channel as the number and diversity of providers grows. Forward-leaning enterprises would do well to devise strategies for this new level of human interaction now to avoid having to catch up to the technology later. "
3,677
2,021
"Report: At least 25% of consumers use biometrics for online security | VentureBeat"
"https://venturebeat.com/business/report-at-least-25-of-consumers-use-biometrics-for-online-security"
"Report: At least 25% of consumers use biometrics for online security Biometrics are beginning to move mainstream for online authentication, with at least 25% of consumers now using biometrics in some capacity, according to the FIDO Alliance’s Online Authentication Barometer report. For this research, the FIDO Alliance set out to discover the latest in consumer habits, trends, and the adoption of authentication technologies across the globe. This is the first report in its Online Authentication Barometer series that will periodically review and monitor the state of online authentication in ten countries across the globe. Future releases of the barometer will be able to compare changes in behaviors and attitudes over time. 
The report found passwords still prevailed over other, more secure authentication methods, with 59% of people using them to log into a work account or computer in the last 60 days. While this is unsurprising, with passwords dominating online authentication for many years, the report did also find indicators that the tide is changing. Most notably, biometrics are gaining traction, both in perception of security and usage — 32% of consumers now believe biometrics are the most secure way to log into their online accounts, apps and devices, compared to passwords at 19%. In addition, biometrics was the second most commonly used method for login, with 28% citing it as their preferred method and 35% having used biometrics to access financial services in the last two months. Consumers are concerned about their online security, and many have taken action such as moving to biometrics (38%) and using authentication software (21%). However, of those who haven’t taken action, a third say it’s because they don’t know how. Combined with the 43% who report that they have strengthened their passwords, an action that is helpful but shouldn’t be relied upon as the only method to keep accounts safe, the barometer has overwhelmingly concluded there is a greater need for education among consumers on how to protect accounts. Read the full report by FIDO Alliance. "
3,678
2,021
"Report: 55% of execs say that SolarWinds hack hasn't affected software purchases | VentureBeat"
"https://venturebeat.com/business/report-55-of-execs-say-that-solarwinds-hack-doesnt-impact-software-purchases"
"Report: 55% of execs say that SolarWinds hack hasn’t affected software purchases According to a recent study by Venafi, more than half of executives (55%) with responsibility for both security and software development reported that the SolarWinds hack has had little or no impact on the concerns they consider when purchasing software products for their company. Additionally, 69% say their company has not increased the number of security questions they are asking software providers about the processes used to assure software security and verify code. As the one-year anniversary of the infamous SolarWinds cyberattacks approaches, it’s a great time to evaluate the changes that companies have put in place to protect against similar attacks. 
These attacks shone a light on a new set of weak spots in organizations’ security controls, especially because software developers are primarily focused on speed and innovation, not security. Attackers know this and are actively taking advantage of it. Unfortunately, Venafi’s study reveals that while executives are concerned about software supply chain attacks and are aware of the urgent need for action, the data indicates they aren’t taking the steps that will drive change. Today, every business is a software business. If companies don’t work together to make actionable plans to ensure the software that’s used is secure, everyone will remain vulnerable to attacks that target the software supply chain. Even though the risk of supply chain attacks continues to rise, many organizations have not even decided which team is responsible for improving the security of the software supply chain: developers or InfoSec professionals. Venafi’s survey evaluated the opinions of more than 1,000 IT and development professionals, including 193 executives with responsibility for both security and software development, and uncovered a glaring disconnect between executive concern and executive action. Read the full report by Venafi. "
3,679
2,021
"Medical digital twins secure COVID-19 data | VentureBeat"
"https://venturebeat.com/business/medical-digital-twins-secure-covid-19-data"
"Medical digital twins secure COVID-19 data Dell has partnered with the i2b2 tranSMART foundation to create privacy-preserving digital twins to treat the long-haul symptoms of COVID-19 patients. The project hopes to improve treatment for the 5% of COVID-19 patients who develop chronic health issues. The new tools integrate de-identified data — which refers to data from which all personally identifiable information has been removed — AI, and sophisticated models that allow researchers to perform millions of treatment simulations based on genetic background and medical history. This initiative is part of Dell’s long-term goal to bring digital transformation across the healthcare industry. 
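De-identification of the kind mentioned above is easy to picture in code. The field names and generalization rules below are illustrative assumptions (loosely modeled on HIPAA-style safe-harbor rules), not the actual Dell/i2b2 pipeline:

```python
# Toy de-identification pass. Field names and generalization choices are
# illustrative assumptions, not a compliant implementation.

DIRECT_IDENTIFIERS = {"name", "address", "phone", "email", "ssn"}

def deidentify(record: dict) -> dict:
    """Drop direct identifiers and generalize quasi-identifiers."""
    out = {}
    for key, value in record.items():
        if key in DIRECT_IDENTIFIERS:
            continue  # remove personally identifiable fields outright
        if key == "age":
            # Group very high ages into a single top bucket.
            out[key] = "90+" if value >= 90 else value
        elif key == "zip":
            out[key] = str(value)[:3] + "**"  # keep only the 3-digit prefix
        elif key == "admit_date":
            out[key] = str(value)[:4]  # keep only the year
        else:
            out[key] = value  # non-identifying clinical fields pass through
    return out

patient = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "age": 93,
    "zip": "02139",
    "admit_date": "2021-03-17",
    "diagnosis": "long COVID",
}
print(deidentify(patient))
```

The point of the sketch is that clinical content (the diagnosis) survives while everything that could re-identify the patient is either dropped or coarsened.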
Jeremy Ford, Dell vice-president of strategic giving and social innovation, told VentureBeat, “AI-driven research and digital twins will support hospitals and research centers globally and contribute to Dell’s goal to use technology and scale to advance health, education and economic opportunity for 1 billion people by 2030.” The i2b2 tranSMART foundation (Informatics for Integrating Biology at the Bedside) is an open-source, open-data community for enabling collaboration on precision medicine. The group is focused on projects to facilitate sharing and analysis of sensitive medical data in a way that benefits patients and protects privacy. The partnership between Dell and i2b2 promises to create best practices for applying privacy-enhanced computation (PEC) to medical data. i2b2 chief architect Dr. Shawn Murphy told VentureBeat that medical digital twins are essential because they enable “patients like me” comparisons across very large cohorts of similar medical twins. This will help identify things like biological markers for diseases and compare treatment options for patients who share similar features like age, gender, underlying conditions, and ethnicity. Multiple sources and types of data go into constructing the medical twins, including a patient’s Electronic Health Record (EHR), consultation information directly from the patient, and waveform data from cardiac monitors, ventilators, and personal fitness tracking devices. “They could be used in the future to help researchers perform millions of individualized treatment simulations to identify the best possible therapy option for each patient, based on genetic background, medical history, and a greater overall knowledge of the long-term treatment effects,” Murphy said. 
Privacy required for adoption of medical digital twins Privacy is a crucial requirement for the widespread adoption of medical digital twins, which require combining sensitive medical data to create the best models. “There is a significant amount of work to collect, harmonize, store and analyze the different forms of data coming from multiple locations while maintaining patient privacy and data integrity,” Murphy said. Dell is focused on providing data management hardware, software, and integration services for the project. The data enclave was designed to provide the computational, artificial intelligence, machine learning, and advanced storage capabilities needed for this work. It consists of Dell EMC PowerEdge, PowerStore and PowerScale storage systems, and VMware Workspace ONE. Researchers are still in the early days of identifying vulnerabilities in these architectures and balancing those risks against performance and workflow bottlenecks. With secure enclaves, sensitive data from various sources is encrypted in transit to a secured server, decrypted, and processed together. Of all PEC technologies, this approach offers the best performance and the most streamlined workflows, but it also requires extensive security analysis because the data is processed in the clear. Other PEC approaches, such as homomorphic computing, can process encrypted data but run much slower and are more challenging to integrate. Murphy said additional infrastructure would be required to support new locations and expand the data pool to include research centers in minority institutions and hospitals outside the U.S. “This is particularly critical for the full representation of diversity in digital twins,” he said. 
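To make the PEC tradeoffs concrete, here is a toy sketch of one simple PEC technique, additive secret sharing. It is not one of the approaches named in the article, but it illustrates the general idea: each hospital splits its value into random shares so that no single server ever sees a raw count, yet the true total is still recoverable:

```python
import random

MODULUS = 2**61 - 1  # all arithmetic is done modulo a large prime

def split_into_shares(value: int, n_servers: int) -> list[int]:
    """Split `value` into n random shares that sum to it mod MODULUS."""
    shares = [random.randrange(MODULUS) for _ in range(n_servers - 1)]
    last = (value - sum(shares)) % MODULUS
    return shares + [last]

def aggregate(per_server_shares: list[list[int]]) -> int:
    """Each server sums only the shares it holds; combining the per-server
    totals recovers the true sum without exposing any single input."""
    server_totals = [sum(col) % MODULUS for col in per_server_shares]
    return sum(server_totals) % MODULUS

# Three hospitals each report a patient count; two servers hold shares.
counts = [120, 45, 210]
shares = [split_into_shares(c, 2) for c in counts]
# Transpose: server i receives the i-th share from every hospital.
per_server = [list(col) for col in zip(*shares)]
print(aggregate(per_server))  # 375 — the total, with no raw count revealed
```

Like homomorphic computing, this protects raw values at the cost of extra coordination; like enclaves, it requires trusting that the servers do not collude.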
Building a common language The digital twin research started with the creation of the 4CE Consortium, an international coalition of more than 200 hospitals and research centers including data collaboratives across the U.S., France, Germany, Italy, Singapore, Spain, Brazil, India, and the United Kingdom. The 4CE consortium brings together all the sources and types of data to create a ‘common language’ to enable comparisons between different sample populations. This allows comparing medical digital twins that share similar biological markers to see what therapies work most effectively for other patients in the real world. In theory, researchers should be able to pull in data from the EHR, which is designed to manage all the medical history, including treatment options, medical appointments, diagnostic tests, and resulting treatments and prescriptions. However, in practice, Murphy said EHRs are prone to inaccuracies and missing information. For example, in the U.S., the code for rheumatoid arthritis is used in error four out of ten times when the code for osteoarthritis should be used. “This is why we need to aggregate multiple sources and types of data which in unison will spell out the condition of the patient,” Murphy explained. The real value of the EHR comes when combined with real-world patient interviews and other forms of data to create medical digital twins and drive population-level insights. The technology used to understand long-term COVID-19 symptoms can also help create high-resolution, disease-specific medical digital twins that can be used by physicians and researchers for many other applications in the healthcare system. 
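In practice, a "common language" of this kind means mapping each site's local codes onto one shared vocabulary before pooling cohorts. A minimal sketch; the site names, code systems, and mappings below are invented for illustration, not 4CE's actual terminology:

```python
from collections import Counter

# Hypothetical per-site mappings from local diagnosis codes to one
# shared concept vocabulary (illustrative, not real 4CE mappings).
SITE_CODE_MAPS = {
    "boston": {"714.0": "rheumatoid_arthritis", "715.9": "osteoarthritis"},
    "paris": {"M05": "rheumatoid_arthritis", "M19": "osteoarthritis"},
}

def harmonize(site: str, local_codes: list[str]) -> list[str]:
    """Translate a site's local codes into the shared vocabulary,
    dropping codes the mapping does not cover."""
    mapping = SITE_CODE_MAPS[site]
    return [mapping[c] for c in local_codes if c in mapping]

# Pool two sites' records into one cross-site cohort count.
pooled = Counter()
pooled.update(harmonize("boston", ["714.0", "715.9", "999.9"]))
pooled.update(harmonize("paris", ["M05", "M19", "M19"]))
print(pooled)
```

Once every site speaks the shared vocabulary, "patients like me" cohorts can be counted and compared across institutions that never agreed on a coding system.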
"
3,680
2,021
"How to go hybrid: The right recipe for mixing on-prem and cloud computing | VentureBeat"
"https://venturebeat.com/business/how-to-go-hybrid-the-right-recipe-for-mixing-on-prem-and-cloud-computing"
"Guest How to go hybrid: The right recipe for mixing on-prem and cloud computing When cloud computing first became widely available, the advantages were clear: ease of use, unlimited capacity, availability, and flexible pricing. Suddenly, enterprises didn’t have to worry about capacity planning or the cumbersome and expensive process of setting up data centers. Cloud vendors empowered organizations to focus on building their products and their core business instead of setting up and maintaining costly infrastructure, allowing them to scale at an unprecedented pace. These advantages led to the rapid migration of many enterprises, startups, and new businesses, from on-premises to the cloud. With time, however, the costs associated with relying totally on the cloud gave rise to new problems. Overprovisioning of cloud resources is one of them. 
Overprovisioning is a standard practice from the days of on-prem computing that has made the leap to the cloud, primarily as a result of companies employing “lift and shift” strategies for migrating, and it has sent cloud costs skyrocketing. As new companies got off the ground and looked to begin achieving profitability, budget-conscious executives naturally scrutinized the hefty sums devoted to cloud spending and began asking whether all that money was generating sufficient ROI. Dropbox’s decision to abandon AWS and build its own network of data centers, which famously saved the company $75 million in just two years, continues to resonate. But not every company is Dropbox, and on-prem carries its own costs and complexities. As companies globally began to prioritize reducing infrastructure costs, many began to contemplate the advantages of the hybrid model. This approach, which relies on on-prem infrastructure and uses the cloud to scale out for peak traffic, is poised to strike the perfect balance. Two industry trends are making this possible: managed services offerings launched by cloud vendors, such as AWS Outposts, Azure Arc, and Google Cloud’s Anthos, and the dynamic auto-scaling these services allow when necessary. While it would be tempting to view this model as the best of both worlds — the ability to both run off on-prem infrastructure and benefit from cloud resources — and an easy decision to make, hybrid deployment comes with its own unique considerations and challenges. Here are some key points to consider before taking on a hybrid approach for your company. 1. Determine strategy and priorities Not every enterprise needs to bring its computing infrastructure in-house, but companies that have flagged cost reduction and margin growth as strategic priorities should explore doing so. 
Likewise, enterprises that are less concerned with their margins at the moment and aim to scale and rapidly increase their market share can comfortably stay fully in the cloud to maintain a greater degree of flexibility. 2. Prepare for the realities of on-prem maintenance A hybrid strategy means a return to some of the complexities of on-prem infrastructure and management that enterprises left behind when they blasted off to the cloud. These challenges should by no means prevent a company from going hybrid, but they do require a carefully planned on-premises strategy. The capacity to take on that challenge may be influenced by your institutional memory for self-management. It is essential to have staff who know how to manage data centers, procure servers, and so on. Enterprises returning to on-prem after only a year or two away will be at a clear advantage — they may even still own their facility. Cloud-native organizations will be starting on that journey from scratch and would benefit from bringing in people familiar with self-managed infrastructure. 3. Know, and constantly re-evaluate, your triggers A hybrid approach is all about planning for the usage threshold at which you scale your application out to the cloud. That necessitates effective capacity prediction. A general rule of thumb would be to plan your on-prem for average traffic, not peak, and scale-out to the cloud when experiencing peak traffic. 4. Get comfortable with fragmentation Scaling out from your metal to the cloud and back again is no mean feat from an infrastructure perspective. But it can also stress and expose aspects of your application itself. If parts of your app are simultaneously running in multiple areas, you have to ensure that your data and code base are uniform across each. To put it plainly, think about the cloud as simply an additional data center — you’ll need to guarantee constant data updates to ensure consistency. 5. 
Consider the lock-in It may seem counterintuitive, but using the cloud on top of data centers purely to service excess demand actually comes with its own kind of hook. Instead of having the option to mix and match data center and cloud vendors, the major cloud vendors’ managed services solutions for running in data centers scale out exclusively to their own clouds. The vendor choices for hybrid are thus the same as when choosing a pure-cloud vendor. Consider whether you can be tied to a particular API and how large the ecosystem is. Identify the specific features that are priorities for your business, as some are only available from certain providers. 6. Let location guide you Faster 5G speeds for consumers may reduce the need for enterprises to keep complex networks of regional services. But the value received from doing so depends on your user footprint, as well as 5G’s roll-out roadmap. In theory, if you serve customers only in the US, enhanced 5G speeds may allow you to jettison those “east” and “west” cloud regions, and instead consolidate onto a single US data center. But a distributed customer base may still demand multi-regional power that the cloud is best positioned to provide, while 5G’s still unrealized rollout means uncertainty abounds – and will for several years to come. Hybrid: The next frontier Over the next five years, hybrid infrastructure deployments are likely to become increasingly commonplace as many businesses reach a point at which cost-saving becomes imperative and the barriers to entry continue to break down. As cloud vendors’ managed-services APIs become more user-friendly, hybrid will emerge as a dominant go-to solution. Asaf Ezra is CEO and Co-Founder of Granulate. VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings. 
The AI Impact Tour Join us for an evening full of networking and insights at VentureBeat's AI Impact Tour, coming to San Francisco, New York, and Los Angeles! VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat. All rights reserved. "
3,681
2,021
"Facebook stops just short of rebranding to 'The Web' | VentureBeat"
"https://venturebeat.com/business/facebook-stops-just-short-of-rebranding-to-the-web"
"Opinion Facebook stops just short of rebranding to ‘The Web’ Facebook changed its corporate name to Meta today. Facebook.com is still Facebook.com, though. This is a move to embrace the metaverse, which is a concept that many tech-industry executives and investors view as the future of the internet. But it also feels like a poorly timed attempt to capitalize on a buzzword that has already lost much of its meaning. Aspirationally, the metaverse is an alternate reality that lives online. Functionally, however, the metaverse is a platform or series of platforms that share interoperable code to enable individuals to carry their online identities and (more importantly) their purchases from one experience to the next. This concept is popular with investors, which is one reason why Facebook is planting its flag so firmly into the space. 
Well, that and because Facebook has poisoned its name with regular scandals. But metaverse isn’t a buzzword because it is the future. It’s popular because it is already making money in the present. The temptation for many industry observers is to point to VR apps as the obvious future of the metaverse. Meanwhile, massive connected platforms like Roblox are building giant, connected worlds. For a new generation of internet users, Roblox already provides many of the key elements that metaverse latecomers like Zuckerberg are still trying to figure out. Roblox enables you to carry your character and digital wallet from one game to the next. Your cosmetics carry over, so you can always express yourself through your purchases. And the platform has no shortage of content because Roblox already has a robust user-generated-content system that is regularly producing new hit experiences. Meta is already an outdated name What is especially odd about Facebook’s new name is that it already feels old. While investors and executives love to throw around “metaverse” in earnings calls and conference panels, the public seems generally less excited. That lack of enthusiasm is likely due to an absence of vision. People don’t care how they fit into some tech executive’s online petting zoo. They care about what the technology does for their specific needs. For adults, how do these tools enable new opportunities for commerce? Wider audiences, meanwhile, look to platforms to empower their social connections. And those communities are finding answers to those needs on platforms like Roblox, Fortnite, and VRChat without ever even learning the term “metaverse” in the first place. Professionals who make content for Roblox have begun saying the term more frequently, but that’s only because it increases attractiveness to investors. 
By the time Meta figures out its metaverse business model, few people will use that word to describe any of this. It’ll sound strange in the same way “World Wide Web” does today. Meta would be like Microsoft changing its name to The Web in 1999 — only, somehow, even worse. GamesBeat's creed when covering the game industry is "where passion meets business." What does this mean? We want to tell you how the news matters to you -- not just as a decision-maker at a game studio, but also as a fan of games. Whether you read our articles, listen to our podcasts, or watch our videos, GamesBeat will help you learn about the industry and enjoy engaging with it. Discover our Briefings. "
3,682
2,021
"Cnvrg.io develops on-demand service to run AI workloads across infrastructures | VentureBeat"
"https://venturebeat.com/business/cnvrg-io-develops-service-to-run-ai-workloads-across-infrastructures"
"Cnvrg.io develops on-demand service to run AI workloads across infrastructures Cnvrg.io has launched a Metacloud service to help AI developers run their workloads across any mainstream infrastructure. Cnvrg.io, the Intel-owned company that offers a platform to help data scientists build and deploy machine learning applications, has opened early access to a new managed service called Cnvrg.io Metacloud. The offering, as the company explains, gives AI developers the flexibility to run, test, and deploy AI and machine learning (ML) workloads on a mix of mainstream infrastructure and hardware choices, even within the same AI/ML workflow or pipeline. Cnvrg.io Metacloud: Flexibility for AI developers AI experts can often find themselves struggling to scale their projects due to the limitations of the cloud or on-premise infrastructure in use. 
They do get the option to switch to a new environment, but that means re-instrumenting a completely new stack as well as spending a lot of cash and time. This eventually keeps most users locked in to a single vendor, making it a major obstacle to scaling and operationalizing AI. Cnvrg.io Metacloud tackles this challenge with a flexible software-as-a-service (SaaS) interface, where developers can pick cloud or on-premise compute resources and storage services of their choice to match the demand of their AI/ML workloads. The solution has been designed using cloud-native technologies such as containers and Kubernetes, which enables developers to pick any infrastructure provider from a partner menu to run their project. All users need to do is create an account, select the AI/ML infrastructure (any public cloud, on-premise, co-located, dev cloud, pre-release hardware, and more), and run the workload, the company said. Plus, since there is no commercial commitment, developers can always change to a different infrastructure to meet growing project demands or budget constraints. The current list of supported providers includes Intel, AWS, Azure, GCP, Dell, Red Hat, VMware, and Seagate. “AI has yet to meet its ultimate potential by overcoming all the operational complexities. The future of machine learning is dependent on the ability to deliver models seamlessly using the best infrastructure available,” Yochay Ettun, CEO and cofounder of Cnvrg.io, said in a statement. “Cnvrg.io Metacloud is built to give flexibility and choice to AI developers to enable successful development of AI instead of limiting them, so enterprises can realize the full benefits of machine learning sooner,” he added. 
Cnvrg.io Metacloud will be provided as part of the Cnvrg.io full-stack machine learning operating system, designed to help developers build and deploy machine learning models. The early access version of the solution can be accessed upon request via the company website. Notably, this is the first major announcement from Cnvrg.io since its acquisition by Intel in 2020. Prior to the deal, the company had raised about $8 million from multiple investors, including Hanaco Venture Capital and Jerusalem Venture Partners. "
3,683
2,021
"AT&T and H2O collab on feature store for AI developers | VentureBeat"
"https://venturebeat.com/business/att-h2o-collab-on-feature-store-for-ai-developers"
"AT&T and H2O collab on feature store for AI developers Telecom giant AT&T and Mountain View, California-headquartered AI cloud company H2O have jointly launched an artificial intelligence feature store for enterprises. The repository, available as a paid software platform, enables data scientists, developers, and engineers to discover, share, and reuse machine learning features to speed up their AI project deployments. According to the companies, it can be used for work on personalization and recommendation engines, forecasting models, and the optimization of dynamic pricing, supply chain, and logistics and transportation. The development comes as more and more organizations turn to AI implementation to generate actionable insights and predictions from a huge trove of data. 
AT&T originally had this solution in production use for network optimization, fraud prevention, tax calculations, and predictive maintenance, among other things. Features in machine learning When it comes to building machine learning models, data is of the utmost importance. However, raw data is not the key to a well-performing algorithm. The information gathered first must be cleaned and enriched with features — individual independent variables or characteristics that act as the input for the system. The quality of these features defines the quality of the model as well as the accuracy of its predictions. Typically, AI experts apply domain knowledge of the data and engineering tools to extract and create features. The entire process takes months and has to be repeated for every new AI project (adding to the cost), even if it is under the same organization. A feature store strives to solve this challenge by serving as the home for commonly used features. With this solution, experts can create new features for a project and then add them to the store, ensuring they can be reused if required at a later stage. Databricks, Tecton, Molecula, Hopsworks, Splice Machine, and Amazon Web Services (AWS) are the leading players that offer feature stores to accelerate MLOps. Additional capabilities The latest offering from H2O and AT&T not only performs the core function of a feature store, but also comes equipped with multiple additional capabilities. The store offers integration with multiple data and machine learning pipelines, which can be applied to an on-premise data lake or by leveraging cloud and SaaS providers. It is integrated with Snowflake, Databricks, Apache Spark, H2O Sparkling Water, Python, Scala, and Java. 
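The register-once, reuse-everywhere pattern described above can be sketched as a minimal in-memory feature store. This is purely illustrative; the class and method names are hypothetical and not the actual AT&T/H2O API:

```python
# Minimal feature store sketch: a feature's computation is registered once,
# then any project can retrieve it, with results cached for reuse.
class FeatureStore:
    def __init__(self):
        self._features = {}   # feature name -> computation function
        self._cache = {}      # (feature name, record id) -> computed value

    def register(self, name, fn):
        """Add a reusable feature definition to the store."""
        self._features[name] = fn

    def get(self, name, record):
        """Compute (or reuse a cached) feature value for a record."""
        key = (name, record["id"])
        if key not in self._cache:
            self._cache[key] = self._features[name](record)
        return self._cache[key]


store = FeatureStore()
# One team defines the feature once...
store.register("spend_per_call", lambda r: r["total_spend"] / r["num_calls"])
# ...and any later project reuses it instead of re-engineering it.
customer = {"id": 42, "total_spend": 120.0, "num_calls": 4}
print(store.get("spend_per_call", customer))  # 30.0
```

Because computed values are cached, a second project asking for the same feature on the same record reuses the stored result instead of re-running the engineering work, which is the repeated cost the article says feature stores eliminate.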
Then, the solution also offers an automatic recommendation engine that learns over time and recommends new features and feature updates to improve the AI model performance of the user. This way, data scientists can simply review the suggested updates and accept the recommendations best suited for their model. FeatureRank scores all the features on the store based on their popularity and value. “We are building AI right into the feature store and have taken an open, modular, and scalable approach to tightly integrate into the diverse feature engineering pipelines while preserving sub-millisecond latencies needed to react to fast-changing business conditions,” Sri Ambati, CEO and founder of H2O.ai, said in a statement. The company says any firm engaged in AI development can adopt its store, starting from financial services and health organizations to pharmaceutical makers, retail, and software developers. “Feature stores are one of the hottest areas of AI development right now because being able to reuse and repurpose data engineering tools is critical as those tools become increasingly complex and expensive to build,” Andy Markus, chief data officer at AT&T, said. “With our expertise in managing and analyzing huge data flows, combined with H2O.ai’s deep AI expertise, we understand what business customers are looking for in this space and our feature store offering meets this need,” Markus added. According to a PwC study , AI will add $15.7 trillion to the global economy by 2030. Leading this growth are China and North America, which will drive the greatest economic gains at $10.7 trillion. "
3,684
2,021
"Activ Surgical harnesses AI and machine learning to collaborate with surgeons | VentureBeat"
"https://venturebeat.com/business/activ-surgical-harnesses-ai-and-machine-learning-to-collaborate-with-surgeons"
"Activ Surgical harnesses AI and machine learning to collaborate with surgeons In 2016, Dr. Peter Kim, founder of Activ Surgical, a digital surgery company, demonstrated a proof of concept of fully autonomous robotic surgery on soft tissue, suturing, or stitching up a wound. Since then, Activ Surgical has been working on harnessing machine learning, augmented reality, and other advanced technologies to develop new ways of collaborating with surgeons. “We want to keep surgeons in the loop, to give them more data than they ever had before,” says CEO Todd Usen. Usen likens Activ Surgical’s work in surgery to crossing goalposts in the drive toward autonomous driving: It might take a while to get there, but the industry is ramping up systematically. Parking assists and side-view mirrors that alert drivers to vehicles in a blind spot are examples of rungs on the road toward autonomy. 
Such collaborations with the driver are similar to how Activ Surgical envisions its role with surgeons today. An example of such a collaboration: an FDA 510(k)-approved hardware component called ActivSight, which fits between an endoscope and a camera system and helps surgeons in real time as they perform operations. Activ Surgical is betting that giving surgeons a clearer picture of blood flow, known in medical terms as perfusion, might improve outcomes in laparoscopic surgeries. Take the case of an anastomotic leak in colorectal procedures, which can be devastating. It occurs because of inadequate blood flow at a surgical site that has been cut and sewn back together. Pressing a button labeled “Perfusion View” gives a clearer picture of blood flow — or lack thereof. The vessels with blood flow cut off show up in different colors. Typically, surgeons use dyes to track perfusion, but there is a time lag between injection of dye and when results can actually be seen. Usen says that using ActivSight allows for real-time visualization of perfusion without any dyes. The Perfusion tool is the first “Insight” delivered through the company’s augmented reality-based software suite, ActivInsights. Usen expects it to be the first of many coming down the pike, feeding on a foundation of annotated imaging data from actual surgeries. Pressing machine learning into service “We’re allowing surgeons all over the world to collaborate, to identify and label data so our machine learning algorithms can take over and route the results to all users,” Usen says. In effect, surgeons help build a repository of annotated anatomy components on the company’s ActivEdge platform. Activ Surgical can use machine learning on this foundation to train data models and develop future insights. 
Identifying veins and arteries, nerves, and lymph nodes without dyes is among the many possibilities. “Just like over time autonomous cars can identify a red octagonal sign and know to eventually stop, we need to identify all critical organs as critical organs even though an individual’s organs might look different from another,” Usen says. Driven by machine learning, Activ Surgical’s autonomous annotation model can annotate a one-hour surgical video in just about 10 minutes. That’s about 20,000 times faster than if it were all done by humans. The ActivInsights software suite uses AR to overlay critical information such as blood flow over actual imagery of the organs. A TaaS model of delivery Activ Surgical packages the ActivInsights software suite and the ActivSight hardware together. Hospitals can subscribe to the package through a technology-as-a-service (TaaS) model and access new insights as they roll in. The market for laparoscopic surgeries — the kind where ActivSight can be pressed into service — is sizable, with more than 13 million such procedures conducted annually around the world. The surgical robotics market is also booming; it is expected to grow at a compounded annual rate of 44% until 2028. Investors are betting on Activ Surgical’s digital surgery approach: In late September, the company announced a $45 million series B financing round. It plans to use part of the new capital to develop its first ML-based insights. "
3,685
2,021
"A call for increased visual representation and diversity in robotics | VentureBeat"
"https://venturebeat.com/business/a-call-for-increased-visual-representation-and-diversity-in-robotics"
"A call for increased visual representation and diversity in robotics If a picture is worth a thousand words, then we can save a forest's worth of outreach, diversity, and equity work, simply by showing people what women in robotics really do. Sometimes it’s the obvious things that are overlooked. Why aren’t there pictures of women building robots on the internet? Or if they are there, why can’t we find them when we search? I have spent decades doing outreach activities, providing STEM opportunities, and doing women in robotics speaker or networking events. So I’ve done a lot of image searches looking for a representative picture. I have scrolled through page after page of search results ranging from useless to downright insulting every single time. Finally, I counted. 
Above: Graph: Image search results via Google showing results of what comes up when the term “woman building robot” is searched. My impressions were correct. The majority of the images you find when you look for ‘woman building robot’ are of female robots. This is not what happens if you search for ‘building robot’, or ‘man building robot’. That’s the insulting part, that this misrepresentation and misclassification hasn’t been challenged or fixed. Sophia the robot, or the ScarJo bot, or a sexbot has a much greater impact on the internet than women doing real robotics. What if male roboticists were confronted with pictures of robotic dildos whenever they searched for images of their work? Above: Example of image results from Andra Keay’s Google search for ‘women building robots’ The number of women in the robotics industry is hard to gauge. Best estimates are 5% in most locations, perhaps 10% in some areas. It is slowly increasing, but then the robotics industry is also in a period of rapid growth and everyone is struggling to hire. To my mind, the biggest wasted opportunity for a young robotics company growing like Topsy is to depend on the friends of founders network when it leads to homogenous hiring practices. The sooner you incorporate diversity, the easier it will be for you to scale and attract talent. Event GamesBeat at the Game Awards We invite you to join us in LA for GamesBeat at the Game Awards event this December 7. Reserve your spot now as space is limited! For a larger robotics company, the biggest wasted opportunity is not fixing retention. Across the board in the tech industry, retention rates for women and underrepresented minorities are much worse than for pale males. That means that you are doing something wrong. Why not seriously address the complaints of the workers who leave you? Otherwise, you’ll never retain diverse hires, no matter how much money you throw at acquiring them. 
The money wasted in talent acquisition when you have poor retention should instead be used to improve childcare, or flexible work hours, or support for affinity groups, or to fire the creep that everyone complains about, or restructure so that you increase the number of female and minority managers. The upper echelons are echoing with the absence of diversity. On the plus side, the number of pictures of girls building robots has definitely increased in the last ten years. As my own children have grown, I’ve seen more and more images showing girls building robots. But with two daughters now leaving college, I’ve had to tell them that robotics is not one of the female-friendly career paths (if any of them are), unless they are super passionate about it. Medicine, law, or data analytics might be better domains for their talents. As an industry, we can’t afford to lose bright young women. We can’t afford to lose talented older women. We can’t afford to overlook minority hires. The robotics industry is entering exponential growth. Capital is in abundance, market opportunities are in abundance. Talent is scarce. These days, I’m focused on supporting professional women in the robotics community, industry, or academia. These are women who are doing critical research and building cutting-edge robots. What do solutions look like for them? Our wonderful annual Ada Lovelace Day list hosted on Robohub has increased the awareness of many ‘new’ faces in robotics. But we have been forced to use profile pictures, primarily because that’s what is available. That’s also the tradition for profile pieces about the work that women do in robotics. The focus is on the woman, not the woman building or programming, or testing the robot. That means that the images are not quite right as role models. Above: Further examples from Andra Keay’s image search results that better represented females in robotics A real role model shows you the way forward. And that the future is in your hands. 
The Civil Rights activist Marian Wright Edelman said, “You can’t be what you can’t see.” Above: A set of images from Andra Keay’s search results displaying the few good images found in the search more accurately representing women working in robotics. So Women in Robotics has launched a photo challenge. Our goal is to see more than 3 images of real women building robots in the top 100 search results. Our stretch goal is to see more images of women building robots than there are of female robots in the top 100 search results! Take great photos following these guidelines, hashtag your images #womeninrobotics #photochallenge #ibuildrobots, and upload them to Wikimedia with a creative commons license so that we can all use them. We’ll share them on the Women in Robotics organization website, too. Above: Andra Keay’s guidelines for what makes a great, accurate, and realistic photo representing women in robotics. Hey, we’d also love mentions of Women in Robotics in any citable fashion! Wikipedia won’t let us have a page because we don’t have third-party references, and sadly, mentions of our Ada Lovelace Day lists by other organizations have not credited us. We are now an official 501c3 organization, registered in the US, with the mission of supporting women and non-binary people who work in robotics, or who are interested in working in robotics. Above: Additional details of the women in robotics photo challenge, with an additional example and a call for submissions to [email protected]. If a picture is worth a thousand words, then we can save a forest’s worth of outreach, diversity, and equity work, simply by showing people what women in robotics really do. 
"
3,686
2,021
"The untapped potential of HPC + graph computing | VentureBeat"
"https://venturebeat.com/ai/the-untapped-potential-of-hpc-graph-computing"
"The untapped potential of HPC + graph computing In the past few years, AI has crossed the threshold from hype to reality. Today, with unstructured data growing by 23% annually in an average organization, the combination of knowledge graphs and high performance computing (HPC) is enabling organizations to exploit AI on massive datasets. Full disclosure: Before I talk about how critical graph computing + HPC is going to be, I should tell you that I’m CEO of a graph computing, AI and analytics company, so I certainly have a vested interest and perspective here. But I’ll also tell you that our company is one of many in this space — DGraph, MemGraph, TigerGraph, Neo4j, Amazon Neptune, and Microsoft’s CosmosDB, for example, all use some form of HPC + graph computing. 
And there are many other graph companies and open-source graph options, including OrientDB, Titan, ArangoDB, Nebula Graph, and JanusGraph. So there’s a bigger movement here, and it’s one you’ll want to know about. Knowledge graphs organize data from seemingly disparate sources to highlight relationships between entities. While knowledge graphs themselves are not new (Facebook, Amazon, and Google have invested a lot of money over the years in knowledge graphs that can understand user intents and preferences), their coupling with HPC gives organizations the ability to understand anomalies and other patterns in data at unparalleled rates of scale and speed. There are two main reasons for this. First, graphs can be very large: Data sizes of 10-100TB are not uncommon. Organizations today may have graphs with billions of nodes and hundreds of billions of edges. In addition, nodes and edges can have a lot of property data associated with them. Using HPC techniques, a knowledge graph can be sharded across the machines of a large cluster and processed in parallel. The second reason HPC techniques are essential for large-scale computing on graphs is the need for fast analytics and inference in many application domains. One of the earliest use cases I encountered was with the Defense Advanced Research Projects Agency (DARPA), which first used knowledge graphs enhanced by HPC for real-time intrusion detection in their computer networks. This application entailed constructing a particular kind of knowledge graph called an interaction graph, which was then analyzed using machine learning algorithms to identify anomalies. Given that cyberattacks can go undetected for months (hackers in the recent SolarWinds breach lurked for at least nine months), the need for suspicious patterns to be pinpointed immediately is evident. 
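As a toy illustration of that interaction-graph idea in plain Python (a deliberately naive fan-in heuristic and made-up host names, not the machine learning pipeline DARPA actually used):

```python
from collections import Counter

# Toy interaction graph: (source, destination) connection events.
edges = [
    ("host_a", "host_b"), ("host_c", "host_b"),
    ("host_d", "host_b"), ("host_b", "host_e"),
]

# Count inbound connections per node; unusually high fan-in is one
# simple anomaly signal worth surfacing for review.
fan_in = Counter(dst for _, dst in edges)
threshold = 2
suspicious = [host for host, n in fan_in.items() if n > threshold]
print(suspicious)  # ['host_b']: many sources funnel into one node
```

A real system replaces the threshold with learned models and runs the aggregation sharded across a cluster, but the graph-shaped question being asked is the same.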
Today, I’m seeing a number of other fast-growing use cases emerge that are highly relevant and compelling for data scientists, including the following. Financial services — fraud, risk management and customer 360 Digital payments are gaining more and more traction — more than three-quarters of people in the US use some form of digital payments. However, the amount of fraudulent activity is growing as well. Last year the dollar amount of attempted fraud grew 35%. Many financial institutions still rely on rules-based systems, which fraudsters can bypass relatively easily. Even those institutions that do rely on AI techniques can typically analyze only the data collected in a short period of time due to the large number of transactions happening every day. Current mitigation measures therefore lack a global view of the data and fail to adequately address the growing financial fraud problem. A high-performance graph computing platform can efficiently ingest data corresponding to billions of transactions through a cluster of machines, and then run a sophisticated pipeline of graph analytics such as centrality metrics and graph AI algorithms for tasks like clustering and node classification, often using Graph Neural Networks (GNN) to generate vector space representations for the entities in the graph. These enable the system to identify fraudulent behaviors and money-laundering activities more robustly. GNN computations are very floating-point intensive and can be sped up by exploiting tensor computation accelerators. Secondly, HPC and knowledge graphs coupled with graph AI are essential to conduct risk assessment and monitoring, which has become more challenging with the escalating size and complexity of interconnected global financial markets. 
Risk management systems built on traditional relational databases are inadequately equipped to identify hidden risks across a vast pool of transactions, accounts, and users because they often ignore relationships among entities. In contrast, a graph AI solution learns from the connectivity data and not only identifies risks more accurately but also explains why they are considered risks. It is essential that the solution leverage HPC to reveal the risks in a timely manner before they turn more serious. Finally, a financial services organization can aggregate various customer touchpoints and integrate this into a consolidated, 360-degree view of the customer journey. With millions of disparate transactions and interactions by end users — and across different bank branches — financial services institutions can evolve their customer engagement strategies, better identify credit risk, personalize product offerings, and implement retention strategies. Pharmaceutical industry — accelerating drug discovery and precision medicine Between 2009 and 2018, U.S. biopharmaceutical companies spent about $1 billion to bring new drugs to market. A significant fraction of that money is wasted in exploring potential treatments in the laboratory that ultimately do not pan out. As a result, it can take 12 years or more to complete the drug discovery and development process. In particular, the COVID-19 pandemic has thrust the importance of cost-effective and swift drug discovery into the spotlight. A high-performance graph computing platform can enable researchers in bioinformatics and cheminformatics to store, query, mine, and develop AI models using heterogeneous data sources to reveal breakthrough insights faster. Timely and actionable insights can not only save money and resources but also save human lives. 
Challenges in this data and AI-fueled drug discovery have centered on three main factors — the difficulty of ingesting and integrating complex networks of biological data, the struggle to contextualize relations within this data, and the complications in extracting insights across the sheer volume of data in a scalable way. As in the financial sector, HPC is essential to solving these problems in a reasonable time frame. The main use cases under active investigation at all major pharmaceutical companies include drug hypothesis generation and precision medicine for cancer treatment, using heterogeneous data sources such as bioinformatics and cheminformatic knowledge graphs along with gene expression, imaging, patient clinical data, and epidemiological information to train graph AI models. While there are many algorithms to solve these problems, one popular approach is to use Graph Convolutional Networks (GCN) to embed the nodes in a high-dimensional space, and then use the geometry in that space to solve problems like link prediction and node classification. Another important aspect is the explainability of graph AI models. AI models cannot be treated as black boxes in the pharmaceutical industry, as actions can have dire consequences. Cutting-edge explainability methods such as GNNExplainer and Guided Gradient (GGD) are very compute-intensive and therefore require high-performance graph computing platforms. The bottom line Graph technologies are becoming more prevalent, and organizations and industries are learning how to make the most of them effectively. While there are several approaches to using knowledge graphs, pairing them with high performance computing is transforming this space and equipping data scientists with the tools to take full advantage of corporate data. Keshav Pingali is CEO and co-founder of Katana Graph, a high-performance graph intelligence company. 
He holds the W.A. “Tex” Moncrief Chair of Computing at the University of Texas at Austin, is a Fellow of the ACM, IEEE, and AAAS, and is a Foreign Member of the Academia Europaea. DataDecisionMakers Welcome to the VentureBeat community! DataDecisionMakers is where experts, including the technical people doing data work, can share data-related insights and innovation. If you want to read about cutting-edge ideas and up-to-date information, best practices, and the future of data and data tech, join us at DataDecisionMakers. You might even consider contributing an article of your own! "
3,687
2,021
"New deep reinforcement learning technique helps AI to evolve | VentureBeat"
"https://venturebeat.com/ai/new-deep-reinforcement-learning-technique-helps-ai-to-evolve"
"New deep reinforcement learning technique helps AI to evolve Hundreds of millions of years of evolution have produced a variety of life-forms, each intelligent in its own fashion. Each species has evolved to develop innate skills, learning capacities, and a physical form that ensures survival in its environment. But despite being inspired by nature and evolution, the field of artificial intelligence has largely focused on creating the elements of intelligence separately and fusing them together after the development process. While this approach has yielded great results, it has also limited the flexibility of AI agents in some of the basic skills found in even the simplest life-forms. In a new paper published in the scientific journal Nature, AI researchers at Stanford University present a new technique that can help take steps toward overcoming some of these limits. 
Called “deep evolutionary reinforcement learning,” or DERL, the new technique uses a complex virtual environment and reinforcement learning to create virtual agents that can evolve both in their physical structure and learning capacities. The findings can have important implications for the future of AI and robotics research. Evolution is hard to simulate In nature, the body and brain evolve together. Across many generations, every animal species has gone through countless cycles of mutation to grow limbs, organs, and a nervous system to support the functions it needs in its environment. Mosquitos are equipped with thermal vision to spot body heat. Bats have wings to fly and an echolocation apparatus to navigate dark spaces. Sea turtles have flippers to swim with and a magnetic field detector system to travel very long distances. Humans have an upright posture that frees their arms and lets them see the far horizon, hands and nimble fingers that can manipulate objects, and a brain that makes them the best social creatures and problem solvers on the planet. Interestingly, all these species descended from the first life-form that appeared on Earth several billion years ago. Based on the selection pressures caused by the environment, the descendants of those first living beings evolved in many directions. Studying the evolution of life and intelligence is interesting, but replicating it is extremely difficult. An AI system that would want to recreate intelligent life in the same way that evolution did would have to search a very large space of possible morphologies, which is extremely expensive computationally. It would need a lot of parallel and sequential trial-and-error cycles. AI researchers use several shortcuts and predesigned features to overcome some of these challenges. 
For example, they fix the architecture or physical design of an AI or robotic system and focus on optimizing the learnable parameters. Another shortcut is the use of Lamarckian rather than Darwinian evolution, in which AI agents pass on their learned parameters to their descendants. Yet another approach is to train different AI subsystems separately (vision, locomotion, language, etc.) and then tack them together in a final AI or robotic system. While these approaches speed up the process and reduce the costs of training and evolving AI agents, they also limit the flexibility and variety of results that can be achieved. Deep evolutionary reinforcement learning In their new work, the researchers at Stanford aim to bring AI research a step closer to the real evolutionary process while keeping the costs as low as possible. “Our goal is to elucidate some principles governing relations between environmental complexity, evolved morphology, and the learnability of intelligent control,” they wrote in their paper. Within the DERL framework, each agent uses deep reinforcement learning to acquire the skills required to maximize its goals during its lifetime. DERL uses Darwinian evolution to search the morphological space for optimal solutions, which means that when a new generation of AI agents is spawned, the agents inherit only the physical and architectural traits of their parents (along with slight mutations). None of the learned parameters are passed on across generations. “DERL opens the door to performing large-scale in silico experiments to yield scientific insights into how learning and evolution cooperatively create sophisticated relationships between environmental complexity, morphological intelligence, and the learnability of control tasks,” the researchers wrote. Simulating evolution For their framework, the researchers used MuJoCo, a virtual environment that provides highly accurate rigid-body physics simulation. 
Their design space is called Universal Animal (Unimal), in which the goal is to create morphologies that learn locomotion and object-manipulation tasks in a variety of terrains. Each agent in the environment is composed of a genotype that defines its limbs and joints. The direct descendant of each agent inherits the parent’s genotype and goes through mutations that can create new limbs, remove existing limbs, or make small modifications to characteristics, such as the degrees of freedom or the size of limbs. Each agent is trained with reinforcement learning to maximize rewards in various environments. The most basic task is locomotion, in which the agent is rewarded for the distance it travels during an episode. Agents whose physical structures are better suited for traversing terrain learn faster to use their limbs for moving around. To test the system’s results, the researchers generated agents in three types of terrains: flat (FT), variable (VT), and variable terrains with modifiable objects (MVT). The flat terrain puts the least selection pressure on the agents’ morphology. The variable terrains, on the other hand, force the agents to develop a more versatile physical structure that can climb slopes and move around obstacles. The MVT variant has the added challenge of requiring the agents to manipulate objects to achieve their goals. The benefits of DERL Above: Deep evolutionary reinforcement learning generates a variety of successful morphologies across different environments. One of the interesting findings of DERL is the diversity of the results. Other approaches to evolutionary AI tend to converge on one solution because new agents directly inherit the physique and learnings of their parents. But in DERL, only morphological data is passed on to descendants; the system ends up creating a diverse set of successful morphologies, including bipeds, tripeds, and quadrupeds with and without arms. 
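The inheritance-plus-mutation step described above might look like the following toy sketch (the genotype encoding and field names are illustrative, not the paper's actual Unimal representation): a child copies only its parent's morphology, never any learned parameters, and receives one random structural change.

```python
import random

def mutate(genotype, rng):
    """Return a child genotype: a copy of the parent's limbs plus one
    random structural change. Learned parameters are never inherited."""
    child = [dict(limb) for limb in genotype]          # morphology only
    op = rng.choice(["grow_limb", "remove_limb", "resize_limb"])
    if op == "grow_limb":
        child.append({"length": rng.uniform(0.1, 0.5),
                      "dof": rng.choice([1, 2, 3])})   # degrees of freedom
    elif op == "remove_limb" and len(child) > 1:
        child.pop(rng.randrange(len(child)))
    else:                                              # resize an existing limb
        rng.choice(child)["length"] *= rng.uniform(0.8, 1.2)
    return child

rng = random.Random(0)
parent = [{"length": 0.3, "dof": 1}]
child = mutate(parent, rng)
```

Note the Darwinian detail: `mutate` reads nothing but the parent's genotype, so anything the parent learned during its lifetime dies with it.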
At the same time, the system shows traits of the Baldwin effect , which suggests that agents that learn faster are more likely to reproduce and pass on their genes to the next generation. DERL shows that evolution “selects for faster learners without any direct selection pressure for doing so,” according to the Stanford paper. “Intriguingly, the existence of this morphological Baldwin effect could be exploited in future studies to create embodied agents with lower sample complexity and higher generalization capacity,” the researchers wrote. Finally, the DERL framework also validates the hypothesis that more complex environments will give rise to more intelligent agents. The researchers tested the evolved agents across eight different tasks, including patrolling, escaping, manipulating objects, and exploration. Their findings show that in general, agents that have evolved in variable terrains learn faster and perform better than AI agents that have only experienced flat terrain. Their findings seem to be in line with another hypothesis by DeepMind researchers that a complex environment, a suitable reward structure, and reinforcement learning can eventually lead to the emergence of all kinds of intelligent behaviors. AI and robotics research The DERL environment only has a fraction of the complexities of the real world. “Although DERL enables us to take a significant step forward in scaling the complexity of evolutionary environments, an important line of future work will involve designing more open-ended, physically realistic, and multiagent evolutionary environments,” the researchers wrote. In the future, the researchers plan to expand the range of evaluation tasks to better assess how the agents can enhance their ability to learn human-relevant behaviors. The work could have important implications for the future of AI and robotics and push researchers to use exploration methods that are much more similar to natural evolution. 
“We hope our work encourages further large-scale explorations of learning and evolution in other contexts to yield new scientific insights into the emergence of rapidly learnable intelligent behaviors, as well as new engineering advances in our ability to instantiate them in machines,” the researchers wrote. Ben Dickson is a software engineer and the founder of TechTalks. He writes about technology, business, and politics. "
3,688
2,021
"MLOps vs. DevOps: Why data makes it different | VentureBeat"
"https://venturebeat.com/ai/mlops-vs-devops-why-data-makes-it-different"
"MLOps vs. DevOps: Why data makes it different Much has been written about struggles of deploying machine learning projects to production. As with many burgeoning fields and disciplines, we don’t yet have a shared canonical infrastructure stack or best practices for developing and deploying data-intensive applications. This is both frustrating for companies that would prefer making ML an ordinary, fuss-free value-generating function like software engineering, as well as exciting for vendors who see the opportunity to create buzz around a new category of enterprise software. The new category is often called MLOps. 
While there isn’t an authoritative definition for the term, it shares its ethos with its predecessor, the DevOps movement in software engineering: By adopting well-defined processes, modern tooling, and automated workflows, we can streamline the process of moving from development to robust production deployments. This approach has worked well for software development, so it is reasonable to assume that it could address struggles related to deploying machine learning in production too. However, the concept is quite abstract. Just introducing a new term like MLOps doesn’t solve anything by itself; rather, it adds to the confusion. In this article, we want to dig deeper into the fundamentals of machine learning as an engineering discipline and outline answers to key questions: Why does ML need special treatment in the first place? Can’t we just fold it into existing DevOps best practices? What does a modern technology stack for streamlined ML processes look like? How can you start applying the stack in practice today? Why: Data makes it different All ML projects are software projects. If you peek under the hood of an ML-powered application, these days you will often find a repository of Python code. If you ask an engineer to show how they operate the application in production, they will likely show containers and operational dashboards — not unlike any other software service. Since software engineers manage to build ordinary software without experiencing as much pain as their counterparts in the ML department, it raises the question: Should we just start treating ML projects as software engineering projects as usual, maybe educating ML practitioners about the existing best practices? 
Let’s start by considering the job of a non-ML software engineer: writing traditional software deals with well-defined, narrowly scoped inputs, which the engineer can exhaustively and cleanly model in the code. In effect, the engineer designs and builds the world wherein the software operates. In contrast, a defining feature of ML-powered applications is that they are directly exposed to a large amount of messy, real-world data that is too complex to be understood and modeled by hand. This characteristic makes ML applications fundamentally different from traditional software. It has far-reaching implications as to how such applications should be developed and by whom: ML applications are directly exposed to the constantly changing real world through data, whereas traditional software operates in a simplified, static, abstract world that is directly constructed by the developer. ML apps need to be developed through cycles of experimentation. Due to the constant exposure to data, we don’t learn the behavior of ML apps through logical reasoning but through empirical observation. The skillset and the background of people building the applications get realigned. While it is still effective to express applications in code, the emphasis shifts to data and experimentation — more akin to empirical science — rather than traditional software engineering. This approach is not novel. There is a decades-long tradition of data-centric programming. Developers who have been using data-centric IDEs, such as RStudio, Matlab, Jupyter Notebooks, or even Excel to model complex real-world phenomena, should find this paradigm familiar. However, these tools have been rather insular environments; they are great for prototyping but lacking when it comes to production use. To make ML applications production-ready from the beginning, developers must adhere to the same set of standards as all other production-grade software. 
This introduces further requirements: The scale of operations is often two orders of magnitude larger than in the earlier data-centric environments. Not only is data larger, but models — deep learning models in particular — are much larger than before. Modern ML applications need to be carefully orchestrated. With the dramatic increase in the complexity of apps, which can require dozens of interconnected steps, developers need better software paradigms, such as first-class DAGs. We need robust versioning for data, models, code, and preferably even the internal state of applications — think Git on steroids to answer inevitable questions: What changed? Why did something break? Who did what and when? How do two iterations compare? The applications must be integrated with the surrounding business systems so ideas can be tested and validated in the real world in a controlled manner. Two important trends collide in these lists. On the one hand we have the long tradition of data-centric programming; on the other hand, we face the needs of modern, large-scale business applications. Either paradigm is insufficient by itself — it would be ill-advised to suggest building a modern ML application in Excel. Similarly, it would be pointless to pretend that a data-intensive application resembles a run-of-the-mill microservice that can be built with the usual software toolchain consisting of, say, GitHub, Docker, and Kubernetes. We need a new path that allows the results of data-centric programming, models, and data science applications in general, to be deployed to modern production infrastructure, similar to how DevOps practices allow traditional software artifacts to be deployed to production continuously and reliably. Crucially, the new path is analogous but not equal to the existing DevOps path. What: The modern stack of ML infrastructure What kind of foundation would the modern ML application require? 
It should combine the best parts of modern production infrastructure to ensure robust deployments, as well as draw inspiration from data-centric programming to maximize productivity. While implementation details vary, the major infrastructural layers we’ve seen emerge are relatively uniform across a large number of projects. Let’s now take a tour of the various layers, to begin to map the territory. Along the way, we’ll provide illustrative examples. The intention behind the examples is not to be comprehensive (perhaps a fool’s errand, anyway!), but to reference concrete tooling used today in order to ground what could otherwise be a somewhat abstract exercise. Foundational infrastructure layers Data Data is at the core of any ML project, so data infrastructure is a foundational concern. ML use cases rarely dictate the master data management solution, so the ML stack needs to integrate with existing data warehouses. Cloud-based data warehouses, such as Snowflake, AWS’s portfolio of databases like RDS, Redshift, or Aurora, or an S3-based data lake, are a great match to ML use cases since they tend to be much more scalable than traditional databases, both in terms of the data set sizes and in terms of query patterns. Compute To make data useful, we must be able to conduct large-scale compute easily. Since the needs of data-intensive applications are diverse, it is useful to have a general-purpose compute layer that can handle different types of tasks, from IO-heavy data processing to training large models on GPUs. Besides variety, the number of tasks can be high too. Imagine a single workflow that trains a separate model for 200 countries in the world, running a hyperparameter search over 100 parameters for each model — the workflow yields 20,000 parallel tasks. Prior to the cloud, setting up and operating a cluster that can handle workloads like this would have been a major technical challenge. 
Today, a number of cloud-based, auto-scaling systems are easily available, such as AWS Batch. Kubernetes, a popular choice for general-purpose container orchestration, can be configured to work as a scalable batch compute layer, although the downside of its flexibility is increased complexity. Note that container orchestration for the compute layer is not to be confused with the workflow orchestration layer, which we will cover next. Orchestration The nature of computation is structured: We must be able to manage the complexity of applications by structuring them, for example, as a graph or a workflow that is orchestrated. The workflow orchestrator needs to perform a seemingly simple task: Given a workflow or DAG definition, execute the tasks defined by the graph in order using the compute layer. There are countless systems that can perform this task for small DAGs on a single server. However, as the workflow orchestrator plays a key role in ensuring that production workflows execute reliably, it makes sense to use a system that is both scalable and highly available, which leaves us with a few battle-hardened options — for instance Airflow, a popular open-source workflow orchestrator, Argo, a newer orchestrator that runs natively on Kubernetes, and managed solutions such as Google Cloud Composer and AWS Step Functions. Software development layers While these three foundational layers, data, compute, and orchestration, are technically all we need to execute ML applications at arbitrary scale, building and operating ML applications directly on top of these components would be like hacking software in assembly language — technically possible but inconvenient and unproductive. To make people productive, we need higher levels of abstraction. Enter the software development layers. Versioning ML app and software artifacts exist and evolve in a dynamic environment. 
To manage the dynamism, we can resort to taking snapshots that represent immutable points in time — of models, of data, of code, and of internal state. For this reason, we require a strong versioning layer. While Git, GitHub, and other similar tools for software version control work well for code and the usual workflows of software development, they are a bit clunky for tracking all experiments, models, and data. To plug this gap, frameworks like Metaflow or MLflow provide a custom solution for versioning.

Software architecture

Next, we need to consider who builds these applications and how. They are often built by data scientists who are not software engineers or computer science majors by training. Arguably, high-level programming languages like Python are the most expressive and efficient ways that humankind has conceived to formally define complex processes. It is hard to imagine a better way to express non-trivial business logic and convert mathematical concepts into an executable form. However, not all Python code is equal. Python written in Jupyter notebooks following the tradition of data-centric programming is very different from Python used to implement a scalable web server. To make the data scientists maximally productive, we want to provide supporting software architecture in terms of APIs and libraries that allow them to focus on data, not on the machines.

Data science layers

With these five layers, we can present a highly productive, data-centric software interface that enables iterative development of large-scale data-intensive applications. However, none of these layers help with modeling and optimization. We cannot expect data scientists to write modeling frameworks like PyTorch or optimizers like Adam from scratch! Furthermore, there are steps that are needed to go from raw data to features required by models.
Model operations

When it comes to data science and modeling, we separate three concerns, starting from the most practical and progressing towards the most theoretical. Assuming you have a model, how can you use it effectively? Perhaps you want to produce predictions in real-time or as a batch process. No matter what you do, you should monitor the quality of the results. Altogether, we can group these practical concerns in the model operations layer. There are many new tools in this space helping with various aspects of operations, including Seldon for model deployments, Weights and Biases for model monitoring, and TruEra for model explainability.

Feature engineering

Before you have a model, you have to decide how to feed it with labelled data. Managing the process of converting raw facts to features is a deep topic of its own, potentially involving feature encoders, feature stores, and so on. Producing labels is another, equally deep topic. You want to carefully manage consistency of data between training and predictions, as well as make sure that there’s no leakage of information when models are being trained and tested with historical data. We bucket these questions in the feature engineering layer. There’s an emerging space of ML-focused feature stores such as Tecton and labeling solutions like Scale and Snorkel. Feature stores aim to solve the challenge that many data scientists in an organization require similar data transformations and features for their work, while labeling solutions deal with the very real challenges associated with hand-labeling datasets.

Model development

Finally, at the very top of the stack we get to the question of mathematical modeling: What kind of modeling technique to use? What model architecture is most suitable for the task? How to parameterize the model? Fortunately, excellent off-the-shelf libraries like scikit-learn and PyTorch are available to help with model development.
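Returning to the consistency concern from the feature engineering layer: one common safeguard is to route both training and serving through a single transform function, so the model never sees differently encoded inputs. The event schema below is invented for illustration.

```python
from datetime import datetime

def encode_features(raw):
    """Turn a raw event into the model's feature vector.

    Defined once and shared by the training pipeline and the
    prediction service, so the two paths can never drift apart.
    """
    ts = datetime.fromisoformat(raw["timestamp"])
    return [
        float(raw["amount"]),
        float(ts.hour),                          # time-of-day feature
        1.0 if raw["channel"] == "web" else 0.0,  # one-hot channel flag
    ]

train_row = {"timestamp": "2021-06-01T14:30:00", "amount": "19.99", "channel": "web"}
serve_row = {"timestamp": "2021-06-02T14:30:00", "amount": "19.99", "channel": "web"}

# Same code path at training and serving time -> identical encodings.
print(encode_features(train_row) == encode_features(serve_row))  # True
```

Feature stores generalize exactly this pattern: transforms are registered once and reused across teams and across the train/serve boundary.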
An overarching concern: Correctness and testing

Regardless of the systems we use at each layer of the stack, we want to guarantee the correctness of results. In traditional software engineering we can do this by writing tests. For instance, a unit test can be used to check the behavior of a function with predetermined inputs. Since we know exactly how the function is implemented, we can convince ourselves through inductive reasoning that the function should work correctly, based on the correctness of a unit test. This process doesn’t work when the function, such as a model, is opaque to us. We must resort to black box testing — testing the behavior of the function with a wide range of inputs. Even worse, sophisticated ML applications can take a huge number of contextual data points as inputs, like the time of day, user’s past behavior, or device type, so an accurate test setup may need to become a full-fledged simulator. Since building an accurate simulator is a highly non-trivial challenge in itself, often it is easier to use a slice of the real world as a simulator and A/B test the application in production against a known baseline. To make A/B testing possible, all layers of the stack should be able to run many versions of the application concurrently, so an arbitrary number of production-like deployments can be run simultaneously. This poses a challenge to many infrastructure tools of today, which were designed with more rigid, traditional software in mind. Besides infrastructure, effective A/B testing requires a control plane — a modern experimentation platform, such as StatSig.

How: Wrapping the stack for maximum usability

Imagine choosing a production-grade solution for each layer of the stack — for instance, Snowflake for data, Kubernetes for compute (container orchestration), and Argo for workflow orchestration.
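Before going further, the black-box testing idea above can be made concrete with a minimal A/B sketch: both model versions are treated as opaque functions, traffic is split between them, and only an observed metric is compared. The success rates here are invented.

```python
import random

random.seed(7)  # fixed seed so the sketch is reproducible

def model_a(request):
    """Baseline model, treated as a black box; 'succeeds' ~80% of the time."""
    return random.random() < 0.80

def model_b(request):
    """Candidate model; 'succeeds' ~84% of the time."""
    return random.random() < 0.84

outcomes = {"A": [], "B": []}
for request_id in range(10_000):
    arm = "A" if random.random() < 0.5 else "B"  # 50/50 traffic split
    handler = model_a if arm == "A" else model_b
    outcomes[arm].append(handler(request_id))

rates = {arm: sum(obs) / len(obs) for arm, obs in outcomes.items()}
print(rates["B"] > rates["A"])  # the candidate wins on this sample
```

Note what the infrastructure must support for this to work in production: two versions of the application running concurrently behind the same traffic splitter, which is exactly the requirement described above.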
While each system does a good job at its own domain, it is not trivial to build a data-intensive application that has cross-cutting concerns touching all the foundational layers. In addition, you have to layer the higher-level concerns from versioning to model development on top of the already complex stack. It is not realistic to ask a data scientist to prototype quickly and deploy to production with confidence using such a contraption. Adding more YAML to cover cracks in the stack is not an adequate solution.

Many data-centric environments of the previous generation, such as Excel and RStudio, really shine at maximizing usability and developer productivity. Optimally, we could wrap the production-grade infrastructure stack inside a developer-oriented user interface. Such an interface should allow the data scientist to focus on concerns that are most relevant for them, namely the topmost layers of the stack, while abstracting away the foundational layers. The combination of a production-grade core and a user-friendly shell makes sure that ML applications can be prototyped rapidly, deployed to production, and brought back to the prototyping environment for continuous improvement. The iteration cycles should be measured in hours or days, not in months.

Over the past five years, a number of such frameworks have started to emerge, both as commercial offerings as well as in open source. Metaflow is an open-source framework, originally developed at Netflix, specifically designed to address this concern (disclosure: one of the authors works on Metaflow). Google’s open-source Kubeflow addresses similar concerns, although with a more engineer-oriented approach. As a commercial product, Databricks provides a managed environment that combines data-centric notebooks with a proprietary production infrastructure. All cloud providers offer commercial solutions as well, such as AWS SageMaker or Azure ML Studio.
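Frameworks like these ultimately translate user code into the kind of dependency graph the orchestration layer executes. That contract can be sketched with the standard library's topological sorter; the task names are invented, and a real orchestrator would submit each step to the compute layer rather than call a local function.

```python
from graphlib import TopologicalSorter

# A toy workflow DAG: each key lists the steps it depends on.
dag = {
    "fetch": set(),
    "features": {"fetch"},
    "train": {"features"},
    "evaluate": {"train"},
}

def run(task):
    # Stand-in for submitting the task to the compute layer.
    return f"ran {task}"

# The orchestrator's core job: execute tasks in dependency order.
order = list(TopologicalSorter(dag).static_order())
log = [run(task) for task in order]
print(order)  # ['fetch', 'features', 'train', 'evaluate']
```

Everything an orchestrator adds beyond this — retries, scheduling, high availability, concurrent versions of the same workflow — is what separates a toy loop from Airflow or Argo.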
It is safe to say that all existing solutions still have room for improvement. Yet it seems inevitable that over the next five years the whole stack will mature, and the user experience will converge towards and eventually beyond the best data-centric IDEs. Businesses will learn how to create value with ML similar to traditional software engineering, and empirical, data-driven development will take its place amongst other ubiquitous software development paradigms.

Ville Tuulos is CEO and Cofounder of Outerbounds. He has worked as an ML researcher in academia and as a leader at a number of companies, including Netflix, where he led the ML infrastructure team that created Metaflow, an open-source framework for data science infrastructure. He is also the author of an upcoming book, Effective Data Science Infrastructure.

Hugo Bowne-Anderson is a data scientist, educator, evangelist, content marketer, and data strategy consultant. He has worked at data science scaling company Coiled and has taught data science topics for data education platform DataCamp as well as at Yale University and Cold Spring Harbor Laboratory, conferences such as SciPy, PyCon, and ODSC, and at Data Carpentry. He previously hosted and produced the weekly data industry podcast DataFramed.

VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings. © 2023 VentureBeat. All rights reserved.
2021
"Meal-prep service Gobble is serving curated courses with a side of AI | VentureBeat"
"https://venturebeat.com/ai/meal-prep-service-gobble-is-serving-curated-courses-with-a-side-of-ai"
Meal-prep service Gobble is serving curated courses with a side of AI

The booming meal kit segment of the food industry — the global market is estimated to hit $11.6 billion by 2022 — is leading to a crowded landscape. Gobble, a meal-prep service founded in 2010, distinguishes itself by focusing on delivering family-friendly meals that can be cooked in just 15 minutes. Gobble applies its Sprout algorithm to the weekly menu curation and recommendation process, which evolves each member’s menu based on personal taste preferences learned over time. “What’s exciting about this AI application, in particular, is that we have the culinary expertise of our chefs as a ‘teacher’ guiding the AI in building your ‘personal chef,'” founder and CEO Ooshma Garg said.
The result: “an expert machine learning algorithm that will understand you, [and] your expressed and revealed preferences, and tap into trends from the wider community to introduce you to a combination of both nostalgic and new recipes in an increasingly relevant way,” Garg said.

Above: Ooshma Garg, founder and CEO of Gobble.

New members begin by specifying their protein and diet preferences, along with their preferred nights per week and household size. From that moment onwards, every member interaction — including menu browsing, menu adjustments, additions of sides, and recipe review modules — helps Gobble develop an individual taste profile. Gobble sends a weekly survey to members asking what they would like to see on upcoming menus and requesting members rate dishes “hot or not” against each other. “Member engagement with questionnaires and reviews on Gobble is high as it directly influences their experience and menu the following week,” Garg said. Gobble’s recommendation algorithm also assesses similarities between new and existing members, albeit initially with limited data, to ensure a new member’s first few weeks based on their location, diet, and protein preferences are as appealing as possible.

Gobble also uses AI to plan ahead with greater acuity. The proprietary Fenix algorithm studies Gobble’s sales data from the last 7+ years, alongside external open data sets such as weather history and industry trends. With these data sets, the algorithm projects two key outputs: which members will place an order in any given week, and the distribution of sales among all dishes on the Gobble menu.
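Gobble's Sprout and Fenix algorithms are proprietary, so the snippet below is only a generic illustration of the member-similarity idea described above: represent each member as a vector of taste weights and match a new member to their nearest existing neighbor. All names and weights are invented.

```python
import math

def cosine(u, v):
    """Cosine similarity between two preference vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Taste weights over (chicken, beef, vegetarian, spicy).
existing_members = {
    "member_1": [0.9, 0.1, 0.0, 0.7],
    "member_2": [0.0, 0.2, 0.9, 0.1],
}
new_member = [0.8, 0.2, 0.1, 0.6]  # inferred from signup diet answers

closest = max(existing_members,
              key=lambda m: cosine(existing_members[m], new_member))
print(closest)  # member_1
```

With a match in hand, a cold-start system can seed the new member's first menus from what their nearest neighbors actually ordered and rated.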
“Multiple external variables intersect to affect week-to-week sales,” Garg pointed out, “anything from a snowstorm in a certain region to seasonal holiday behavior.” Fenix fits various models to the data and disentangles the component parts of historical member behavior to make sales predictions as accurate as possible for future weeks.

The recommender model for food

As many players in the field have seen, food recommendations do not abide by a simple formula. “There are actually a number of critical aspects to conquering this challenge,” Garg said. “We’ve all tried a dish that a friend ordered and loved it, even though we never would’ve thought to order it ourselves. So we are faced with several psychological ‘food blocks’ and, at times conflicting, food desires from every member. How do we suggest a dish so that it is comforting, but not boring? Adventurous, but not too risky? Then how do we ensure that the meal is actually enjoyed as much as the member expected?” Garg said.

“Another interesting consideration is that while we have so many insights to potentially gather, we have a much smaller window to gather them,” Garg said. “Netflix can show you trailer after trailer, and their algorithms will get immediate gratification — a thumbs up or a thumbs down. A Gobble box of dinner kits is a much larger commitment, of both time and money, and a negative experience is much worse than simply watching a trailer for a movie you don’t like.”

Food recommendations also deal with a longer timeline — consumers are fickle. “Maybe something you ordered from Gobble sounded good at the time, but when it comes to dinnertime the following week, maybe you’re no longer in the mood. It means we have to be that much more intelligent and effective in our recommendations before we show them to you,” Garg said. “These nuances are why we have invested so heavily in developing our own technologies.
Nothing off-the-shelf has cracked the code in the food space to date.” Gobble has begun experimenting with AI to help ensure that meals look and sound as appetizing as possible. “This is something AI can do with far greater accuracy than any one person or one-off survey,” Garg said. “Gobble employs a ResNet based computer vision model and NLP vector embeddings to see if we can predict how appealing a given photo, title, or meal description is to our members. We pair this with an iterative and A/B testing approach to dish copy and creative, showing unique combinations to different member groups across the country to capture further learnings — much like Netflix’s approach to displaying different cover art in various placements.”

A delicious future

Beyond what customers may like to order for any given week, Gobble learns about each member as a person and can apply those learnings across all services. “We know discoverability exists on a spectrum; there is always a balance of promoting newness and expanding members’ comfort zones, while still providing familiar meals members will recognize and love again,” Garg said. It’s a similar challenge to Spotify’s “Discover Weekly” offerings, which are AI-crafted playlists that attempt to concoct the perfect mixture of your recent favorites, nostalgic hits, and a few tracks you’ve yet to discover. Gobble believes it can accomplish the same level of trust at dinnertime.

Gobble’s AI approach demands more creativity from the team in how they gather data while ensuring members don’t get bogged down by too many surveys or questions. “We also know and accept that any learnings and algorithmic changes stemming from our recommendation data can have significant ramifications across our entire supply chain, from forecasting to prepped food to delivering the box to your door,” Garg said. Gobble’s end goal is to achieve an “auto-pilot” service for dinnertime.
“Just like you probably leave a Spotify-curated playlist running while you commute, work, or exercise, Gobble can take care of your meals each week with increasing relevancy and minimal input from the user,” Garg said. “It’s a win-win; as our members enjoy reduced stress and less involvement in their menu planning, Gobble streamlines operations even further, experiencing less variability, greater predictability, and enhanced efficiency across the business.”
2021
"DeepMind takes next step in robotics research | VentureBeat"
"https://venturebeat.com/ai/deepmind-takes-next-step-in-robotics-research"
DeepMind takes next step in robotics research

DeepMind is mostly known for its work in deep reinforcement learning, especially in mastering complicated games and predicting protein structures. Now, it is taking its next step in robotics research. According to a blog post on DeepMind’s website, the company has acquired the rigid-body physics simulator MuJoCo and has made it freely available to the research community. MuJoCo is now one of several open-source platforms for training artificial intelligence agents used in robotics applications. Its free availability will have a positive impact on the work of scientists who are struggling with the costs of robotics research. It can also be an important factor for DeepMind’s future, both as a science lab seeking artificial general intelligence and as a business unit of one of the largest tech companies in the world.
Simulating the real world

Simulation platforms are a big deal in robotics. Training and testing robots in the real world is expensive and slow. Simulated environments, on the other hand, allow researchers to train multiple AI agents in parallel and at speeds that are much faster than real life. Today, most robotics research teams carry out the bulk of their AI model training in simulated environments. The trained models are then tested and further fine-tuned on real physical robots. The past few years have seen the launch of several simulation environments for reinforcement learning and robotics.

MuJoCo, which stands for Multi-Joint Dynamics with Contact, is not the only game in town. There are other physics simulators such as PyBullet, Roboschool, and Isaac Gym. But what makes MuJoCo stand out from others is the fine-grained detail that has gone into simulating contact surfaces. MuJoCo performs a more accurate modeling of the laws of physics, which is shown in the emergence of physical phenomena such as Newton’s Cradle. MuJoCo also has built-in features that support the simulation of musculoskeletal models of humans and animals, which is especially important in bipedal and quadruped robots. The increased accuracy of the physics environment can help reduce the differences between the simulated environment and the real world. Called the “sim2real gap,” these differences cause a degradation in the performance of the AI models when they are transferred from simulation to the real world. A smaller sim2real gap reduces the need for adjustments in the physical world.

Making MuJoCo available for free

Before DeepMind open-sourced MuJoCo, many researchers were frustrated with its license costs and opted to use the free PyBullet platform.
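Licensing aside, it helps to be concrete about what a physics simulator actually computes: at every timestep, the engine integrates equations of motion. The frictionless pendulum below, stepped with semi-implicit Euler, is a deliberately tiny stand-in for the contact-rich dynamics a simulator like MuJoCo solves; the parameters are arbitrary.

```python
import math

def step(theta, omega, dt=0.01, g=9.81, length=1.0):
    """One integration step for a frictionless pendulum (semi-implicit Euler)."""
    omega += -(g / length) * math.sin(theta) * dt  # angular acceleration
    theta += omega * dt                            # then advance the angle
    return theta, omega

theta, omega = math.pi / 4, 0.0  # released from 45 degrees, at rest
for _ in range(1000):            # simulate 10 seconds of motion
    theta, omega = step(theta, omega)

# The pendulum keeps swinging within its energy bound instead of flipping over.
print(abs(theta) < math.pi / 2)  # True
```

A training loop wraps exactly this kind of stepper — observe state, apply an action, integrate, repeat — which is why a faster or more accurate stepper translates directly into faster or more transferable policy learning.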
In 2017, OpenAI released Roboschool, a license-free alternative to MuJoCo, for Gym, its toolkit for training deep reinforcement learning models for robotics and other applications. “After we launched Gym, one issue we heard from many users was that the MuJoCo component required a paid license … Roboschool removes this constraint, letting everyone conduct research regardless of their budget,” OpenAI wrote in a blog post. A more recent paper by researchers at Cardiff University states that “The cost of a Mujoco institutional license is at least $3000 per year, which is often unaffordable for many small research teams, especially when a long-term project depends on it.”

DeepMind’s blog refers to a recent article in PNAS that discusses the use of simulation in robotics. The authors recommend better support for the development of open-source simulation platforms and write, “A robust and feature-rich set of four or five simulation tools available in the open-source domain is critical to advancing the state of the art in robotics.” “In line with these aims, we’re committed to developing and maintaining MuJoCo as a free, open-source, community-driven project with best-in-class capabilities,” DeepMind’s blog post states.

It is worth noting, however, that license fees account for a very small part of the costs of training AI models for robots. The computational costs of robotics research tend to rise along with the complexity of the application. MuJoCo only runs on CPUs, according to its documentation. It hasn’t been designed to leverage the power of GPUs, which have many more computation cores than traditional processors. A recent paper by researchers at the University of Toronto, Nvidia, and other organizations highlights the limits of simulation platforms that work on CPUs only. For example, Dactyl, a robotic hand developed by OpenAI, was trained on a compute cluster comprising around 30,000 CPU cores.
These kinds of costs remain a challenge with CPU-based platforms such as MuJoCo.

DeepMind’s view on intelligence

DeepMind’s mission is to develop artificial general intelligence (AGI), the flexible kind of innate and learned problem-solving capabilities found in humans and animals. While the path to AGI (and whether we will ever reach it or not) is hotly debated among scientists, DeepMind has a clearly expressed view on it. In a paper published earlier this year, some of DeepMind’s top scientists suggested that “reward is enough” to reach AGI. According to DeepMind’s scientists, if you have a complex environment, a well-defined reward, and a good reinforcement learning algorithm, you can develop AI agents that will acquire the traits of general intelligence. Richard Sutton, who is among the co-authors of the paper, is one of the pioneers of reinforcement learning and describes it as “the first computational theory of intelligence.” The acquisition of MuJoCo can provide DeepMind with a powerful tool to test this hypothesis and gradually build on top of its results. By making it available to small research teams, DeepMind can also help nurture talent it will hire in the future.

MuJoCo can also boost DeepMind’s efforts to turn in profits for its parent company, Alphabet. In 2020, the AI lab recorded its first profit after six years of sizable costs for Alphabet. DeepMind is already home to some of the brightest scientists in AI. And with autonomous mobile robots such as Boston Dynamics’ Spot slowly finding their market, DeepMind might be able to develop a business model that serves both its scientific goal and its owner’s interests.

Ben Dickson is a software engineer and the founder of TechTalks. He writes about technology, business, and politics.
2021
"AI Weekly: The perils of AI analytics for police body cameras | VentureBeat"
"https://venturebeat.com/ai/ai-weekly-the-perils-of-ai-analytics-for-police-body-cameras"
AI Weekly: The perils of AI analytics for police body cameras

A Los Angeles Police Department officer wears a body camera at the Los Angeles Gay Pride Resist March, June 11, 2017 in Hollywood, California.

In 2015, spurred by calls for greater police accountability, the federal government provided more than $23 million to local and tribal police agencies to expand their use of body cameras. As of 2016, 47% of the country’s roughly 15,300 general-purpose law enforcement agencies had purchased body cameras, according to a report by the Bureau of Justice Statistics, the most recent study measuring nationwide usage.
Evidence on their efficacy is mixed — a recent comprehensive review of 70 studies of body camera use found that they had no consistent or statistically significant effects — but advocates assert that body cameras can deter bad behavior on the part of officers while reducing the number of citizen complaints. However, an outstanding technological challenge with body cameras is making sense of the amount of footage that they produce. As per one estimate, the average officer’s body camera will record about 32 files, 7 hours, and 20GB of video per month at 720p resolution. A relatively new startup, Truleo, claims to solve this problem with a platform that leverages AI to analyze body cam footage as it comes in. Truleo — which has raised $2.5 million in seed funding — converts the data into “actionable insights,” CEO and cofounder Anthony Tassone claims, using natural language processing and machine learning models to categorize incidents captured by the cameras. The Seattle Police Department is one of the company’s early customers.

“Truleo analyzes the audio stream within body camera videos — we analyze the conversation between the officer and the public,” Tassone told VentureBeat via email. “We specifically highlight the ‘risky’ language the officer uses, which most often means surfacing directed profanity or using extremely rude language. However, we can also highlight officer shouting commands, so the command staff can evaluate the effectiveness of the subject compliance.”

Potentially flawed AI

Tassone says that Truleo’s AI models were built by its data scientists and law enforcement experts looking for “de-escalation, auto-flagging of incidents, or early warning for volatile interactions” to generate searchable reports. The models can recognize if a call is about drugs, theft, a foot chase, and if there’s profanity or shouting, he claims.
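Truleo's models are proprietary, so the snippet below is only a toy stand-in: keyword rules playing the role of a trained NLP classifier over a transcript. Categories and trigger phrases are invented for the sketch.

```python
import re

# Invented categories and trigger words standing in for a learned model.
RULES = {
    "theft": {"stolen", "shoplifting", "burglary"},
    "foot chase": {"running", "pursuit", "chase"},
    "profanity": {"damn", "hell"},
}

def classify(transcript):
    """Return the sorted list of categories whose keywords appear."""
    words = set(re.findall(r"[a-z]+", transcript.lower()))
    return sorted(cat for cat, keys in RULES.items() if words & keys)

print(classify("suspect is running, the vehicle was stolen"))
# ['foot chase', 'theft']
```

A production system would replace the keyword sets with a trained classifier over transcribed audio, but the interface — transcript in, incident labels out — is the same, and so are the failure modes the critics below describe.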
Truleo quantifies the classifications as metrics, such as the percentage of “negative interactions” an officer has on a monthly basis and what police language is “effective.”

“Obviously, a call that ends in an arrest is going to be negative. But what if an officer has an overwhelming amount of negative interactions but a below-average number of arrests? Is he or she going through something in their personal lives? Perhaps something deeply personal such as a divorce or maybe the officer was shot at last week. Maybe they need some time off to cool down or to be coached by more seasoned officers. We want to help command staff be more proactive about identifying risky behavior and improving customer service tactics — before the officer loses their job or ends up on the news.”

But some experts are concerned about the platform’s potential for misuse, especially in the surveillance domain. “[Body cam] footage doesn’t just contain the attitude of the officer; it also contains all comments by the person they were interacting with, even when no crime was involved, and potentially conversations nearby,” University of Washington AI researcher Os Keyes told VentureBeat via email. “This is precisely the kind of thing that people were worried about when they warned about the implications of body cameras: police officers as moving surveillance cameras.”

Above: Truleo’s analytics dashboard.

Keyes also pointed out that natural language processing and sentiment analysis are far from perfect sciences. Aside from prototypes, AI systems struggle to recognize examples of sarcasm — particularly systems trained on text data alone. Natural language processing models can also exhibit prejudices along race, ethnic, and gender lines, for example associating “Black-aligned English” with higher toxicity or negative emotions like anger, fear, and sadness. Speech recognition systems like the kind used by Truleo, too, can be discriminatory.
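Discrepancies of this kind are usually reported as word error rate (WER): the word-level edit distance between a reference transcript and the recognizer's output, divided by the reference length. A minimal sketch, with invented example sentences:

```python
def wer(reference, hypothesis):
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[-1][-1] / len(ref)

print(wer("please step out of the vehicle", "please step out the vehicle"))
# one dropped word -> 1/6 ≈ 0.167
```

A WER gap between demographic groups means every downstream step — classification, metrics, flagged incidents — inherits systematically worse inputs for one group than another.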
In a study commissioned by the Washington Post, popular smart speakers made by Google and Amazon were 30% less likely to understand non-American accents than those of native-born users. More recently, the Algorithmic Justice League's Voice Erasure project found that speech recognition systems from Apple, Amazon, Google, IBM, and Microsoft collectively achieve word error rates of 35% for African American voices versus 19% for white voices. "If it works, it's dangerous. If it doesn't work — which is far more likely — the very mechanism through which it is being developed and deployed is itself a reason to mistrust it, and the people using it," Keyes said. According to Tassone, Truleo consulted with officials on police accountability boards to define what interactions should be identified by its models to generate reports. To preserve privacy, the platform converts footage into an MP3 audio file during the upstream process "in memory" and deletes the stream after analysis in AWS GovCloud, writing nothing to disk. "Truleo's position is that this data 100% belongs to the police department," Tassone added. "We aim to accurately transcribe about 90% of the audio file correctly … More importantly, we classify the event inside the audio correctly over 99% of the time … When customers look at their transcripts, if anything is incorrect, they can make those changes in our editor and submit them back to Truleo, which automatically trains new models with these error corrections." When contacted for comment, Axon, one of the world's largest producers of police body cameras, declined to comment on Truleo's product but said: "Axon is always exploring technologies that have [the] potential for protecting lives and improving efficiency for our public safety customers.
We gear towards developing responsible and ethical solutions that are reliable, secure, and privacy-preserving." In a recent piece for Security Info Watch, Anthony Treviño, the former assistant chief of police for San Antonio, Texas, and a Truleo advisor, argued that AI-powered body cam analytics platforms could be used as a teaching tool for law enforcement. "For example, if an agency learns through body camera audio analytics that a certain officer has a strong ability to de-escalate or control deadly force during volatile situations, the agency can use that individual as a resource to improve training across the entire force," he wrote. Given AI's flaws and studies showing that body cams don't reduce police misconduct on their own, however, Treviño's argument would appear to lack merit. "Interestingly, although their website includes a lot of statistics about time and cost savings, it doesn't actually comment on whether it changes the outcomes in any way," Mike Cook, an AI researcher at Queen Mary University of London, told VentureBeat via email. "Truleo claims they provide 'human accuracy at scale' — but if we already doubt the existing accuracy provided by the humans involved, what good is it to replicate it at scale? What good is a 50% reduction in litigation time if it leads to the same amount of unjust, racist, or wrongful police actions? A faster-working unfair system is still unfair." For AI coverage, send news tips to Kyle Wiggers — and be sure to subscribe to the AI Weekly newsletter and bookmark our AI channel, The Machine. Thanks for reading, Kyle Wiggers AI Staff Writer VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings.
VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat. All rights reserved. "
3,692
2,021
"Nreal unveils lightweight Nreal Air AR glasses for entertainment | VentureBeat"
"https://venturebeat.com/technology/nreal-unveils-lightweight-nreal-air-ar-glasses-for-entertainment"
"Nreal unveils lightweight Nreal Air AR glasses for entertainment. Nreal Air weighs only 2.7 ounces. Nreal is one of the few companies still standing from the early wave of augmented reality glasses makers. And today it is unveiling the second-generation Nreal Air glasses with big improvements. The goal of the Beijing-based company is to keep moving fast on the leading edge of the technology and create multiple generations of products so that it can be ready with the right one when the AR market takes off. The company is pitching the device as a "portable movie theater." The newest glasses weigh just 2.7 ounces (77 grams), and they're smaller and more compact than the Nreal Light glasses (100 grams) that the company launched in 2019, said Nreal CEO Chi Xu in an interview with GamesBeat.
Xu said that advancements in AR technology have accelerated exponentially since Nreal Light was first unveiled, making smaller and more compact augmented reality hardware possible. The company has also designed the new glasses for specific applications, such as streaming TV shows and playing mobile games. Above: Nreal Air looks like a pair of sunglasses. "We got the Nreal light into hands in 2019, and it was like a concise version of the HoloLens," Xu said. "We thought it would be great for productivity, but it turned out most of the usage was entertainment and media consumption." The aim is to create an immersive experience in a form factor indistinguishable from daily worn sunglasses. Xu didn't disclose the exact tech details, but he said the Nreal Air boasts the best display available for any AR device on the market: it is capable of projecting up to a (simulated) 201-inch virtual display when viewed at a distance of six meters, which is well suited for watching multimedia content. The glasses will still be tethered to your smartphone, just like the Nreal Light. But the glasses are about a third lighter and are more comfortable to wear than many previous AR glasses, Xu said. The glasses have an adjustable three-step rake system, which enables users to adjust the viewing angle by tilting the lens, and elastic temples that tightly hug the head and won't slip. Nreal Air's design drops the outward-facing camera to focus on the theater experience. At 46 degrees, the glasses have a wider field of view than the previous glasses. Rivals include the Microsoft HoloLens, the Magic Leap One, and whatever mysterious device Apple is working on but has yet to reveal. Impressive specs Above: Side view of the Nreal Air. The device has a micro-OLED display.
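The 201-inch figure is consistent with basic viewing geometry: a screen that fills a field of view of angle FOV at distance d has a diagonal of 2·d·tan(FOV/2). A quick check using the article's numbers (the 46-degree figure and six-meter distance come from the article; treating 46 degrees as the diagonal FOV is an assumption):

```python
import math

# Virtual screen diagonal subtended by a given field of view at a given distance.
fov_deg = 46      # Nreal Air field of view (assumed to be the diagonal)
distance_m = 6    # viewing distance cited for the 201-inch claim

diagonal_m = 2 * distance_m * math.tan(math.radians(fov_deg / 2))
diagonal_in = diagonal_m / 0.0254
print(round(diagonal_in))  # 201
```

So the "201-inch virtual display" is not an independent spec; it falls straight out of the field of view and the six-meter viewing distance.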
Nreal Air's display has a high pixel density, with as many as 49 pixels per degree (PPD). This ensures that fine details are clearly visible and enhances the realism of the content, Xu said. It also has a refresh rate of up to 90Hz, and these features don't drain the phone's battery. "We put a lot of resources into image quality and color," Xu said. It is also compatible with Apple iOS devices, a rarity in that it will support both iPhones and iPads, and it will work with most Android devices as well. Xu found that 78% of users in South Korea used Nreal Light to watch streaming content. Nreal will first launch Nreal Air in December 2021 in three Asian markets: Japan, China, and South Korea, in partnership with leading carriers. Nreal Air's rollout to other markets will continue through 2022. Pricing will be determined by local carrier partners, but the glasses will retail at a fraction of the price of Nreal Light. The company is planning to take the Nreal Light product into enterprises, while the Nreal Air will focus on consumers. The Nreal Light Developer Kit is available for order here. As one of the survivors, Nreal has been able to raise more than $185 million to date. The first glasses debuted in South Korea, Japan, and Europe. Xu said the company has more than 250 employees. Xu said he believes the headquarters in China is an advantage. "We are closer to the supply chain and we can move faster," he said. "We go early and we go first. We can get the early market share." GamesBeat's creed when covering the game industry is "where passion meets business." What does this mean? We want to tell you how the news matters to you -- not just as a decision-maker at a game studio, but also as a fan of games. Whether you read our articles, listen to our podcasts, or watch our videos, GamesBeat will help you learn about the industry and enjoy engaging with it. Discover our Briefings.
"
3,693
2,021
"Educational targets hit by rising cyberattacks in 2021 | VentureBeat"
"https://venturebeat.com/security/educational-targets-hit-by-rising-cyberattacks-in-2021"
"Educational targets hit by rising cyberattacks in 2021. Educational institutions are on pace for a record year of ransomware attacks in 2021, with K-12 schools the primary targets. While contributing to better educational outcomes, successful one-device-per-student and learn-from-anywhere programs have expanded the attack surface for cyber threats of various kinds. Bad actors prioritize elementary schools because they're underfunded when it comes to cybersecurity staff and systems, and administrators are often impatient to put attacks behind them and resume classes. According to Sophos' "The State of Ransomware in Education 2021," the typical educational institution pays an average ransom of $112,435 to get data back and networks running again.
In addition, bad actors encrypt the personal identities and financial data of students, parents, and administrators as part of ransomware attacks, at times threatening to publicly release such data to further pressure victims into paying the ransom. Crucial information on the cyber threats to education also comes via an Absolute Software "21/22 Endpoint Risk Report: Education Edition" that found the total number of devices deployed across K-12 environments increased 74% from 2019 to 2020. Absolute's research is noteworthy because it quantifies how the disruption caused by digital learning, including new technology adoption, opened new attack vectors for bad actors and cybercriminals. Another fascinating aspect of the study is how school districts' IT and cybersecurity teams are being pulled in multiple directions as they strive to secure the identities of their students, teachers, and administrators. Digital learning needs self-healing endpoints to scale globally, and protecting edge-based endpoints is just the first step. Secure endpoints save school networks The Absolute study makes it clear that one-device-per-student strategies are a challenge for IT teams. Federal, state, and local government funding of learn-from-anywhere programs has worked, according to school district CIOs VentureBeat spoke with. Funds were immediately allocated to at-risk children who didn't have internet access or devices at home to stay connected with their schools. In many cases, Google Chromebooks have dominated new device adoption. CIOs told VentureBeat the ability to lock down selective Chromebooks that are at-risk endpoints is a must-have feature as their online student populations grow. Meanwhile, devices of all kinds can challenge administrators, especially if the devices are overloaded with applications.
Overloading endpoints with too much software makes them less secure. A typical school endpoint device has 5.4 security controls — including VPN, antivirus, and anti-malware — compared to 11.7 security apps on a typical corporate endpoint device. School and enterprise endpoint devices are already crowded with software client conflicts and decay that leave endpoints vulnerable. These conflicts make IT management and audits particularly challenging. Beyond antivirus apps Every new app deployed on an endpoint device increases the chance of it falling victim to cyber threats. In short, endpoints continue to be weakened by too many conflicting software agents, ineffective antivirus applications, and OS patches that are long out of date. Absolute's study found that just 53% of antivirus applications are working effectively today and that almost one-third of educational devices studied contained sensitive data. Nearly 50% house Social Security data, and 39% contain protected health information. Above: Despite high endpoint management installation rates in education settings, devices continue to fall prey to bad actors' cyberattacks. Source: Absolute Software. Absolute's report shows the success of learn-from-anywhere strategies and other initiatives that dominate learning institutions' IT and cybersecurity spending hinges on achieving complete visibility and control over all student and staff devices. Absolute's study quantifies just how geographically distributed distance learning is and how endpoint devices are especially vulnerable when off a school network. Truly self-healing endpoints are the solution to this challenge. The most reliable, scalable, and persistent self-healing endpoints deliver continuous visibility, control, and intelligence across devices, data, and applications.
Absolute's approach of being embedded in the firmware of over half a billion Windows devices and extendable to Chrome OS and iOS has proven itself across educational institutions globally. "In this new digital reality, the endpoint is now the edge and the primary attack surface for cybercriminals is actually in the hands of students and staff," Absolute Software president and CEO Christy Wyatt told VentureBeat. "The ability to see, manage, and protect every endpoint device — as well as the data and applications on those devices — is critical in ensuring students and staff remain safe, connected, and productive no matter where physical learning is taking place," she continued. Absolute's study accurately captures the complexity and urgency of IT and cybersecurity teams' challenges in ensuring learn-from-anywhere initiatives can continue to serve students, teachers, and administrators. In addition, CIOs at school districts VentureBeat spoke with say Unified Endpoint Management platforms need to deliver on the promise of greater visibility and control because IT teams will have to provide audits as part of next year's budgeting process. "
3,694
2,021
"Super Monkey Ball: Banana Mania review -- More fun than a ... you know | VentureBeat"
"https://venturebeat.com/pc-gaming/super-monkey-ball-banana-mania-review-more-fun-than-a-you-know"
"Review: Super Monkey Ball: Banana Mania — More fun than a … you know. He's chunky and ready to party. First off, spoiler alert, I'm giving Super Monkey Ball: Banana Mania a 5/5. This may seem silly to some of you. Yes, this is in many ways a simple game. You have a monkey. They're in a ball. And you have to tilt that ball toward the goal. This is also a remaster. Well, sort of. Banana Mania contains all of the levels and minigames from the first three Super Monkey Ball games. But many of them have seen changes (don't worry, you can unlock the original versions too). The format is also different here. Before, you had to beat the campaign before running out of lives. As with many modern games, the lives concept is gone now. If you die, you can keep trying. So, yeah. I'm giving a 5/5 to a remastered compilation of Super Monkey Ball.
Banana Mania is some of the most fun I've had gaming this year. If you pick this one up when it launches on October 5 for Switch, PlayStation 4, Xbox One, and PC, I bet you'll have a similar experience. By the way, I played on Switch. If you're worried about getting Banana Mania on that platform because of Sega's lackluster port job with Sonic Colors: Ultimate on the system, don't be. Banana Mania runs great on Switch, at a smooth 60 fps (as long as you're not playing split-screen multiplayer — then the framerate expectedly drops a bit). Above: Levels can get pretty wild. I'm a (banana) maniac OK, so why do I love Banana Mania so much? First off, Super Monkey Ball is fun. It was fun back when I first played it as a GameCube launch game. It's still fun in 2021. It's a simple concept that allows for some creative, challenging levels. Sometimes you're just navigating your monkey through a relatively simple obstacle course. Other times you have to grasp more unique concepts or gimmicks, like using different switches to speed up, reverse, or pause spinning platforms so you can cross safely. These levels can be challenging, but with some patience and thoughtfulness, you'll eventually conquer them. And now that lives aren't a thing, you don't have to worry about too many failures forcing a game over and making you replay levels you already beat. Sure, sometimes you'll come across a stage that you'll get stuck on for what can feel like hours. But, dang, it sure does feel good when you finally cross the goal line on those. I'm also a big fan of any game with fun unlockables, and Banana Mania delivers here. You earn points by playing, and you can use these to earn new characters, costumes, and even game modes. This is how you can unlock other Sega stars, like Sonic and Tails, as playable characters.
You can even purchase a jump button that you can use in the single-player content. None of these goodies are too expensive, so you'll be able to unlock what you want just by playing a decent bit of the main game. Above: Sonic is in Monkey Ball! Party like it's 2002 If you've ever played Super Monkey Ball before, you know that the party games are often the best part of the package. Banana Mania comes with 12 of these multiplayer minigames. This includes Monkey Fight, which has four players rolling around and trying to punch each other off a platform while battling over upgrades. You also have Monkey Racing, a surprisingly decent kart-racing experience. Monkey Billiards, Monkey Tennis, and Monkey Bowling are exactly what you expect, and somehow better. I mean, Monkey Bowling is pretty much everything I need from any bowling game. And then you have Monkey Target, probably the greatest minigame of all time. In this one, up to four players roll down a hill, shoot off a ramp, and float their way to far-off targets. Then they try to land on zones that are worth the most points. It's like competitive Pilotwings. Again, this sounds simple, but it's stupid how fun this is with friends. In one memorable game, three buddies and I were shooting for a long strip of land. The furthest, smallest part of this island just before the edge was worth the most points. The first one of us to reach this area deployed too late and just landed on the platform flat on his monkey stomach. Then the next person reached the second-best score zone. Not bad. And then I flew in and knocked him off. We both fell into the scoreless ocean. We were all howling, and then the last of us soared down and landed on the last inch of the section that awarded the prime points. At this point, we were all off the couch and jumping and screaming with excitement. These minigames are just fantastic to play with friends in the same room. And it's easy to swap between them.
You can play a round of bowling, then do a bit on Monkey Target, and maybe finish off with a single race or a bout of Monkey Fight. The variety is incredible and can make an evening of gaming with friends pass quickly. Monkey around Yes, Banana Mania is essentially just the kind of Super Monkey Ball game that you'd expect. I just didn't realize how much I've missed this series. The single-player levels present a challenging and creative experience, while the minigame offerings make this collection essential to anyone who likes to play local multiplayer. This is the best Super Monkey Ball has ever been. Super Monkey Ball: Banana Mania is out on October 5 for Switch, PlayStation 4, Xbox One, and PC. Sega gave us a Switch code for this review. "
3,695
2,021
"Sony acquires remake masters Bluepoint Games | VentureBeat"
"https://venturebeat.com/pc-gaming/sony-acquires-remake-masters-bluepoint-games"
"Sony acquires remake masters Bluepoint Games. How can you lose when you have a fire sword? Sony announced today that it has added Bluepoint Games to its PlayStation Studios family of first-party developers. In the blog post announcing the acquisition, head of PlayStation Studios Herman Hulst said, "With each of its projects, Bluepoint has raised the bar on console-defining visuals and gameplay, and the studio's vast expertise in world building and character creation will be a huge plus for future PlayStation Studios properties." Bluepoint has worked with Sony on remakes of games like Shadow of the Colossus and Demon's Souls. The two have formed a close partnership, so this isn't shocking news.
"Austin, Texas has been home base for Bluepoint from when we first founded the studio back in 2006 and we're now a team of close to 70 super-talented creatives and growing," Bluepoint Games president Marco Thrush notes in the announcement. "While the studio has certainly grown over the past 15 years, our cultural beliefs have remained the same — to always push the envelope and create the highest-quality games possible all while having fun doing it. The focus on culture has been instrumental to our success, and we're excited that PlayStation Studios shares a similar culture and vision." While Bluepoint made its name with remakes, Thrush told IGN that its next project will be an original game. "
3,696
2,021
"BlueStacks launches free cloud gaming service for mobile games | VentureBeat"
"https://venturebeat.com/mobile/bluestacks-launches-free-cloud-gaming-service-for-mobile-games"
"BlueStacks launches free cloud gaming service for mobile games. BlueStacks X is a new mobile cloud gaming service. BlueStacks made its name getting Android games to run on the PC. And now it is releasing BlueStacks X, a free cloud-based game streaming service for mobile games. The Palo Alto, California-based company describes BlueStacks X as the world's first cloud-based game streaming service for mobile titles. BlueStacks X is available on Windows 10 and 11, Mac, iOS, Android, Chromebook, and Raspberry Pi. It is the only cloud gaming service on the market that offers free streaming for mobile games across platforms and devices. BlueStacks X is powered by hybrid cloud technology, built in partnership with Now.gg, BlueStacks' sister company.
Hybrid cloud enables the cloud to offload parts of the computation and graphics rendering to the endpoints, dramatically reducing cloud costs and enabling users to enjoy a free service, said Rosen Sharma, CEO of BlueStacks, in an interview with GamesBeat. "We are using BlueStacks X to offload some of the computation to your endpoint, as that can help get the cost of the cloud down and keep the service free," Sharma said. The BlueStacks X service has a free tier that is ad-based, and it will have a premium offering later on, Sharma said. Better cloud costs Above: BlueStacks X has more than 200 games available for free. The problem with cloud gaming is that providers have to use a tremendous number of cloud servers to stream data at high speed to users' devices. The computing happens in the cloud and not on the devices. This gives users the convenience of being able to use low-end PCs, consoles, or smartphones to play high-end games that are computed on the servers. But this gets to be pretty expensive, particularly if there's a high cost for streaming to mobile devices over a carrier's network. BlueStacks X is different because it taps the power of the devices — however weak they are — to handle some of the computing chore so that not all of the data has to be streamed and computed in the cloud. This is what Sharma means by lowering the costs, and that enables BlueStacks to offer BlueStacks X as a free service, he said. "Our endpoint is capable of doing some work," he said. "So whether it be graphics or compute the cloud, we offload some of the work to the endpoint. It brings down the big cloud cost a lot. So that's the key innovation that enables that to be a free service, in addition to using the cloud." Above: Rosen Sharma is CEO of BlueStacks.
The strategy of both BlueStacks and Now.gg is different from the strategies of rivals like Microsoft’s xCloud, Nvidia’s GeForce Now, Amazon’s Luna, and Google’s Stadia, where the focus is taking PC and console games and getting them to run on the cloud so they can be accessible on any device, including mobile. None of those is really direct competition, Sharma said. The BlueStacks X service can use a native client and browsers capable of native graphics rendering. This technology works transparently and does not require any integration from game developers. “Browsers such as Chrome now have native graphics processing unit (GPU) support, and the OpenGL graphics support in the browsers has gotten much better over the years,” Sharma said. “So if you’re using PC browsers, it has become much, much easier to do mobile browsers.” It does take some deep tech to partition the different pieces of a game so that the device works on part of it and the cloud works on the rest, Sharma said. It has to be smart enough to calculate whether the network is fast enough or if more computation has to be done on the local device. In another development, the company is enabling Discord users to integrate the cloud game service into their Discord servers. That means you can play games with friends on the Discord server, which is good for creators to get better engagement with fans. BlueStacks App Player Above: BlueStacks on Discord BlueStacks’ early product, the BlueStacks App Player, recently crossed a billion lifetime downloads. BlueStacks X is a natural next step for the company, as the hybrid cloud makes it economically viable to launch the service. That early business has been growing organically, and it enabled the company to work on BlueStacks X. BlueStacks X can be accessed via the mobile browser on iOS, Android, Windows 11, Mac, Chromebooks, and even some smart TVs. The BlueStacks X native client is available on Windows 11, Windows 10, and older versions of Windows. 
BlueStacks X can also be used by BlueStacks App Player users, and it is available for free in 14 countries. BlueStacks X (beta) already has over 200 games, and several new games are being added every week. The service has a great collection of role-playing games and strategy games, with other genres being added over time. The company has more than 400 people. Gamers in 100 countries and six continents played more than six billion gaming sessions with 70,000 different games in 2020 on BlueStacks. GamesBeat's creed when covering the game industry is "where passion meets business." What does this mean? We want to tell you how the news matters to you -- not just as a decision-maker at a game studio, but also as a fan of games. Whether you read our articles, listen to our podcasts, or watch our videos, GamesBeat will help you learn about the industry and enjoy engaging with it. Discover our Briefings. Join the GamesBeat community! Enjoy access to special events, private newsletters and more. VentureBeat Homepage Follow us on Facebook Follow us on X Follow us on LinkedIn Follow us on RSS Press Releases Contact Us Advertise Share a News Tip Contribute to DataDecisionMakers Careers Privacy Policy Terms of Service Do Not Sell My Personal Information © 2023 VentureBeat. All rights reserved. "
3,697
2,021
"Tasq.ai promises faster data annotation for AI development | VentureBeat"
"https://venturebeat.com/data-infrastructure/tasq-ai-promising-faster-data-annotation-for-ai-development-raises-4m"
"Exclusive Tasq.ai promises faster data annotation for AI development Israel-based Tasq.ai , which says it has found a much faster way for companies to embark upon data annotation for AI development, today announced it has raised $4 million in seed funding. Data annotation or labeling is one of the most important aspects of building a successful and scalable AI/ML project. This work provides the initial setup to train a machine learning model on what it needs to understand and analyze to come up with accurate outputs. Many companies rely on small internal teams or business process outsourcing to get their dataset annotated for training. A growing number of other startups are also offering to annotate data, including Snorkel AI , SuperAnnotate , and Labelbox. 
But Tasq.ai claims it offers 30 times faster data labeling for AI than current methods by combining ML models and proprietary technology to “intelligently deconstruct complex image data.” Once the data is broken into simple “micro-tasks,” it’s gamified to leverage what the company says is an untapped, unbiased global human workforce of millions to label and validate data. The company says it can offer unlimited scale without compromising the quality of the dataset or introducing biases. “We’re bringing the usage model that Amazon pioneered for cloud storage to data annotation for AI. It’s going to completely upend the way AI is built and eliminate the data bottlenecks that are slowing progress,” Tasq.ai cofounder and CEO Erez Moscovich said in a statement. Lightning-fast data annotation for AI projects Tasqers (annotators) responsible for validating results are only shown relevant portions of images and are asked whether the image they are looking at contains a particular object, the company says on its website. The Tasqers’ multiple judgments are validated, weighted, rated, and aggregated into a structured schema of actionable insights. The platform ropes in annotators through partnerships with leading ad networks that help identify talent and provide access to premium content when they have completed identification tasks. Tasq.ai then uses sophisticated algorithms to train, qualify, test, and monitor these digital workers. The service is currently available on a usage-based pricing model. Why this is hot right now Data annotation is a hot area of investment because it remains a challenge for so many companies. Data labeling often comes with high operational costs, as well as inflexibility, bias, and inaccuracy on the part of human annotators. Humans are also slow. 
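The pipeline described above — several yes/no judgments per micro-task, weighted by each annotator's rating and aggregated into a single label — can be illustrated with a weighted majority vote. This is a generic sketch under that interpretation, not Tasq.ai's actual algorithm; the function name, weights, and threshold are all hypothetical.

```python
def aggregate_judgments(
    judgments: list[tuple[bool, float]], threshold: float = 0.5
) -> tuple[bool, float]:
    """Combine (answer, annotator_weight) pairs into one label + confidence.

    Weighted majority vote: each annotator's yes/no answer counts in
    proportion to a reliability weight earned on qualification tasks.
    """
    total = sum(w for _, w in judgments)
    yes = sum(w for ans, w in judgments if ans)
    confidence = yes / total
    return confidence >= threshold, confidence

# Three annotators say "object present", a low-weight one disagrees;
# the reliable majority wins, and the confidence reflects the weight split.
label, conf = aggregate_judgments([(True, 0.9), (True, 0.8), (False, 0.3), (True, 0.6)])
```

A real pipeline would also route low-confidence tasks back out for more judgments, which is one plausible reading of the "validated" step in the article.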
These challenges can affect the performance and behavior of the AI or other model in question. The investment in the two-year-old company was led by a clutch of angel investors, including Wix’s former AI head, professor Shai Dekel. The company said it will use the investment to expand its international presence and open sales offices in New York and Chicago. It also plans to accelerate R&D efforts in Israel, according to a statement. Tasq.ai has already handled data annotation projects for companies like Here, Intel, FruitSpec, SuperSmart, and VHive. Its computer vision solution can be applied to a range of areas, from autonomous vehicles and drones to ecommerce, agriculture, and media. “Everyone knows that AI capabilities are a must-have, but only those of us who have built AI companies and products understand the extent of the massive data annotation bottleneck issue that Tasq.ai is the first to solve,” Dekel said in the statement. “They’re alone at the forefront of the data annotation field, and that’s a tremendous achievement and advantage, not to mention a big leap forward for the development of AI. Tasq.ai’s success means expanding access to the ability to quickly build great AI and more effective applications that will be a boon to businesses and users alike,” he added. According to a PwC study, AI is expected to contribute $15.7 trillion to the global economy by 2030. Leading this growth are China and North America, which will drive the greatest economic gains at $10.7 trillion. VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings. 
"
3,698
2,021
"Synthetic DNA startup Catalog raises $35M to speed up computation | VentureBeat"
"https://venturebeat.com/data-infrastructure/synthetic-dna-startup-catalog-raises-35m-to-speed-up-computation"
"Synthetic DNA startup Catalog raises $35M to speed up computation Catalog , a Boston, Massachusetts-based company developing what it claims is the world’s first DNA computer, today announced that it raised $35 million in a series B funding round led by Hanwha Impact Partners. The company says that it plans to use the capital to launch its chemical-based computing platform by next year, where data management and computation occur through the manipulation of synthetic DNA. As conversations continue on the electricity traditional computing requires to process large volumes of data, interest in chemical-based DNA computing systems is gaining momentum. 
While DNA computers tend to have slow processing speeds and produce answers that can be difficult to analyze, they can perform a huge number of parallel computations owing to the millions to billions of molecules within that interact with each other simultaneously. “DNA-based computing means everything happens at a chemical level, where traditional boundaries between storage, memory, and computation are blurred and often nonexistent. This allows for levels of parallelism previously unimaginable while spending far less energy than previously possible,” Catalog cofounder and CEO Hyunjun Park told VentureBeat via email. Base technology DNA computing is an emerging branch of computation that leverages DNA, biochemistry, and molecular hardware instead of traditional electronics. Leonard Adleman at the University of Southern California kickstarted the field in 1994, when he demonstrated a proof-of-concept use of DNA as a form of computing hardware. In 1995, the idea for DNA-based memory was originated by Eric Baum, who hypothesized that a vast amount of data could be stored in a tiny amount of DNA due to its ultra-high density. Traditional computers use a series of logic gates — made up of transistors, usually — that transform different data inputs into outputs. But with DNA, molecules can be triggered to bind with each other to create a circuit of logic gates in test tubes. Catalog was founded in 2016 by two MIT scientists with the goal of developing a commercially viable product for DNA-based storage, which Park claims is up to 1 million times more dense than today’s current storage technologies. In working over the past year with companies in IT, media and entertainment, and energy, Catalog’s research scientists developed an automated system for DNA storage, which they assert is the first not confined to wet lab chemistry. 
A hard drive’s typical storage ratio is about 30 million gigabytes per cubic meter. Catalog’s method can store 600 billion gigabytes in the same volume. The company developed an encoding scheme and combinatorial approach that allows for what it claims are “dramatic” cost reductions and throughput, making DNA-based storage and computation economically viable. Potential The potential applications of DNA computers span security, cryptography, memories, disks, and robotics. Researchers have also shown in early-stage work how DNA computers might be used for extremely accurate detection of certain cancers. “Catalog has discovered broad applicability of our platform across industry sectors, as well as nearly universal demand for what DNA-based computing promises among heavy data users,” Park added. “Early [use cases] that we are able to speak about at the moment include digital signal processing, such as seismic processing in the energy sector, and database comparisons such as fraud protection and identity management in the financial industry.” Gartner predicted in a recent report that by 2024, 30% of digital businesses will mandate DNA storage trials, addressing the exponential growth of data poised to overwhelm existing storage technology. “Technologies are being stretched to their limits,” Gartner VP analyst Daryl Plummer, the lead author of the research, said during a recent presentation at the virtual Gartner IT Symposium/Xpo 2020. “Nontraditional approaches will enable the next rebound of innovation and efficiency … All of human knowledge could be stored in a small amount of synthetic DNA.” Of course, Catalog isn’t the only team investigating DNA computer technologies. A partnership between IBM and Caltech was established in 2009 aimed at putting “DNA chips” into production, and a California Institute of Technology group is working on the manufacturing of nucleic-acid-based integrated circuits that can compute whole square roots. 
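The two density figures above can be checked directly: 600 billion GB per cubic meter versus roughly 30 million GB per cubic meter is a factor of 20,000, comfortably inside the "up to 1 million times more dense" ceiling Park cites for DNA versus current storage. A quick arithmetic check, using only the article's own numbers:

```python
hdd_gb_per_m3 = 30e6   # ~30 million GB per cubic meter (article's hard-drive figure)
dna_gb_per_m3 = 600e9  # 600 billion GB per cubic meter (article's Catalog figure)

density_gain = dna_gb_per_m3 / hdd_gb_per_m3
print(f"DNA storage is {density_gain:,.0f}x denser than hard drives")
```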
Beyond this, last year, Microsoft, Twist Bioscience, Illumina, and Western Digital formed the DNA Storage Alliance , of which Catalog is also a part. But Park believes that Catalog is well-positioned to compete in the nascent market. To date, the 20-person company has raised more than $45 million in venture capital from backers including Horizon Ventures and Airbus Ventures. "
3,699
2,021
"Streaming database platform provider Materialize lands $60M | VentureBeat"
"https://venturebeat.com/data-infrastructure/streaming-database-platform-provider-materialize-lands-60m"
"Streaming database platform provider Materialize lands $60M Materialize, a company developing a streaming structured query language (SQL) database platform, today announced that it raised $60 million in series C funding, bringing the company’s total raised to more than $100 million. Redpoint Ventures contributed the capital with participation from Kleiner Perkins, Lightspeed Venture Partners, and others, and cofounder and CEO Arjun Narayan says that it’ll be used to grow Materialize’s engineering team and bring its cloud service from beta to general availability. Real-time data analytics can benefit companies across finance, retail, ecommerce, and other industries. For example, banks can identify fraudulent transactions while minimizing false positives, and ecommerce sites can provide better personalization via recommendations. 
But real-time data analytics often requires compromises between cost, speed, and features. For example, it’s difficult to achieve millisecond response times for queries without building custom microservices. Founded in 2019, Materialize — whose team includes early employees of Dropbox, Stripe, and YouTube — offers a standard SQL interface for streaming data so that companies can build queries without the need for engineers with specialized skills. The platform computes and incrementally maintains data as it’s generated, so that query results are accessible the moment that they’re needed. “Frank McSherry and I founded Materialize in February 2019 after realizing the implications of his timely dataflow and differential dataflow research in providing ‘true’ real-time data streaming. We’ve been studying this problem for decades, and Frank in particular spent years doing the hard science that allows developers to write complex queries for streaming data using standard SQL,” Narayan told VentureBeat via email. “We have the mentality that all businesses should have access to the power of accurate streaming data without tradeoffs. Although other data streaming solutions have been around for years, each one of them requires some sort of compromise.” Materialized views McSherry and Narayan named Materialize after the database concept of “materialized views.” In databases, materializing views refers to the act of precomputing the results for a query so that they’re instantly available when needed — rather than doing the work on demand and waiting for the computation to finish. “Materialized views that are always fresh have long been prohibitively expensive in traditional database systems, and Materialize makes them cheap and always-ready on all of a company’s streaming data,” Narayan said. 
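The idea of a materialized view that stays fresh incrementally — updating the stored result as each record arrives, instead of recomputing the full query on demand — can be sketched in a few lines. This toy counter-by-key view is only an illustration of the concept, not Materialize's differential-dataflow engine; the class name and event stream are invented for the example.

```python
from collections import defaultdict

class MaterializedCount:
    """Maintain the result of `SELECT key, count(*) ... GROUP BY key` incrementally.

    Each incoming event touches only the affected row, so reads are
    always instant: no on-demand recomputation over the full stream.
    """
    def __init__(self) -> None:
        self.view: defaultdict[str, int] = defaultdict(int)

    def apply(self, key: str, delta: int = 1) -> None:
        self.view[key] += delta   # O(1) incremental update per event
        if self.view[key] == 0:
            del self.view[key]    # a retraction can remove the row entirely

view = MaterializedCount()
for event in ["buy", "buy", "refund", "buy"]:
    view.apply(event)
view.apply("refund", -1)  # retraction, as in a changelog-style stream
```

The contrast with a traditional database is that there the equivalent `GROUP BY` would be re-run (or the view fully rebuilt) at read time, which is the expense Narayan's quote refers to.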
“We’ve seen our early customers use Materialize for real-time data visualization, financial modeling, and to advance various software-as-a-service applications in marketing tech, logistics, and enterprise resource planning.” While Materialize isn’t an engine for machine learning or AI itself, Narayan notes that it can play a role in the data pipelines that feed into machine learning models. Some companies, including Datalot, have investigated using Materialize as a “streaming feature store,” a class of tool used to store commonly used features in models. Above: Materialize can be used to feed AI and machine learning pipelines, as shown in this schematic. “Current solutions offer a linear tradeoff between speed and cost. If you want to move more quickly, you simply have to pay for it,” Narayan said. “We look to break this pattern by offering extremely low latency computation, but on a much more efficient scale through standard SQL.” Materialize says that in six months, it’s grown its developer community to over 970 people and attracted brands including Density and Kepler Cheuvreux. This month, the startup, which has close to 60 employees, plans to open its headquarters in Slack’s previous New York City office. "
3,700
2,021
"Speedata's chip for analytics workloads gets a $55M boost | VentureBeat"
"https://venturebeat.com/data-infrastructure/speedatas-chip-for-analytics-workloads-gets-a-55m-boost"
"Speedata’s chip for analytics workloads gets a $55M boost Speedata , a semiconductor infrastructure company developing hardware accelerators for analytics and databases, today announced that it raised $55 million in a series A round led by Walden Catalyst Ventures, 83North, and Koch Disruptive Technologies, with participation from Pitango First, Viola Ventures, and individual investors. The company says that the new money will be used to fund Speedata’s go-to-market strategy for its analytics and databases unit (APU), a processor designed to speed up data-heavy workloads. It’s projected that the amount of data created in the next three years will exceed the amount created in the past 30 years, while revenue from analytics will grow to nearly $70 billion by 2025. Database analytics can be an asset for enterprises, cloud providers, private datacenters, and others across industries. 
But most cloud providers use processors or accelerators based on field-programmable gate arrays (FPGAs) for analytics applications, which can run up against computation limitations. Jonathan Friedmann, Dan Charash, Rafi Shalom, and Itai Incze partnered with academic researchers Yoav Etsion and Dani Voitsechov of Technion to cofound Speedata, which aims to build hardware that accelerates business intelligence and data extraction, transformation, and load applications. The company claims a server rack with its APU can replace multiple racks of processors, reducing both costs and energy consumption. As of 2019, the adoption rate of big data analytics stood at 52.5% among organizations, with a further 38% saying that they intend to use the technology in the future. A 2019 survey by Entrepreneur.com found that enterprises implementing big data analytics have seen a profit increase of 8% to 10%. Assuming that the current trend holds, global big data and business analytics revenue will grow to $215.7 billion in 2021, according to Statista. “The company’s focus is on analytics and databases, which represents an enormous workload at enterprises. It is important to note that Speedata is unique in that it focuses on analytics and databases, while the vast majority of other processor startups focus on AI and machine learning. These are very different workloads and they require very different processors,” Friedmann told VentureBeat via email. “[Our] main competitors are the major chip manufacturers, including CPU vendors and other ARM-based CPU manufacturers. 
Other companies are also eyeing the database and analytics market with other types of chips such as GPUs, but CPUs are still utilized in over 99% of the database market to date.” APU Friedmann says that Speedata’s work builds on six years of research at the Israel Institute of Technology, focusing on creating a new type of processor architecture. Unlike current CPUs, GPUs, and ARM-based chips, the APU is “non-von Neumann,” which means it lacks a central processor with an arithmetic unit, a control unit, memory, mass storage, and input and output. “Speedata’s scalable architecture addresses the main bottlenecks of analytics, including I/O, compute, and memory, effectively accelerating all three. This unique architecture is compatible with all legacy software, allowing for seamless migration of workloads, with no changes necessary to an enterprise’s code or existing framework,” a spokesperson for Speedata said. Speedata isn’t the first to market with a processor designed for big data and analytics. Recently, Nvidia released the BlueField-3 data processing unit (DPU) , which packs software-defined networking, storage, and cybersecurity acceleration capabilities. Oracle’s SPARC M7 chip has a data analytics accelerator coprocessor with a specialized set of instructions for data transformation. And startup Blueshift Memory claims to have developed a chip that can perform analytics workloads up to 1,000 times faster than standard processors. But Friedmann claims that Speedata’s platform is unique in that it “includes strong networking components” as well as the ability to integrate with existing software in datacenters. While the company is pre-revenue, he says it’s reached the “advanced development stage” and currently has 40 employees, with a planned expansion in headcount over the coming months. “Speedata has consolidated multiple technology streams. 
First and foremost, the introduction of AI into the datacenter industry catalyzed a fundamental change in the infrastructure of the field. We at Speedata understood that datacenters are undergoing a revolution and that the future of datacenter compute will be dominated by dedicated accelerators,” Friedmann said. “The pandemic has accelerated the digital transformation of almost all industries, which has fueled an explosion of big data and analytics and databases workloads, dramatically increasing the need for Speedata’s solution.” The latest funding round brings Speedata’s total amount raised to $70 million, including a previously undisclosed $15 million seed round led by Viola and Pitango. "
3,701
2,021
"Manage your Apple devices from your Windows computer with this $9 device management solution | VentureBeat"
"https://venturebeat.com/commerce/manage-your-apple-devices-from-your-windows-computer-with-this-9-device-management-solution"
"Sponsored Deals Manage your Apple devices from your Windows computer with this $9 device management solution It’s a phenomenon that Apple loyalists are far too familiar with: every time the company introduces a new slate of products, your current device mysteriously malfunctions. It’s not a conspiracy—not really. Planned obsolescence is a real thing, and it’s not only Apple that’s guilty of such. But the Cupertino tech giant does it in a more deliberate manner, resulting in lawsuits that fine them for slowing down their own devices like iPhones. With the company having just introduced new iPhones , users are already complaining about how their old phones are suddenly going kaput. If you happen to be one of them and are looking to upgrade, you have to prepare for the painstaking process of moving data from one device to another. You won’t encounter that problem with iTools Premium. 
It’s a tool that lets you transfer music and export photos in two ways between your Apple device and computer without any risk of losing data when syncing your iPhone from the computer. It allows for seamless transfer of files, as well as backing up and restoring of data. You can get it on sale for an extra 40% off with the code VIP40. Another underrated feature of iTools is location spoofing, where you trick your device into telling apps that you’re located somewhere you’re not. It’s especially useful in cases like gaming, where GPS is an important factor. With iTools, you can simulate your preferred location to your mobile devices through your desktop with no trouble. Data extraction and migration shouldn’t be hard. With iTools, you can manage your iPhone and iPads from a Windows computer sans the hassle. A lifetime license usually costs $30, but if you key in the code VIP40 at checkout, you can enjoy a massive discount and get it for only $8.99. VentureBeat Deals is a partnership between VentureBeat and StackCommerce. This post does not constitute editorial endorsement. If you have any questions about the products you see here or previous purchases, please contact StackCommerce support here. Prices subject to change. "
3702
2020
"VCs invested over $75B in AI startups in 2020 | VentureBeat"
"https://venturebeat.com/business/vcs-invested-over-75b-in-ai-startups-in-2020"
"VCs invested over $75B in AI startups in 2020
Investments in AI are growing at an accelerated pace, according to a new report from the Organization for Economic Cooperation and Development (OECD). The Paris, France-based group found that the U.S. and China lead the growing wave of funding, taking in a combined 81% of the total amount invested in AI startups last year, while the European Union and U.K. boosted their backing but lag substantially behind. “The venture capitalist (VC) sector tends to forerun general investment trends, indicating the AI industry is maturing. As the AI industry matures, the median amount per investment is growing, there are more very large investments and proportionately fewer investment deals at early stages of financing,” the report reads.
OECD’s study analyzed VC rounds in 8,300 AI companies worldwide, covering transactions between 2012 and 2020 that were documented by capital market analysis firm Preqin. According to the findings, the global annual value of VC investments in AI startups grew from $3 billion in 2012 to nearly $75 billion in 2020. Funding increased 20% last year alone, with startups based in the U.S. and China nabbing over 80% of all investments in 2020. The European Union followed with 4%, trailed by the U.K. and Israel at 3%. The report also found that growth in AI investment in U.S.-based firms has been steady since 2012, reaching $42 billion in 2020. Chinese companies experienced a spike in 2017 and 2018, followed by a slump in 2019, and represented $17 billion in 2020. Companies developing driverless vehicles and mobility technologies attracted the most investment of all AI companies, drawing $19 billion in VC money during 2020 and $95 billion from 2012 to 2020. The second-biggest segment was health care, drugs, and biotechnology, which raked in 16% of the 2020 investment total. VC funding rounds in these related industries doubled from $6 billion in 2019 to $12 billion in 2020 — most likely as a result of the pandemic. AI business processes and support service startups ranked third in VC investments in 2020, meanwhile — also likely due to the pandemic, which motivated digital transformations and remote and hybrid work arrangements.
Potential and risks
The outsized investment in autonomous vehicles reflects the belief among investors that AI has the potential to address worker shortfalls in transportation. According to the American Trucking Associations (ATA), the sector was short 60,800 drivers in 2018. If the shortage is left unchecked, ATA expects it to swell to more than 160,000 drivers nationally by 2028. In a worrisome sign, the U.K.
was forced to recruit the army to drive fuel trucks to gas stations, owing to a shortage of available drivers. Momentum in the life sciences field is less steady, with Deloitte reporting that health care organizations vary significantly in their AI investments. But the enterprise has embraced AI with open arms, leveraging it to automate costly back-office and customer-facing tasks. Over a quarter of all AI business initiatives are already in production and more than a third are in the advanced development stages, an IDC survey found. And just over half of businesses said they would spend $500,000 to $5 million on AI initiatives in 2021, up from 34% in 2020, according to Appen. But these sectors face challenges as AI systems come under greater scrutiny. While 22.7% of employees feel AI will start to have a large impact on their industry within the next 1 to 2 years, 54% are either moderately or very concerned that AI will negatively disrupt their job, according to a 2021 Reign survey. AI isn’t a silver bullet, moreover — as research reveals. In a recent report, only 10% of company managers reported significant financial benefits from their AI investments. And an MIT taskforce predicts technologies like fully autonomous cars won’t arrive for at least 10 years.
Runway
As an expanding cohort looks to cash in on the continued AI investment boom, OECD’s report presents evidence that there’s plenty of runway. That’s despite the fact that some startups are duplicitous about their use of AI technologies. In a 2019 study by MMC Ventures, 40% of the 2,830 purported AI startups surveyed in Europe were found not to use any AI in their products. A Forbes piece notes that over the past decade, total funding and the average round size for AI companies have risen at a reliable pace. In 2010, the average early-stage round for AI or machine learning startups was about $4.8 million.
By 2017, the average first-round early-stage deal had grown to $11.7 million, more than double the 2010 figure. And in Q2 2021, AI startups attracted a record sum of more than $20 billion in funding, despite a drop in deal volume. "
3703
2021
"Software security groups increased use of open source tech by 61% over 2 years | VentureBeat"
"https://venturebeat.com/business/software-security-groups-increased-open-source-tech-by-61-over-2-years"
"Software security groups increased use of open source tech by 61% over 2 years
BSIMM12 data indicates a 61% increase in software security groups’ identification and management of open source over the past two years, almost certainly due to the prevalence of open source components in modern software and the rise of attacks using popular open source projects as vectors. The growth in activities related to cloud platforms and container technologies shows the dramatic impact these technologies have had on how organizations use and secure software. For example, Building Security In Maturity Model (better known as BSIMM) made only five observations of “use orchestration for containers and virtualized environments” in BSIMM10, while it made 33 observations two years later for BSIMM12 — an increase of 560%.
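The growth figures quoted in the report are simple ratios of observation counts; a quick check in plain Python (the observation counts are from the article, the helper function is mine):

```python
def pct_increase(before: int, after: int) -> float:
    """Percentage increase from one BSIMM observation count to the next."""
    return (after - before) / before * 100

# Container/orchestration activity: 5 observations in BSIMM10, 33 in BSIMM12.
print(pct_increase(5, 33))           # 560.0

# "Operations bill of materials" activity: 3 observations rising to 14.
print(round(pct_increase(3, 14)))    # 367
```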
Another emerging trend observed in the BSIMM12 research is that businesses are learning how to translate risk into numbers. Organizations are exerting more effort to collect and publish their software security initiative data, demonstrated by a 30% increase in the “publish data about software security internally” activity over the past 24 months. BSIMM12 data also shows an increase in capabilities focused on inventorying software; creating a software bill of materials (BOM); understanding how the software was built, configured, and deployed; and the organization’s ability to redeploy based on security telemetry. Demonstrating that many organizations have taken to heart the need for a comprehensive, up-to-date software BOM, the BSIMM activity related to those capabilities — “enhance application inventory with operations bill of materials” — increased from 3 to 14 observations over the past two years, a 367% increase. The move from maintaining traditional operational inventories toward automated asset discovery and creating bills of materials includes adding “shift everywhere” activities such as using containers to enforce security controls, orchestration, and scanning infrastructure as code. BSIMM has grown from nine participating companies in 2008 to 128 in 2021, and now counts nearly 3,000 software security group members and over 6,000 satellite members (aka “security champions”). This 2021 edition of the BSIMM report — BSIMM12 — examines anonymized data from the software security activities of 128 organizations across various verticals, including financial services, FinTech, independent software vendors, IoT, healthcare, and technology organizations. Read the full report by BSIMM.
"
3704
2021
"Salesforce and Atlassian double down on developer security with $75M Snyk investment | VentureBeat"
"https://venturebeat.com/business/salesforce-and-atlassian-double-down-on-developer-security-with-75m-snyk-investment"
"Salesforce and Atlassian double down on developer security with $75M Snyk investment
Snyk, the company behind an open source security scanning platform, has extended its series F round of funding by another $75 million. The Boston-headquartered company announced a $530 million investment just a few weeks back at a whopping $8.5 billion valuation. The transaction included both primary and secondary investments, meaning that Snyk had in fact only raised around $300 million in fresh capital. For the extension, which closes the series F round off at $605 million, Snyk has attracted return investments from the venture capital arms of Atlassian and Salesforce, which are now responsible for 10% of Snyk’s $850 million total raised since its inception.
And for the record, Snyk is now valued at $8.6 billion. By way of a brief recap, Snyk’s SaaS platform helps developers find and fix vulnerabilities — as well as surface license violations — in their open source codebases, containers, and Kubernetes applications. Founded initially out of London and Tel Aviv in 2015, Snyk has amassed an impressive roster of customers in its six-year history, including Google, Salesforce, Intuit, and Atlassian.
Strategic
Atlassian’s follow-on investment in Snyk is particularly notable, as it comes shortly after Snyk announced a slew of integrations with Bitbucket Cloud and Atlassian Open DevOps, suggesting that Atlassian’s continued backing is as much a strategic move as anything else. “Snyk is reinventing the way organizations think about security,” Atlassian’s head of corporate development Chris Hecht said in a statement. “They are a vital part of our ecosystem, tightly integrated into our core products.” "
3705
2021
"Intel unveils second-generation neuromorphic computing chip | VentureBeat"
"https://venturebeat.com/business/intel-unveils-second-generation-neuromorphic-computing-chip"
"Intel unveils second-generation neuromorphic computing chip
Intel today announced a major update to its neuromorphic computing program, including a second-generation chip called Loihi 2 and Lava, an open source framework for developing “neuro-inspired” applications. The company is now offering two Loihi 2-based neuromorphic systems — Oheo Gulch and Kapoho Point. They will be available through a cloud service to members of the Intel Neuromorphic Research Community (INRC) and Lava via GitHub for free. Along with Intel, researchers at IBM, HP, MIT, Purdue, and Stanford hope to leverage neuromorphic computing — circuits that mimic the human nervous system’s biology — to develop supercomputers 1,000 times more powerful than any today.
Custom-designed neuromorphic chips excel at constraint satisfaction problems, which require evaluating a large number of potential solutions to identify the one or few that satisfy specific constraints. They’ve also been shown to rapidly identify the shortest paths in graphs and perform approximate image searches, as well as mathematically optimizing specific objectives over time in real-world optimization problems. Intel recently demonstrated that the chips can be used to “teach” an AI model to distinguish between 10 different scents, control a robotic assistive arm for wheelchairs, and power touch-sensing robotic “skin.”
Loihi 2 and Lava
Intel says Loihi 2 incorporates learnings from three years of use with the first-generation Loihi chip, building on progress in Intel’s process technology and asynchronous design methods. Packing up to 1 million “neurons” per chip, Loihi 2, which is fabricated with a pre-production version of the Intel 4 process, is up to 10 times faster than Loihi, with 15 times greater resource density and better energy efficiency. Early tests show a 60 times reduction in operations per inference on Loihi 2 compared to standard AI models running on Loihi — without a loss in accuracy — according to Intel neuromorphic computing director Mike Davies. “Loihi 2 … harvest[s] insights from several years of collaborative research using Loihi,” Davies added in a statement. “Our second-generation chip greatly improves the speed, programmability, and capacity of neuromorphic processing, broadening its usages in power- and latency-constrained intelligent computing applications.” Loihi 2 offers greater programmability, canvassing a wider class of algorithmic problems, including real-time optimization and planning.
The chip also improves compatibility with backpropagation and other foundational AI techniques, expanding the scope of algorithms supported by its low-power form factor. Fully programmable “neuron models” and generalized “spike messaging” in Loihi 2 open the door to newly trainable machine learning models, while Ethernet interfaces, integrations with event-based vision sensors, and larger meshed networks of Loihi 2 chips enable deployments on robots, as well as conventional motherboards. As for Lava, Intel says it addresses the need for a common software framework in the neuromorphic research community. Lava allows researchers and application developers to converge on a common set of tools, methods, and libraries, with software that runs on both conventional and neuromorphic processors and interoperates with existing AI, neuromorphic, and robotics tools. Using Lava, developers can build neuromorphic applications without access to specialized hardware and contribute to the Lava code base, for example, porting it to run on other platforms.
Applications
INRC, an ecosystem of over 150 academic groups, government labs, research institutions, and companies, was founded in 2018 to further neuromorphic computing. It claims to have achieved breakthroughs in applying neuromorphic hardware to an array of applications, from voice recognition to autonomous drone navigation. Some members of INRC see business use cases for chips like Loihi. For example, Lenovo, Logitech, Mercedes-Benz, and Prophesee hope to apply this technology to enable things like more efficient and adaptive robotics and rapid search of databases for similar content. Last year, Accenture tested the ability to recognize voice commands on Loihi versus a standard graphics card and found the chip was up to 1,000 times more energy-efficient and responded up to 200 milliseconds faster.
Mercedes-Benz is exploring how Accenture’s results could apply to real-world scenarios, such as adding new voice commands to in-vehicle infotainment systems. Meanwhile, other Intel partners are investigating how Loihi could be used in products like interactive smart homes and touchless displays. In October 2020, Intel inked a three-year agreement with Sandia National Laboratories to explore the value of neuromorphic computing for scaled-up AI problems as part of the U.S. Department of Energy’s (DOE) Advanced Scientific Computing Research program. In related news, the company recently entered into an agreement with Argonne National Laboratory to develop and design microelectronics technologies such as exascale, neuromorphic, and quantum computing. "
3706
2021
"Identity decision platform Alloy adds $100M to its funding pool | VentureBeat"
"https://venturebeat.com/business/identity-decision-platform-alloy-adds-100m-to-its-funding-pool"
"Identity decision platform Alloy adds $100M to its funding pool
Identity-decisioning platform Alloy announced today that it has received $100 million in funding, boosting its valuation to unicorn status at $1.35 billion. Lightspeed Venture Partners’ Justin Overdorff led the round with participation from existing investors Canapi Ventures, Bessemer Venture Partners, Avid Ventures and Felicis Ventures, bringing the total amount raised to over $150 million. Alloy plans to use the new capital to expand its offerings, which help fintechs and banks safely onboard customers and make subsequent identity-related decisions about them. Decision-making about applicants for accounts or loans can be difficult simply because of the sheer number of data points financial institutions have to consider. Alloy simplifies the process.
It gives customers a single API that connects to 120 data sources related to decisioning, such as address and banking authentication, credit, email and phone data, and fraud records. Customers can use these sources as data points to create tailored workflows in a low-code, no-code way. “We automate the process of figuring out which data sources and what rules you apply works best for your use case, your population, and your risk tolerance,” said Laura Spiekerman, co-founder and chief revenue officer at Alloy. Because of Alloy’s extensive experience with the ways financial fraud manifests, it can also advise clients on customizing workflows depending on their ongoing risk tolerance.
Decreasing bias in decision-making
Financial services might be ripe for bias, but Alloy, which is not an AI-driven platform, works on leveling the playing field for applications by expanding the number of data points for application evaluation, Spiekerman said. Most traditional decision-making conducted at banks is linear and not holistic, Spiekerman pointed out. Such a straitjacketed approach automatically eliminates otherwise legitimate applications. Consider, for example, a requirement that applicants have a credit profile with one of the three U.S. credit bureaus (Experian, Equifax or TransUnion) before they can be given a loan. Such a process can be limiting for new immigrants, who might not have an established credit history in the United States. “Instead we can look at cash flow data, transaction history, which opens the doors to a broader population,” Spiekerman said. Similarly, public database records are skewed toward wealthier, more established demographics, so alternative data like utility bill records can level the playing field better, Spiekerman said. “We believe that at the fundamental level, bringing in more data can really help in eliminating bias,” she added.
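To make the idea of a tailored, multi-source decisioning workflow concrete, here is a minimal sketch in Python. Everything in it — the rule names, applicant fields, and approval threshold — is invented for illustration; it is not Alloy’s API:

```python
from typing import Callable

# A rule is any check over an applicant's data points.
Rule = Callable[[dict], bool]

def decide(applicant: dict, rules: list[Rule], min_passing: int) -> str:
    """Approve when enough independent data-point checks pass."""
    passed = sum(rule(applicant) for rule in rules)
    return "approve" if passed >= min_passing else "manual_review"

# Checks drawing on different data sources widen the evaluated population:
rules = [
    lambda a: a.get("credit_file_found", False),      # bureau data
    lambda a: a.get("cash_flow_ok", False),           # transaction history
    lambda a: a.get("utility_bills_on_time", False),  # alternative data
]

# An applicant with no U.S. credit history can still pass on other signals.
new_immigrant = {"credit_file_found": False, "cash_flow_ok": True,
                 "utility_bills_on_time": True}
print(decide(new_immigrant, rules, min_passing=2))  # approve
```

A real platform would draw these signals from live data providers and let risk teams tune the rules and thresholds per use case, population, and risk tolerance, as described above.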
An ongoing customer profile
While Alloy started out with automating identity and fraud compliance during the early onboarding process, it has since moved to helping customers with ongoing decisioning processes, Spiekerman said. Decision-making in financial institutions is not limited to early-stage onboarding. “There are tons of decisions to be made post-onboarding: should we let a customer send $5000 through an online transaction, change an email address, etc., which requires additional information,” Spiekerman said. It is why Alloy creates an evolving customer profile that moves beyond onboarding checks. “You need the original information about the user and you need their transaction history, what you have learned along the way and network data,” Spiekerman said. Alloy is a platform that “allows you to see all the behaviors, all the identity information you can collect about your users, which gets richer over time,” Spiekerman said. Such holistic information, a mix of static and transactional data, leads to better decisions and outcomes for applicants. Alloy’s biggest competition is the decision to build in-house versus buy. Showing value by helping companies get to market faster works to the company’s advantage. Alloy counts both small and large fintechs and banks in its roster, including Ally Bank, Gemini and Ramp. In the last year, the company has seen annual recurring revenue (ARR) more than triple and headcount increase by 140%. Alloy currently services over 200 clients.
"
3707
2021
"GitHub brings centralized, granular controls to enterprise user accounts | VentureBeat"
"https://venturebeat.com/business/github-brings-centralized-granular-controls-to-enterprise-user-accounts"
"GitHub brings centralized, granular controls to enterprise user accounts
GitHub has formally launched Enterprise Managed Users (EMUs), a new type of user account for GitHub Enterprise Cloud (GHEC) customers that can be provisioned and managed centrally via the company’s identity provider (IdP). This represents part of GitHub’s broader efforts to transition software development away from local environments and into the cloud, the most obvious other example being its browser-based Codespaces platform, which it recently launched for enterprises.
GitHub’s EMUs, which were first announced in private beta last year, give admins granular control over GitHub accounts across the company by tying GitHub Enterprise Cloud to their IdP of choice, such as Google, Microsoft, or Okta. This means admins can create (or suspend) user accounts for their employees via a linked IdP and manage user profile data (e.g. username, display name, or email address) and GitHub teams membership. Previously, GitHub Enterprise Cloud customers were able to invite developers into company groups using their own existing individual accounts, but with EMUs, companies can now create user accounts for their employees centrally, specifically for use at work — these accounts can only be used on repositories that belong to the company. This is particularly notable from a security perspective, as repositories associated with EMU accounts are automatically blocked from making private code publicly visible, which goes some way toward averting human error. And because accounts support single sign-on (SSO) and are synchronized with a company’s corporate ID, it’s always clear who is collaborating — this is perhaps more important in the current global landscape, given that millions of employees are working remotely and may never actually meet their collaborators face to face. For today’s launch, GitHub’s EMUs only support identity services from Microsoft’s AzureAD and Okta, though plans are afoot to extend coverage to additional IdPs in the future.
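Provisioning accounts from an IdP in the way described here is conventionally done over SCIM. As a rough illustration, the snippet below builds a minimal SCIM 2.0 user payload following the generic SCIM core schema (RFC 7643); the field choices and example values are assumptions for the sketch, not details taken from GitHub’s documentation:

```python
import json

SCIM_USER_SCHEMA = "urn:ietf:params:scim:schemas:core:2.0:User"

def build_scim_user(username: str, display_name: str, email: str) -> dict:
    """Assemble a minimal SCIM 2.0 user-provisioning payload."""
    return {
        "schemas": [SCIM_USER_SCHEMA],
        "userName": username,
        "displayName": display_name,
        "emails": [{"value": email, "primary": True}],
        "active": True,  # flipping this to False would suspend the account
    }

payload = build_scim_user("mona_acme", "Mona Lisa", "mona@example.com")
print(json.dumps(payload, indent=2))
```

An IdP would send a document like this to the enterprise’s provisioning endpoint, which is what lets admins create, update, or suspend managed accounts centrally rather than per-repository.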
"
3,708
2,021
"FEELM Reinforces its Commitment to Innovations in Atomization Technology at 2021 UK Media Day | VentureBeat"
"https://venturebeat.com/business/feelm-reinforces-its-commitment-to-innovations-in-atomization-technology-at-2021-uk-media-day"
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Press Release FEELM Reinforces its Commitment to Innovations in Atomization Technology at 2021 UK Media Day Share on Facebook Share on X Share on LinkedIn SHENZHEN, China–(BUSINESS WIRE)–September 30, 2021– SMOORE, a global leader in offering vaping technology solutions, today announced that its flagship atomization tech brand, FEELM, hosts 2021 UK Media Day at Silverstone Circuit. At the Media Day, Chief Scientist of SMOORE, Dr. Shi Zhiqiang, has reinforced FEELM’s commitment to innovations in atomization technology and affirmed that atomization coil is the key to the vape product performance and vaping experience. This press release features multimedia. View the full release here: https://www.businesswire.com/news/home/20210930005566/en/ FEELM 2021 UK Media Day (Photo: Business Wire) As the flagship tech brand belonging to SMOORE, FEELM specializes in the development and manufacturing of high-quality closed pod vapes driven by its innovative FEELM coil. 
FEELM coil technology features instant, highly efficient atomization; a smooth, pure taste until the last puff; smooth and delicate vapor; authentic flavor; and high nicotine delivery efficiency. In 2019, FEELM launched the “FEELM Inside” global technology authentication trademark as a symbol of an excellent vaping experience for users. Today, the “FEELM inside” symbol is on the closed system pods of market-leading vaping brands around the world, such as RELX, NJOY, HAKA, HEXA, and VAPO. “With the global launch of FEELM in 2016, FEELM has had a significant impact on the research and manufacturing of closed vaping products, changing the global vaping industry landscape. The ‘FEELM inside’ symbol today is widely recognized by our partners and consumers as a certification of our commitment to quality, authenticity and high-quality vaping experience,” said Dr. Zhiqiang Shi, Chief Scientist of SMOORE. “Our Group has also made satisfying inroads in the research and development of medical and healthcare vaping products, with more than 1,000 scientists and experts from different backgrounds.” Moreover, in 2019 FEELM adopted the industry’s first fully automated closed pod production line, realizing zero-staff inspection of product quality. To date, the productivity of a single line has reached a record high of 7,200 standard vaporizers per hour, and the annual production capacity of FEELM has surpassed three billion pieces. Committed to SMOORE’s mission of “atomization making life better,” FEELM launched the world’s first taste evaluation model in December 2020. It covers four dimensions and 51 indexes, establishing a scientific system for evaluating vaping tastes. Moving forward, FEELM will also develop a Global Vape Flavor Map, by virtue of its global consumer base and extensive research on consumer behaviors. 
As the world’s largest vaping device manufacturer in terms of revenue, SMOORE accounted for 18.9% of the total global market share in 2020, according to Frost & Sullivan. About FEELM: FEELM is a flagship atomization technology brand belonging to SMOORE. Focusing on research into cutting-edge atomization technology, FEELM also specializes in the development and manufacturing of high-quality atomization devices driven by the FEELM coil. The “FEELM inside” symbol is on the closed system pods of a number of the world’s leading tobacco and vaping companies. About SMOORE: SMOORE is a global leader in offering vaping technology solutions, including manufacturing vaping devices and vaping components, with advanced R&D technology, strong manufacturing capacity, a wide-spectrum product portfolio and a diverse customer base. The Company is the world’s largest vaping device manufacturer in terms of revenue, accounting for 18.9% of the total global market share. View source version on businesswire.com: https://www.businesswire.com/news/home/20210930005566/en/ Frankie Chen [email protected] (+86) 13530848319 "
3,709
2,021
"Facing the new cybersecurity challenges of hybrid work (VB Live) | VentureBeat"
"https://venturebeat.com/business/facing-the-new-cybersecurity-challenges-of-hybrid-work-vb"
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages VB Live Facing the new cybersecurity challenges of hybrid work (VB Live) Share on Facebook Share on X Share on LinkedIn Person working from home Presented by Field Effect Increasingly, employees are continuing to work remotely — and increasing your company’s security risks. Protecting your networks, endpoints, and more is a whole new game now. Join this VB Live event to learn about the new security risks and how to meet those challenges. Register here for free. The hybrid model is taking over workplaces. Employees are increasingly being encouraged to take advantage of cloud-based technologies to work from anywhere, while coming into office whenever necessary. Initially driven by the pandemic, this mixing of remote and office work has proven to increase employee productivity and wellbeing, and many companies are going all-in. In fact, a CNBC survey of top executives in human resources, finance, and technology found that 45% planned to use a hybrid work model for the second half of 2021. 
With 83% of employers calling the shift to remote work a success for their company, there are clearly a tremendous number of benefits. But there’s also an enormous risk: No matter the size of your business, these work environments are creating brand new cybersecurity challenges because of a rapidly changing threat surface, cybersecurity that hasn’t kept up, and a distracted workforce. The cybersecurity danger In the rush to put remote work infrastructure in place while concerns about the pandemic grew, the security of these measures was often a second-place consideration. Unfortunately, cybercrime has spiked 600% since the start of COVID-19. About 30% of organizations have seen a spike in the number of cyberattack attempts since the start of the pandemic. And last year, 61% of malware sent to companies targeted remote workers through their cloud applications. That number is expected to grow. With significantly less oversight and control of how and where employees are connecting, 54% of IT workers are concerned about the harm from future cyberattacks. Employees who are working entirely remotely can have their connections firewalled away from the rest of the company system, keeping the network safe. However, hybrid workers pose a threat every time they walk into the office and reconnect their laptop to the network, along with any malware they’ve inadvertently picked up. In that way, cybercriminals potentially gain access to your users, your network, and your company’s data. Securing these access points, tracking malicious activity, eliminating intruders, and repairing the damage these cybercriminals create is an enormous, ever-moving challenge for IT departments, and puts your hybrid work model at risk. That risk will mount as hackers get more confident, their methods of hacking grow in sophistication, and the number of targets grows, along with the number of vulnerabilities inadvertently offered up for exploitation. 
Protecting your organization For many organizations, it makes business sense to commit to the hybrid workplace for the long term. That also means it makes business sense to evaluate where your current company setup works, and what needs to change, in order to protect your company and your workers no matter where they’re doing their jobs. Here’s a look at some of your biggest priorities when establishing your new work order. It starts with developing a cybersecurity policy and ensuring everyone in the company is fully briefed and on board. So much of cybersecurity depends on users being aware of how and where cybercriminals can attack, and what leaves them, and their devices, open to attack. It’s essential to educate your employees on the importance of staying aware of the danger, and how to mitigate it, including identifying and responding appropriately to social engineering and phishing attempts, choosing strong passwords, physically protecting their devices, and more. And there’s much more. To learn more about the cybersecurity risks that come with hybrid work, and how companies can effectively protect their networks, cloud services, and endpoints from today’s most sophisticated attackers, don’t miss this VB Live event. Don’t miss out! Register here for free. You’ll learn: The biggest cyber risks associated with hybrid environments Emerging threats for start-ups, scale-ups, and mid-sized businesses Steps for creating a secure infrastructure for new work norms How to mitigate risk and maximize defense (even if your IT is outsourced) Speakers: William H. Dutton, Oxford Martin Fellow, Global Cyber Security Capacity Centre (GCSCC), University of Oxford Andrew Milne, Chief Revenue Officer, Field Effect Ernie Sherman, President, Fuelled Networks Seth Colaner, Moderator, VentureBeat 
"
3,710
2,021
"EML Completes the Acquisition of Sentenial and Enters the Open Banking Market In Europe | VentureBeat"
"https://venturebeat.com/business/eml-completes-the-acquisition-of-sentenial-and-enters-the-open-banking-market-in-europe"
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Press Release EML Completes the Acquisition of Sentenial and Enters the Open Banking Market In Europe Share on Facebook Share on X Share on LinkedIn EML heralds its flagship entry into open banking with the innovative Nuapay brand through the strategic acquisition of Sentenial Group BRISBANE, Australia–(BUSINESS WIRE)–September 30, 2021– EML Payments’ (ASX:EML) (S&P/ASX 200) much-anticipated acquisition of Sentenial Limited and its wholly-owned subsidiaries (“Sentenial”) has been approved by the French and U.K. financial regulators; L’Autorité de contrôle prudentiel et de résolution (ACPR) and the Financial Conduct Authority (FCA). EML is acquiring Sentenial, including its open banking product suite, Nuapay, for an upfront enterprise value of €70 million (A$108.6 million), plus an earn-out component of up to €40 million (A$62.1 million). This press release features multimedia. View the full release here: https://www.businesswire.com/news/home/20210929006014/en/ EML seals its entry into open banking with Nuapay. 
(Photo: Business Wire) The acquisition’s finalisation repositions EML as a global leader in open banking and card payments and one of the largest fintech enablers. Sentenial’s open banking and account-2-account (“A2A”) brand Nuapay will complement EML’s digital product suite starting in Europe and the United Kingdom. Together, EML Nuapay will forge a frictionless payments revolution leveraging the open banking movement, empowering a better experience for end-users and better economics for clients. The EML product platform provides tailored payment solutions across a wide range of industry verticals. With the introduction of the Nuapay product suite, clients can take full advantage of the latest in payments innovation. “We’ve been working with the Sentenial team for some months now on a go-to-market strategy encompassing sales, marketing, product and technology, and we’ll showcase all at our upcoming EMLCON4 event in November. Strategically, this is an important acquisition for us, continuing our transition over the last ten years from a gift card company, to a General Purpose Reloadable company, and now extending that again to include digital payments, open banking and account-2-account payments, giving us a unique set of capabilities in the European market. We’re just as excited about welcoming the whole team from Sentenial into the EML family, and once travel opens up, getting over to meet them all in person,” explained Tom Cregan, Managing Director & Group CEO at EML. “Today’s milestone brings considerable investment into our further growth. We’re delighted to bring these advantages to our partners and build upon our long-established reputation as their dependable first choice provider for open banking and account-2-account payments,” said Sean Fitzgerald, Founder & CEO at Sentenial. 
Industry Facts : As the open data wave arrives, as much as $416 billion in revenue will be at stake – Accenture(1) Instant payments will exceed $27.7 trillion in 2026 , from just $4.8 trillion in 2021. Extraordinary growth of over 470% – Juniper Research(2) The number of open banking users worldwide is expected to grow at an average annual rate of nearly 50% between 2020 and 2024, with the European market being the largest. In 2020, Europe counted approximately 12.2 million open banking users. This figure is expected to reach 63.8 million by 2024. As of 2020, 24.7 million individuals worldwide used open banking services, a number that is forecast to reach 132.2 million by 2024 – Statista(3) In the U.K., the number of API calls made by Third Party Providers (TPPs) using Account Servicing Payment Service Providers’ (ASPSP) open banking APIs increased from 1.9 million monthly calls as of June 2018 to 694.4 million successful monthly calls in December 2020 – Statista(4) 839.9 million successful API calls made to account providers’ (ASPSPs) Open Banking APIs in July 2021 – The Open Banking Implementation Entity (Open Banking Limited)(5) Discover more about the Nuapay product suite by visiting EML’s newly relaunched website: EMLPayments.com Early registration for EMLCON4 has just opened at: EMLCONglobal.com About EML Payments EML provides an innovative payment solutions platform, helping businesses all over the world create awesome customer experiences. Wherever money is in motion, our agile technology can power the payment process, so money can be moved quickly, conveniently and securely. We offer market-leading programme management and highly skilled payments expertise to create customisable feature-rich solutions for businesses, brands and their customers. 
Come and explore the many opportunities our platform has to offer by visiting us at: EMLPayments.com References: https://www.accenture.com/us-en/insights/banking/open-banking-moving-towards-open-data-economy https://www.juniperresearch.com/researchstore/fintech-payments/instant-payments?utm_campaign=pr1_instantpayments_financial_fintech_aug21&utm_source=businesswire&utm_medium=pr https://www.statista.com/statistics/1228771/open-banking-users-worldwide/ https://www.statista.com/statistics/1212259/successful-calls-made-by-tpps-using-account-providers-on-banking-apis-united-kingdom/ https://www.openbanking.org.uk/ View source version on businesswire.com: https://www.businesswire.com/news/home/20210929006014/en/ Sarah Bowles, Group Chief Digital Officer EML Payments Limited (ASX:EML) [email protected] +61 439 730 968 Marie O’Riordan, Global Director of Public Relations EML Payments Limited (ASX:EML) [email protected] / [email protected] +44 207 183 5856 / +353 46 94 2010 9 "
3,711
2,021
"Contract management startup ContractPodAI nabs $115M for AI-driven legal review | VentureBeat"
"https://venturebeat.com/business/contract-management-startup-contractpodai-nabs-115m-for-ai-driven-legal-review"
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Contract management startup ContractPodAI nabs $115M for AI-driven legal review Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here. ContractPodAI , an AI-powered contract management solution provider, today announced that it raised $115 million in a series C investment led by SoftBank Vision Fund 2 at “five times” its valuation compared with July 2019. The round, which saw participation from Eagle Proprietary Investments, will be put toward product growth and expanding ContractPodAI’s presence internationally, leveraging SoftBank’s Asia-Pacific network. Almost every business function relies on legal involvement or expertise. Despite its importance, legal has been one of the last functions to adopt digitization. 
As of 2017, the cost of a basic contract stood at $6,900, according to the International Association for Contract & Commercial Management — with around 5 billable hours spent on legal review and up to 18 hours of management and procurement time. Above: ContractPodAI’s product dashboard. Founded in 2012 and based in London, ContractPodAI is the brainchild of Sarvarth Misra, a lawyer-turned-entrepreneur who sought to digitize legal tools and resources by leveraging off-the-shelf AI technologies. ContractPodAI pairs public cloud services from IBM, Microsoft, and others with a no-code interface designed to help teams tackle claims, request-for-proposal reviews, and intellectual property portfolio management using prebuilt and configurable apps. “The company was founded by Robert Glennie and I, who are both corporate lawyers by background. We recognized the huge market opportunity from legal’s slower adoption of technology,” Misra said in a press release. “It was a question of ‘when,’ not ‘if.’ But, there was no existing technology truly fit for purpose, so they built ContractPodAI using legal design thinking which has today evolved to a no-code fully configurable platform for legal teams across their day to day work.” Injecting contracts with AI As the pandemic disrupted businesses around the world, investors bet the farm on legal solutions, which they predicted would become increasingly digitized. According to Crunchbase, legal tech companies have already seen more than $1 billion in venture capital investments so far in 2021, smashing the $510 million invested in 2020 and the all-time high of $989 million in 2019. The contract management software market alone is expected to climb from $1.5 billion in 2019 to $2.9 billion by 2024 as scaling legal research, case development, and strategy refinement becomes increasingly key. 
Despite evidence showing that only a small number of law firms use AI-based tools — in a recent survey, 7% of firms said that they’ve implemented AI-powered tools, with 45% citing accuracy and cost concerns — interest in the technology continues to grow. “The pandemic in part accelerated the need for legal digital transformation. Last year, we released Advanced Cognitive Search, which helps clients quickly identify force majeure pandemic-related clauses on sell-side and buy-side contracts,” Misra said. “We [also] released Contract Risk & Compliance, which starts to take away not just manual work for a legal team but actually helps them in more strategic work. We launched cognitive language translation, enabling global legal teams to work much more cohesively in their own native languages. And we introduced a Quick Deploy model, which helps get clients up and running with their foundational ContractPodAI functionality such as remote workflow and esignature.” Two-hundred-employee ContractPodAI offers guided forms and templates for creating legal applications, with integrations for products from IBM, Microsoft, DocuSign, and Salesforce. Customers get a toolkit of AI functionality like document review, cognitive search, and analytics for each use case, as well as “tailored” AI data models tuned to the objective of each module. “When customers upload a contract, the platform’s natural language processing scans the documents, and extracts important aspects like the autorenewal dates, termination dates, and so on,” Misra added. “Further, our Contract Risk & Compliance feature offers suggestions of how to mitigate the risk and track a customer’s progress toward a less risky, more compliant, agreement.” In spite of competition from startups like Lexion, LinkSquares, Malbek, Evisort, and DocuSign, ContractPodAI has managed to attract current and past customers including Bosch Siemens, Braskem, EDF Energy, Total Petroleum, Benjamin Moore, and Freeview. 
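The kind of extraction Misra describes can be illustrated with a toy sketch. Everything below is hypothetical: a real platform like ContractPodAI uses trained NLP models, not regular expressions, and the field names and patterns here are invented purely to show the idea of pulling key dates out of contract text.

```python
import re

# Toy clause scanner: regexes standing in for trained NLP models.
# Field names and patterns are invented for illustration only.
DATE = r"(\w+ \d{1,2}, \d{4})"
PATTERNS = {
    "termination_date": re.compile(r"terminat\w*\s+on\s+" + DATE, re.I),
    "renewal_date": re.compile(r"renew\w*\s+on\s+" + DATE, re.I),
}

def extract_key_dates(contract_text: str) -> dict:
    """Return the first match for each clause type found in the text."""
    found = {}
    for field, pattern in PATTERNS.items():
        m = pattern.search(contract_text)
        if m:
            found[field] = m.group(1)
    return found

sample = ("This agreement shall automatically renew on January 1, 2022 "
          "unless terminated on December 1, 2021 by either party.")
print(extract_key_dates(sample))
```

A production system would add entity linking, clause classification, and confidence scores; the sketch only conveys the input/output shape of the task.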
In addition to its office in London, the startup has outposts in San Francisco, New York, Glasgow, Chicago, Sydney, Mumbai, and Toronto. To date, ContractPodAI has raised over $170 million in venture capital. "
3,712
2,021
"Can Intel’s XPU vision guide the industry into an era of heterogeneous computing? | VentureBeat"
"https://venturebeat.com/business/can-intels-xpu-vision-guide-the-industry-into-an-era-of-heterogeneous-computing"
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Can Intel’s XPU vision guide the industry into an era of heterogeneous computing? Share on Facebook Share on X Share on LinkedIn Argonne National Laboratory’s Aurora supercomputer will feature more than 9,000 nodes, each with six Xe-HPCs totaling more than 1 exa/FLOP/s of sustained DP performance. The system uses Intel's Xe-HPC GPU, code-named Ponte Vecchio. This article is part of the Technology Insight series, made possible with funding from Intel. As data sprawls out from the network core to the intelligent edge, increasingly diverse compute resources follow, balancing power, performance, and response time. Historically, graphics processors (GPUs) were the offload target of choice for data processing. Today field programmable gate arrays (FPGAs) , vision processing units (VPUs), and application specific integrated circuits (ASICs) also bring unique strengths to the table. Intel refers to those accelerators (and anything else to which a CPU can send processing tasks) as XPUs. 
The challenge software developers face is determining which XPU is best for their workload; arriving at an answer often involves lots of trial and error. Faced with a growing list of architecture-specific programming tools to support, Intel spearheaded a standards-based programming model called oneAPI to unify code across XPU types. Simplifying software development for XPUs can’t happen soon enough. After all, the move to heterogeneous computing — processing on the best XPU for a given application — seems inevitable, given evolving use cases and the many devices vying to address them. KEY POINTS Intel sees heterogeneous computing (where a host device sends compute tasks to different accelerators) as inevitable. An XPU can be any offload target commanded by the CPU, built on any architecture from any hardware vendor. The oneAPI initiative is an open, standards-based programming model that allows developers to target multiple XPUs with a single code base. Intel’s strategy faces headwinds from NVIDIA’s incumbent CUDA platform, which assumes you’re using NVIDIA graphics processors exclusively. That walled garden may not be as impenetrable as it once was. Intel already has a design win with its upcoming Xe-HPC GPU, code-named Ponte Vecchio. The Argonne National Laboratory’s Aurora supercomputer, for example, will feature more than 9,000 nodes, each with six Xe-HPC GPUs, for a combined total of more than 1 exaFLOP/s of sustained double-precision performance. Time will tell if Intel can deliver on its promise to streamline heterogeneous programming with oneAPI, lowering the barrier to entry for hardware vendors and software developers alike. A compelling XPU roadmap certainly gives the industry a reason to look more closely. 
Heterogeneous computing is the future, but it’s not easy The total volume of data spread between internal data centers, cloud repositories, third-party data centers, and remote locations is expected to increase by more than 42% from 2020 to 2022, according to The Seagate Rethink Data Survey. The value of that information depends on what you do with it, where, and when. Some data can be captured, classified, and stored to drive machine learning breakthroughs. Other applications require a real-time response. The compute resources needed to satisfy those use cases look nothing alike. GPUs optimized for server platforms consume hundreds of watts each, while VPUs in the single-watt range might power smart cameras or computer vision-based AI appliances. In either example, a developer must decide on the best XPU for processing data as efficiently as possible. This isn’t a new phenomenon. Rather, it’s an evolution of a decades-long trend toward heterogeneity, where applications can run control, data, and compute tasks on the hardware architecture best suited to each specific workload. Above: The quest for more performance will make heterogeneous computing a necessity. “Transitioning to heterogeneity is inevitable for the same reasons we went from single core to multicore CPUs,” says James Reinders, an engineer at Intel specializing in parallel computing. “It’s making our computers more capable, and able to solve more problems and do things they couldn’t do in the past — but within the constraints of hardware we can design and build.” As with the adoption of multicore processing, which forced developers to start thinking about their algorithms in terms of parallelism, the biggest obstacle to making computers more heterogeneous today is the complexity of programming them. It used to be that developers programmed close to the hardware using low-level languages, providing very little abstraction. The code was often fast and efficient, but not portable. 
These days, higher-level languages extend compatibility across a broader swath of hardware while hiding a lot of unnecessary details. Compilers, runtimes, and libraries underneath the code make the hardware do what you want. It makes sense that we're seeing more specialized architectures enabling new functionality through abstracted languages.

oneAPI aims to simplify software development for XPUs

Even now, new accelerators require their own software stacks, gobbling up the hardware vendor's time and money. From there, developers make their own investment into learning new tools so they can determine the best architecture for their application.

Instead of spending time rewriting and recompiling code using different libraries and SDKs, imagine an open, cross-architecture model that can be used to migrate between architectures without leaving performance on the table. That's what Intel is proposing with its oneAPI initiative.

Above: The oneAPI Base Toolkit includes everything you need to start writing applications that take advantage of Intel's CPU and XPU architectures.

oneAPI supports high-level languages (Data Parallel C++, or DPC++), a set of APIs and libraries, and a hardware abstraction layer for low-level XPU access. On top of the open specification, Intel has its own suite of toolkits for various development tasks. The Base Toolkit, for example, includes the DPC++ compiler, a handful of libraries, a compatibility tool for migrating NVIDIA CUDA code to DPC++, the optimization-oriented VTune profiler, and the Advisor analysis tool, which helps identify the best kernels to offload. Other toolkits home in on more specific segments, such as HPC, AI and machine learning acceleration, IoT, rendering, and deep learning inference.

"When we talk about oneAPI at Intel, it's a pretty simple concept," says Intel's Reinders. "I want as much as possible to be the same. It's not that there's one API for everything.
Rather, if I want to do fast Fourier transforms, I want to learn the interface for an FFT library, then I want to use that same interface for all my XPUs."

Intel isn't putting its clout behind oneAPI for purely selfless reasons. The company already has a rich portfolio of XPUs that stand to benefit from a unified programming model (in addition to the host processors tasked with commanding them). If each XPU were treated as an island, the industry would end up stuck where it was before oneAPI: with independent software ecosystems, marketing resources, and training for each architecture. By making as much common as possible, developers can spend more time innovating and less time reinventing the wheel.

What will it take for the industry to start caring about Intel's message? An enormous number of FLOP/s, or floating-point operations per second, come from GPUs. NVIDIA's CUDA is the dominant platform for general-purpose GPU computing, and it assumes you're using NVIDIA hardware. Because CUDA is the incumbent technology, developers are reluctant to change software that already works, even if they'd prefer more hardware choice.

Above: Intel's Xe-HPC GPU employs a brand-new architecture, high-bandwidth memory, and advanced packaging technologies to deliver unprecedented performance.

If Intel wants the community to look beyond proprietary lock-in, it needs to build a better mousetrap than its competition, and that starts with compelling GPU hardware. At its recent Architecture Day 2021, Intel disclosed that a pre-production implementation of its Xe-HPC architecture is already producing more than 45 TFLOPS of FP32 throughput, more than 5 TB/s of fabric bandwidth, and more than 2 TB/s of memory bandwidth. At least on paper, that's higher single-precision performance than NVIDIA's fastest data center processor.

The world of XPUs is more than just GPUs, though, which is exhilarating and terrifying, depending on who you ask.
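Reinders' "same interface for all my XPUs" point is easiest to see in code. Below is a minimal DPC++/SYCL sketch, not a production example: it assumes the oneAPI DPC++ compiler (e.g., `icpx -fsycl add.cpp`) and the SYCL 2020 device selectors, and the only line that changes when retargeting is the selector passed to the queue.

```cpp
#include <sycl/sycl.hpp>
#include <vector>

// Same kernel, different XPU: only the device selector changes.
int main() {
    std::vector<float> a(1024, 1.0f), b(1024, 2.0f), c(1024);

    // Swap sycl::cpu_selector_v for sycl::gpu_selector_v (or a custom
    // selector for another accelerator); the rest of the code is identical.
    sycl::queue q{sycl::cpu_selector_v};

    {
        sycl::buffer buf_a{a}, buf_b{b}, buf_c{c};
        q.submit([&](sycl::handler& h) {
            sycl::accessor A{buf_a, h, sycl::read_only};
            sycl::accessor B{buf_b, h, sycl::read_only};
            sycl::accessor C{buf_c, h, sycl::write_only};
            h.parallel_for(sycl::range<1>{1024},
                           [=](sycl::id<1> i) { C[i] = A[i] + B[i]; });
        });
    }   // buffer destruction copies results back into the host vectors
}
```

Using `sycl::default_selector_v` instead lets the runtime pick the best available device at launch, which is the "as much as possible the same" philosophy in practice: the kernel, buffers, and accessors never change.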
Supported by an open, standards-based programming model, a panoply of architectures might enable time-to-market advantages, dramatically lower power consumption, or workload-specific optimizations. But without oneAPI (or something like it), developers are stuck learning new tools for every accelerator, stymying innovation and overwhelming programmers.

Above: Fugaku, the world's fastest supercomputer, uses optimized oneDNN code to maximize the performance of its Arm-based CPUs.

Fortunately, we're seeing signs of life beyond NVIDIA's closed platform. As an example, the team responsible for RIKEN's Fugaku supercomputer recently used Intel's oneAPI Deep Neural Network Library (oneDNN) as a reference to develop its own deep learning process library. Fugaku employs Fujitsu A64FX CPUs, based on Armv8-A with the Scalable Vector Extension (SVE) instruction set, which didn't have a DL library yet. Optimizing Intel's code for Armv8-A processors enabled a speed-up of up to 400x compared to simply recompiling oneDNN without modification. Incorporating those changes into the library's main branch makes the team's gains available to other developers.

Intel's Reinders acknowledges the whole thing sounds a lot like open source. However, the XPU philosophy goes a step further, affecting the way code is written so that it's ready for different types of accelerators running underneath it. "I'm not worried that this is some type of fad," he says. "It's one of the next major steps in computing. It is not a question of whether an idea like oneAPI will happen, but rather when it will happen."