id (int64, 0-17.2k) | year (int64, 2k-2.02k) | title (stringlengths 7-208) | url (stringlengths 20-263) | text (stringlengths 852-324k)
---|---|---|---|---|
2513 | 2023 | "Boomerang brings in-email meeting scheduler to enterprise users | VentureBeat" | "https://venturebeat.com/automation/boomerang-brings-in-email-meeting-scheduler-to-enterprise-users" |
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Boomerang brings in-email meeting scheduler to enterprise users Share on Facebook Share on X Share on LinkedIn Credit: VentureBeat made with Midjourney Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Quickly scheduling a meeting is one of those tasks that can be a huge challenge for businesses, large and small, and might benefit from a dedicated, streamlined tool. That’s why Boomerang, the team behind some of the leading individual email productivity tools, is stepping into the enterprise meeting scheduling arena.
On Thursday, the company announced the launch of its new in-email meeting scheduler for enterprise users. This interactive graphical calendar appears in the body of emails, and offers a user-friendly and efficient way to manage schedules without forcing the email recipient to leave their mail client.
“This is our patented live image technology. The reason it feels like this is suddenly possible is because nobody has done it before. It’s not an HTML component in the email. We are using a combination of an image map and real-time dynamic image generation that works with our server-side to live update from users’ calendars without leaving the email,” explained Aye Moah, CEO of Boomerang, in an email to VentureBeat.
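The quote sketches a general pattern: the email embeds an ordinary image whose URL points at a server endpoint, the server re-renders the calendar picture from current availability every time a mail client fetches it, and an HTML image map routes clicks on individual slots to a booking URL. Below is a minimal sketch of that pattern, assuming Flask and Pillow; it illustrates the approach only, and the route, helper names and slot data are hypothetical, not Boomerang's implementation.

```python
# Minimal sketch of a live-updating email calendar image (illustrative, not Boomerang's code).
# The email would embed <img src="https://example.com/calendar.png?user=123" usemap="#slots">,
# so each open re-renders the picture from the sender's current availability.
import io

from flask import Flask, request, send_file
from PIL import Image, ImageDraw

app = Flask(__name__)

def fetch_free_slots(user_id: str) -> list[str]:
    """Hypothetical helper: a real version would query the user's calendar provider."""
    return ["Mon 10:00", "Mon 14:00", "Tue 09:00"]  # placeholder availability

@app.route("/calendar.png")
def calendar_image():
    slots = fetch_free_slots(request.args.get("user", ""))
    img = Image.new("RGB", (320, 40 * len(slots) + 20), "white")
    draw = ImageDraw.Draw(img)
    for i, slot in enumerate(slots):
        y = 10 + i * 40
        draw.rectangle([10, y, 310, y + 30], outline="black")  # one clickable slot box
        draw.text((20, y + 8), slot, fill="black")
    buf = io.BytesIO()
    img.save(buf, format="PNG")
    buf.seek(0)
    return send_file(buf, mimetype="image/png")  # cache-control headers would matter in practice
```

The corresponding map/area regions in the email body would link each slot's rectangle to a booking URL, which is what lets the recipient act without leaving the mail client.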
Founded in 2010, Boomerang has focused on building time management tools that automate some of the mundane processes behind effective email marketing. Its software is also SOC2 Type 2 certified and GDPR compliant, so its users’ proprietary information is kept secure and their privacy maintained.
Automated actions for smoother communications
One notable aspect of the new calendar meeting scheduler is its integration with automated components. The Magic Live Calendar updates automatically in real time, ensuring that team members’ availability remains in sync. This feature is particularly useful for large teams, as it eliminates the need for managers to manually allocate meeting times among team members.
The significance of these productivity tools lies not only in their novel approach but also in their potential for other functionality. Boomerang is already working on additional features, though it did not share specifics beyond saying that further enhancements are coming. For now, the new tool is available to senders on Google Workspace (any email client can receive the invites and use the calendar tool); support for sending from Outlook is slated for later this year.
Moreover, Boomerang has a forward-thinking approach toward the implementation of artificial intelligence (AI) in its products. The company has previously launched AI-powered features, such as a tone-checking writing assistant called Respondable. However, in the context of meeting scheduling, Boomerang believes that the solution lies in striking a balance between human interaction and AI automation.
“But when we looked at the inherent issues with meeting scheduling, the power dynamics, who shoulders the burden, we realized that these are fundamentally human problems,” the Boomerang representative said. “So the solution isn’t to remove humanity from the process, it is to make the process itself more human while still taking advantage of technology to do the tedious parts.”
Automation needs the human element to be successful
While some may envision a future where AI assistants interact with other AI bots to fill calendars automatically, Boomerang’s vision focuses on infusing humanity into productivity tools while optimizing workflow efficiency for all users.
The company’s work to understand users’ needs is evident in its research endeavors. Boomerang has previously conducted studies to identify effective email writing practices, which culminated in the development of the Respondable AI assistant. Building on this success, Boomerang recently published research based on millions of proposed meeting times. The insights gained from this study are set to be incorporated into an upcoming AI assistant specifically tailored for meeting scheduling.
As Boomerang’s graphical calendar meeting scheduler rolls out, enterprises can expect a boost in productivity and smoother collaboration within their teams. By streamlining the meeting scheduling process and keeping calendars up-to-date in real time, teams can focus on their core tasks and objectives. The integration of AI advances in Boomerang’s tools leverages the latest technologies while preserving the essence of human interaction. Through this approach, Boomerang seeks to make work more manageable and meaningful.
“After all, when you’re asking someone to meet with you, you are asking them for the most irreplaceable, most human currency that they have, their time,” said Moah. “So we built our meeting scheduling tools with person-to-person connectivity and respect for the other person’s humanity at their core while still automating away the inefficiencies of the manual process.”
"
|
2514 | 2023 | "Tractian gets $45M to expand AI monitoring of industrial machinery | VentureBeat" | "https://venturebeat.com/ai/tractian-gets-45m-to-expand-ai-monitoring-of-industrial-machinery" |
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Tractian gets $45M to expand AI monitoring of industrial machinery Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Tractian , an industrial asset monitoring company that uses AI to predict mechanical failures, has announced a $45 million series B funding round led by Boston-based venture capital firm General Catalyst and Next47.
The funding will enable Tractian to expand its AI capabilities, grow its research and development (R&D) team and enter new industrial verticals. The round follows a $15 million series A round in 2022 and positions Tractian for further growth.
Founded in 2019 by Igor Marinelli, Tractian uses sensors, edge computing hardware, and AI models to monitor industrial machines and identify potential failures based on vibrations and frequency patterns. The company’s AI analyzes machines’ “fingerprints” to detect specific mechanical issues like wear, imbalance and misalignment.
Tuning in to the right frequency
“They all have very specific waves and frequencies that we can identify, no matter if this motor is inside a pulp and paper plant, no matter if it’s an automotive plant. If there’s a motor of that manufacturer, of that OEM, it’s going to have that specific frequency,” said Marinelli in a phone call with VentureBeat. Marinelli, whose background is in industrial maintenance, says he founded Tractian to help companies eliminate downtime, anticipate failures and extend the lifetime of their assets.
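In signal-processing terms, the "fingerprint" idea amounts to checking how much vibration energy falls in the frequency bands where particular faults show up for a given motor. The sketch below is a generic illustration of that approach, not Tractian's models; the band limits, baseline and threshold are made-up placeholders.

```python
# Hedged sketch of band-energy fault detection on a vibration signal (illustrative only).
import numpy as np

def band_energy(signal: np.ndarray, sample_rate: float, band: tuple[float, float]) -> float:
    """Energy of the signal's spectrum inside a frequency band (Hz)."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return float(np.sum(spectrum[mask] ** 2))

# Hypothetical "fingerprint" for one motor model: bands where specific faults appear.
FAULT_BANDS = {"imbalance": (28.0, 32.0), "misalignment": (58.0, 62.0)}  # placeholder values

def flag_faults(signal: np.ndarray, sample_rate: float, baseline: dict[str, float]) -> list[str]:
    """Return fault labels whose band energy exceeds 3x the healthy baseline (arbitrary factor)."""
    return [name for name, band in FAULT_BANDS.items()
            if band_energy(signal, sample_rate, band) > 3.0 * baseline[name]]
```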
Tractian manufactures its own sensors and hardware to ensure high uptime and reliability in harsh industrial environments. The company currently has over 500 customers, representing roughly 1,000 manufacturing plants across industries like food and beverage, automotive, oil and gas, and facilities management.
“We have our own factory, we manufacture our own hardware, we’re 100% verticalized. We have the patents on the hardware, the patents in the models,” said Marinelli. The company estimates its technology currently impacts around 5% of global industrial GDP, based on its customer base.
Marinelli says that it’s relatively straightforward to evaluate a prospective maintenance solution, “whether this is going to add top-line revenue or reduce some of the costs.” “Our average savings … per machine is $6,000 a year,” he told VentureBeat.
R&D, expansion and AI training will benefit from new funding With the new funding, Tractian plans to expand into different verticals and continue refining its AI models as well as continue its strong focus on R&D. The company currently has nearly 200 R&D engineers focused on data science , data engineering, hardware engineering and firmware development.
Tractian’s AI models are tailored specifically to different machine types and industrial verticals. The company says industry-specific AI is key to achieving high accuracy in failure prediction. With 3,000 models already deployed and able to detect different types of failures, the Tractian platform will continue to adapt to new systems using feedback from its users.
“The more that you add data, the human feedback loops happen” and the better your system gets over time, said Marinelli. “And the more competitive advantage you have,” he added, “because you have a massive database of failure that’s been cataloged and labeled by the user.” While Tractian leverages mobile networks for connectivity, the company does not rely on WiFi to ensure uptime. Tractian embeds its own connectivity in sensors that automatically select the best available carrier. Tractian says one of the main challenges with its AI is getting accurate feedback from customers to improve models. Sometimes customers ignore failures flagged by the AI when the failure can be worked around.
“We want to be world-class at manufacturing and we can’t do so without a predictive maintenance solution. We tried to build this internally but were left with a mess of data and no easy way to take actions on it,” said Luis Moncada, the Maintenance Manager at Johnson Controls in a release on Monday. “I believe the future is highly personalized with AI at the center — I want my alerts to be based on my utilization, my equipment and my way of stressing my machine. Tractian is simply more agile at applying new technologies and they are far more collaborative in coming up with new ideas and implementing them.” The series B funding marks an important step in Tractian’s mission to optimize asset uptime for industrial companies around the world. The new capital will enable Tractian to significantly expand its AI-based asset monitoring solutions and help more organizations reduce downtime and maintenance costs.
Marinelli said he draws on his experience of watching his father work in the maintenance business. “He started his career as a maintenance technician, and then maintenance supervisor, then maintenance coordinator,” said Marinelli. “I lived this from my childhood. And when I went to work in industry to see it myself, I got passionate about this challenge.”
"
|
2515 | 2023 | "OpenAI promotes GPT-4 as a way to reduce burden on human content moderators | VentureBeat" | "https://venturebeat.com/ai/openai-promotes-gpt-4-as-a-way-to-reduce-burden-on-human-content-moderators" |
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages OpenAI promotes GPT-4 as a way to reduce burden on human content moderators Share on Facebook Share on X Share on LinkedIn Credit: VentureBeat made with Midjourney Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
One of the most unsung jobs of the internet era is that of the content moderator.
Casey Newton, Adrian Chen and others have previously reported eloquently and harrowingly on the plight of these laborers, who number in the thousands and are tasked by large social networks such as Facebook with reviewing troves of user-generated content for violations and removing it from said platforms.
The content they are exposed to often includes detailed descriptions and photographic or video evidence of humanity at its worst — such as depictions of child sexual abuse — not to mention various other crimes, atrocities and horrors.
Moderators charged with identifying and removing this content have reported struggling with post-traumatic stress disorder (PTSD), anxiety and various other mental illnesses and psychological maladies due to their exposure.
AI shouldering content moderation
Wouldn’t it be an improvement if an artificial intelligence (AI) program could shoulder some, or potentially even most, of the load of online content moderation? That’s the hope of OpenAI, which today published a blog post detailing its findings that GPT-4 — its latest publicly available large language model (LLM) that forms the backbone of one version of ChatGPT — can be used effectively to moderate content for other companies and organizations.
“We believe this offers a more positive vision of the future of digital platforms, where AI can help moderate online traffic according to platform-specific policy and relieve the mental burden of a large number of human moderators,” write OpenAI authors Lilian Weng, Vik Goel and Andrea Vallone.
In fact, according to OpenAI’s research, GPT-4 trained for content moderation performs better than human moderators with minimal training, although both are still outperformed by highly trained and experienced human mods.
How GPT-4’s content moderation works
OpenAI outlines a three-step framework for training its LLMs, including GPT-4, to moderate content according to a hypothetical organization’s given policies.
The first step in the process includes drafting the content policy — presumably this is done by humans, although OpenAI’s blog post does not specify this — then identifying a “golden set” of data that human moderators will label. This data could include content that is obviously in violation of policies or content that is more ambiguous, but still ultimately deemed by human moderators to be in violation. It might also include examples of data that is clearly in-line with the policies.
Whatever the makeup of the golden dataset, its human-assigned labels serve as the benchmark against which the AI model’s performance is compared. Step two is taking the model, in this case GPT-4, prompting it to read the content policy, and then having it review the same “golden” dataset and assign its own labels.
Finally, a human supervisor would compare GPT-4’s labels to those originally created by humans. If there are discrepancies, or examples of content that GPT-4 “got wrong” or labeled incorrectly, the human supervisor(s) could then ask GPT-4 to explain its reasoning for the label. Once the model describes its reasoning, the human may see a way to rewrite or clarify the original content policy to ensure GPT-4 reads it and follows this instruction going forward.
“This iterative process yields refined content policies that are translated into classifiers, enabling the deployment of the policy and content moderation at scale,” write the OpenAI authors.
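Written out as pseudocode, the loop is: have the model label the golden set, compare against the human labels, ask the model to explain its disagreements, and let a policy writer tighten the policy text before the next pass. The sketch below only illustrates that loop; the function names are hypothetical stand-ins, and `llm_label` is a toy heuristic in place of a real GPT-4 call so the example runs.

```python
# Illustrative sketch of the iterative policy-refinement loop described above.
# None of this is OpenAI's code; llm_label / llm_explain are hypothetical stand-ins
# for chat-completion calls, implemented here as toys so the example runs.

def llm_label(policy: str, content: str) -> str:
    """Stand-in for asking GPT-4 to label one item under the policy."""
    return "violation" if "forbidden" in content.lower() else "allowed"

def llm_explain(policy: str, content: str, label: str) -> str:
    """Stand-in for asking the model to justify the label it produced."""
    return f"Labeled '{label}' based on a literal reading of the policy."

def revise_policy_by_hand(policy: str, disagreements: list) -> str:
    """Placeholder for the human step: a policy writer edits ambiguous wording."""
    return policy  # unchanged in this toy version

def refine_policy(policy: str, golden_set: list[tuple[str, str]], max_rounds: int = 3) -> str:
    """golden_set holds (content, human_label) pairs; loop until the model agrees."""
    for _ in range(max_rounds):
        disagreements = []
        for content, human_label in golden_set:
            model_label = llm_label(policy, content)
            if model_label != human_label:
                disagreements.append(
                    (content, human_label, model_label, llm_explain(policy, content, model_label))
                )
        if not disagreements:
            break  # model matches the golden labels; policy can be deployed as a classifier
        policy = revise_policy_by_hand(policy, disagreements)
    return policy
```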
The OpenAI blog post also goes on to describe how this approach excels over “traditional approaches to content moderation,” namely, by creating “more consistent labels” compared to an army of human moderators who may be interpreting content differently according to the same policy, a “faster feedback loop” for updating content policies to account for new violations, and, of course, a “reduced mental burden” on human content moderators, who might presumably be called in only to help train the LLM or diagnose issues with it, and leave all of the front-line and bulk of the moderation work to it.
Calling out Anthropic
OpenAI’s blog post and promotion of content moderation as a good use case for its signature LLMs makes sense, especially alongside its recent investments in and partnerships with media organizations including The Associated Press and the American Journalism Project.
Media organizations have long struggled with effectively moderating reader comments on articles, while still allowing for freedom of speech, discussion and debate.
Interestingly, OpenAI’s blog post also took the time to call out the “Constitutional AI” framework espoused by rival Anthropic for its Claude and Claude 2 LLMs, in which an AI is trained to follow a single human-derived ethical framework in all of its responses.
“Different from Constitutional AI (Bai, et al. 2022) which mainly relies on the model’s own internalized judgment of what is safe vs. not, our approach makes platform-specific content policy iteration much faster and less effortful,” write the OpenAI authors. “We encourage trust and safety practitioners to try out this process for content moderation, as anyone with OpenAI API access can implement the same experiments today.” The dig comes just one day after Anthropic, arguably the leading proponent of Constitutional AI, received a $100 million investment to create a telecom-specific LLM.
A noteworthy irony
There is, of course, a noteworthy irony to OpenAI’s promotion of GPT-4 as a way to ease the mental burden of human content moderators: according to detailed investigative reports published in Time magazine and The Wall Street Journal, OpenAI itself employed human content moderators in Kenya through contractors and subcontractors such as Sama to read content, including AI-generated content, and label it according to the severity of the harms described.
As Time reported, these human laborers were paid less than $2 (USD) per hour for their work, and both reports indicate that workers experienced lasting trauma and mental illness from it.
“One Sama worker tasked with reading and labeling text for OpenAI told Time he suffered from recurring visions after reading a graphic description of a man having sex with a dog in the presence of a young child,” the Time article states.
Workers recently petitioned the government of Kenya to enact new laws that would further protect and provide for content moderators.
Perhaps, then, OpenAI’s automated content moderation push is in some sense a way of making amends, or of preventing future harms like the ones that were involved in its creation.
"
|
2516 | 2023 | "Consulting giant McKinsey unveils its own generative AI tool for employees: Lilli | VentureBeat" | "https://venturebeat.com/ai/consulting-giant-mckinsey-unveils-its-own-generative-ai-tool-for-employees-lilli" |
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Consulting giant McKinsey unveils its own generative AI tool for employees: Lilli Share on Facebook Share on X Share on LinkedIn Credit: VentureBeat made with Midjourney Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
McKinsey and Company, the nearly century-old firm that is one of the largest consulting agencies in the world, made headlines earlier this year with its rapid embrace of generative AI tools, saying in June that nearly half of its 30,000 employees were using the technology.
Now, the company is debuting a gen AI tool of its own: Lilli, a new chat application for employees designed by McKinsey’s “ClienTech” team under chief technology officer (CTO) Jacky Wright. The tool serves up information, insights, data and plans, and even recommends the most applicable internal experts for consulting projects, all based on more than 100,000 documents and interview transcripts.
“If you could ask the totality of McKinsey’s knowledge a question, and [an AI] could answer back, what would that do for the company? That’s exactly what Lilli is,” McKinsey senior partner Erik Roth, who led the product’s development, said in a video interview with VentureBeat.
Named after Lillian Dombrowski, the first woman McKinsey hired for a professional services role back in 1945, Lilli has been in beta since June 2023 and will be rolling out across McKinsey this fall.
Roth and his collaborators at McKinsey told VentureBeat that Lilli has already been in use by approximately 7,000 employees as a “minimum viable product” (MVP) and has cut the time spent on research and planning work from weeks to hours, and in other cases from hours to minutes.
“In just the last two weeks, Lilli has answered 50,000 questions,” said Roth. “Sixty-six percent of users are returning to it multiple times per week.”
How McKinsey’s Lilli AI works
Roth provided VentureBeat with an exclusive demo of Lilli, showing the interface and several examples of the responses it generates.
The interface will look familiar to those who have used other public-facing text-to-text based gen AI tools such as OpenAI’s ChatGPT and Anthropic’s Claude 2.
Lilli contains a text entry box at the bottom of its primary window where the user can enter questions, searches and prompts, and it generates its responses above in a chronological chat that shows the user’s prompts followed by Lilli’s responses.
However, there are several features that immediately stand out in terms of additional utility: Lilli also contains an expandable left-hand sidebar with saved prompts, which the user can copy, paste and modify to their liking. Roth said that categories for these prompts were coming soon to the platform as well.
Gen AI chat and client capabilities functions
The interface includes two tabs that a user may toggle between: “GenAI Chat,” which sources data from a more generalized large language model (LLM) backend, and “Client Capabilities,” which sources responses from McKinsey’s corpus of 100,000-plus documents, transcripts and presentations.
“We intentionally created both experiences to learn about and compare what we have internally with what is publicly available,” Roth told VentureBeat in an email.
Another differentiator is in sourcing: While many LLMs don’t specifically cite or link to sources upon which they draw their responses — Microsoft Bing Chat powered by OpenAI GPT-4 being a notable exception — Lilli provides a whole separate “Sources” section below every single response, along with links and even page numbers to specific pages from which the model drew its response.
“We go full attribution,” said Roth. “Clients I’ve spoken with get very excited about that.”
What McKinsey’s Lilli can be used for
With so much information available to it, what kinds of tasks is McKinsey’s new Lilli AI best suited to complete? Roth said he envisioned that McKinsey consultants would use Lilli through nearly every step of their work with a client, from gathering initial research on the client’s sector and competitors or comparable firms, to drafting plans for how the client could implement specific projects.
VentureBeat’s demo of Lilli showed off such versatility: Lilli was able to provide a list of internal McKinsey experts qualified to speak about a large e-commerce retailer, as well as an outlook for clean energy in the U.S. over the next decade, and a plan for building a new energy plant over the course of 10 weeks.
Throughout it all, the AI cited its sources clearly at the bottom.
While the responses were sometimes a few seconds slower than those of leading commercial LLMs, Roth said McKinsey was continually improving the tool’s speed and also prioritized quality of information over rapidity.
Furthermore, Roth said that the company is experimenting with enabling a feature for uploading client information and documentation for secure, private analysis on McKinsey servers, but said that this feature was still being developed and would not be deployed until it was perfected.
“Lilli has the capacity to upload client data in a very safe and secure way,” Roth explained. “We can think about use cases in the future where we’ll combine our data with our clients’ data, or just use our clients’ data on the same platform for greater synthesis and exploration…anything that we load into Lilli goes through an extensive compliance risk assessment, including our own data.”
The technology under the hood
Lilli leverages currently available LLMs, including those developed by McKinsey partner Cohere as well as OpenAI on the Microsoft Azure platform, to inform its GenAI Chat and natural language processing (NLP) capabilities.
The application, however, was built by McKinsey and acts as a secure layer that goes between the user and the underlying data.
“We think of Lilli as its own stack,” said Roth. “So its own layer sits in between the corpus and the LLMs. It does have deep learning capabilities, it does have trainable modules, but it’s a combination of technologies that comes together to create the stack.” Roth emphasized that McKinsey was “LLM agnostic” and was constantly exploring new LLMs and AI models to see which offered the most utility, including older versions that are still being maintained.
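As described, the Lilli application sits as its own retrieval layer between McKinsey's corpus and whichever LLM is plugged in underneath, and it returns sources with page numbers alongside each answer. The sketch below illustrates that general pattern only; the class, function and field names are illustrative, not McKinsey's implementation, and the keyword-overlap ranking is a toy stand-in for a real embedding index.

```python
# Illustrative retrieval layer with source attribution (not McKinsey's code): the
# application owns the corpus and the prompt; the underlying LLM is swappable.
from dataclasses import dataclass

@dataclass
class Passage:
    doc_id: str
    page: int
    text: str

def retrieve(query: str, corpus: list[Passage], k: int = 3) -> list[Passage]:
    """Toy ranking by word overlap; a real system would use embeddings and a vector index."""
    words = set(query.lower().split())
    return sorted(corpus, key=lambda p: -len(words & set(p.text.lower().split())))[:k]

def answer_with_sources(query: str, corpus: list[Passage], llm) -> dict:
    """`llm` is any callable prompt -> text; the layer adds context and attribution."""
    passages = retrieve(query, corpus)
    context = "\n".join(f"[{p.doc_id} p.{p.page}] {p.text}" for p in passages)
    prompt = f"Answer using only the context below.\n{context}\n\nQuestion: {query}"
    return {
        "answer": llm(prompt),
        "sources": [{"doc": p.doc_id, "page": p.page} for p in passages],  # full attribution
    }
```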
While the company looks to expand its usage to all employees, Roth also said that McKinsey was not ruling out white-labeling Lilli or turning it into an external-facing product for use by McKinsey clients or other firms entirely.
“At the moment, all discussions are in play,” said Roth. “I personally believe that every organization needs a version of Lilli.”
"
|
2517 | 2023 | "Amazon partners with UVeye on AI inspections of delivery vans | VentureBeat" | "https://venturebeat.com/ai/amazon-partners-with-israeli-startup-uveye-on-ai-inspections-of-delivery-vans" |
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Amazon partners with Israeli startup UVeye on AI inspections of delivery vans Share on Facebook Share on X Share on LinkedIn Credit: Amazon/UVEye Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Amazon has built an impressive delivery fleet spanning more than 100,000 vehicles, including 10,000 new electric vans from automaker Rivian.
But as of today, the company is bringing even more technology to bear: Amazon announced it is partnering with UVeye , an Israeli startup, to automate inspections of its delivery vehicles using a new AI system originally developed to detect car bombs.
The partnership will encompass “hundreds of Amazon warehouses” in the U.S., Canada, Germany, and the U.K., in which the companies will install UVeye’s automated, AI-powered vehicle scanning system, also known as the Automated Vehicle Inspection (AVI).
The companies have already rolled out and tested AVI at “select Amazon delivery stations in the U.S.” according to UVeye’s news announcement on its website.
The companies say AVI saves time and improves safety, detecting issues like nails in tires and other wear and tear and damage to the vehicles.
“We can automate most of the inspection process at scale,” said Tom Chempananical, Global Fleet Director at Amazon Logistics, in a statement on UVeye’s website. “This reduces the time spent on inspections by DSPs and Delivery Associates, ensuring packages reach customers faster while improving road safety.” Fleet owners are also cheering the move.
“The last thing I want is for something preventable to happen—like a tire blowing out because we missed an imperceptible defect during our morning inspection,” said Bennett Hart, an Amazon Delivery Service Partner (DSP) who owns the logistics company Hart Road, in a comment on Amazon’s website.
“This technology improves the safety of our fleet.”
How the AVI system works
Amazon’s announcement post compared the UVeye AVI scanning system to a patient getting an MRI or CAT scan at the doctor’s office.
Of course, the average motor vehicle is much larger than the average person, so UVeye has built its own 17-foot-tall archway filled with sensors that vehicles drive under at a speed of 5 miles per hour. This device, known as Atlas, performs a 360-degree scan of the vehicle’s exterior.
In addition, UVeye is working with Amazon to provide its original Helios underbody scanner installed on the floor, which has cameras pointed upward to capture the undercarriage.
“The AI system provides a full-vehicle scan in a few seconds,” Amazon’s post claims.
UVeye’s software takes scanned images of the vehicle from different vantage points and stitches them together into a 3D model, which the companies claim can find “hidden damage patterns” and issues that human inspectors would miss, such as “sidewall tears” in tires.
Human inspectors can then view the 3D model and zoom in on different parts flagged by the AI system to inspect them virtually before going out to the vehicle to fix them.
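Taken together, the workflow described here is a short pipeline: capture frames as the van passes the archway and floor scanner, reconstruct a model of the vehicle, run a detector over it, and queue confident detections for a human to review on the 3D model. The sketch below only illustrates that orchestration; the stage names, confidence threshold and injected models are assumptions rather than UVeye's published implementation.

```python
# Illustrative inspection pipeline (not UVeye's code): frames -> reconstruction -> flags -> review.
from dataclasses import dataclass

@dataclass
class Flag:
    part: str        # e.g. "front-left tire" (hypothetical labels)
    issue: str       # e.g. "sidewall tear"
    confidence: float

def inspect_vehicle(frames: list, detector, reconstruct) -> list[Flag]:
    """`reconstruct` stitches frames into a 3D model; `detector` returns candidate defects.
    Both are stand-ins for the vision models described in the article."""
    model_3d = reconstruct(frames)
    candidates = detector(model_3d)
    # Only confident detections reach the human inspector's review queue; the inspector
    # zooms into the flagged region on the 3D model before walking out to the van.
    return [c for c in candidates if c.confidence >= 0.8]
```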
UVeye’s unique origin story
UVeye was founded in 2016 after the two brothers behind it, Amir and Ohad Hever, drove into a secure facility in Israel and had their vehicle inspected by a guard using a mirror to look at the undercarriage.
As Amir told the publication Unite.AI a few months ago, the brothers immediately “understood there must be a better way to scan for bombs and other security threats that might be hiding under vehicles. It took us a few months to put together an underbody scanner that vehicles drive over and – using computer vision and deep learning algorithms – could detect any modification to the undercarriage and flag anything that shouldn’t be under a car.” After earning positive coverage from VentureBeat at CES 2019, UVeye weathered the global disruption of the COVID-19 pandemic, and now has facilities in New Jersey and Ohio as well. In addition to Amazon, the company has previously partnered with GM, Carmax, Hyundai, Volvo, and Toyota’s Tsusho division.
And, in addition to offering exterior scans, the company also offers a system called Apollo that can provide 360 scans of car interiors and record engine noises from a smartphone, designed for used car dealers.
The news of the Amazon partnership also provides a bright spot amid an extremely difficult time for Israeli AI and tech startups, many of which have had employees called up for military reserve duty in Israel’s ongoing fight with the terrorist group Hamas, which flared up following a major attack on Israeli civilians on October 7.
"
|
2518 | 2023 | "Rasgo launches Rasgo AI, a generative AI agent for enterprise data warehouse analytics | VentureBeat" | "https://venturebeat.com/enterprise-analytics/rasgo-launches-rasgo-ai-generative-ai-agent-enterprise-data-warehouse-analytics" |
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Rasgo launches Rasgo AI, a generative AI agent for enterprise data warehouse analytics Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
AI-driven analytics platform Rasgo has announced the launch of Rasgo AI , a self-service analytics solution that integrates a GPT (generative pre-trained transformer) into enterprise data warehouse (EDW) environments. The company said that with Rasgo AI, organizations can use the power of AI/GPT to accelerate insights and optimize recommended actions securely and efficiently.
Unlike other GPT integrations that provide only natural language chat interfaces, Rasgo said its AI stands out by employing GPT for “intelligent reasoning,” which enables it to think and act like a knowledgeable business analyst for data warehouses.
Knowledge workers often get bogged down by time-consuming, low-value tasks that hinder effective decision-making. By offloading these tasks to AI, Rasgo aims to enable these workers to focus on strategic decision-making, leading to significant gains in enterprise value.
Answering questions — and asking them
Rasgo asserts that GPT-4 enables the model to adeptly perform intricate reasoning tasks with dynamic objectives. The autonomous agent becomes capable of generating a semantic embedding of the EDW metadata, thereby educating GPT-4 about the data while retaining control of the data within the secure environment of the enterprise.
“One of our most exciting technical findings was that when provided with the right guidance, GPT-4 is not only good at answering data analysis questions but also good at asking them. Rasgo provides a metadata repository about the data to teach the AI how to make specific decisions when analyzing data so that it can iteratively improve and learn from human calibration,” Jared Parker, cofounder and CEO of Rasgo, told VentureBeat. “By combining the chat interface with our solution for intelligent reasoning, we aim to improve … operational efficiencies of [customers’] key stakeholders while also trusting that AI is constantly analyzing data to derive key insights.” According to the company, one of Rasgo AI’s key differentiators is AI Guardrails, which map data structures into familiar business terms, enhancing the efficiency and accuracy of data analysis while ensuring data security. The platform also analyzes enterprise data continuously to provide trusted insights, enabling business users to make data-driven decisions without needing advanced SQL skills.
Leveraging GPT-4 for intelligent business reasoning
Parker stated that for intelligent reasoning, the platform trains GPT to replicate a data analyst’s role. This equips enterprise data teams to accelerate analysis, as opposed to building queries and dashboards from the ground up.
“We acknowledged the potential time constraints faced by humans in formulating all necessary inquiries. Our intelligent reasoning establishes an ‘always-on’ virtual team of knowledge workers, consistently identifying business prospects and vulnerabilities,” said Parker. “A simple prompt like ‘analyze trends in year-over-year sales growth by sales rep’ can yield a comprehensive presentation of pivotal insights and actionable steps.” For human-AI collaboration, Rasgo said that its platform aids data teams by autonomously assessing tables in the data warehouse and discerning which tables are primed for intelligent reasoning and which need further refinement.
This approach, according to the company, enables human stakeholders to channel their energies toward transforming and documenting tables that require additional manual attention to ensure governance and trust in an AI-based workflow.
The company also highlighted that its generative AI model can mechanize numerous routine, low-value tasks inherent in the data analysis lifecycle. This automation aims to guide users through the process of data discovery and analysis, all the while maintaining the oversight of a data analyst. The platform’s ultimate goal is to optimize accuracy and instill trust in organizational processes.
“The conventional data analysis process is broken. Answering a single data-driven question can take an exorbitant amount of time, involving the identification of relevant tables, writing and debugging SQL queries, creating dashboards in BI tools, and translating results into comprehensible business recommendations,” Parker explained. “Our AI engine proactively searches metadata and query history to suggest the gold-standard table; writes, tests and executes the necessary SQL query; generates the appropriate visualization; and distills the results into actionable business recommendations. Throughout this process, we ensure the data analyst remains involved, enabling human decision-making at critical junctures to optimize for accuracy and trust.” Parker said that for Rasgo’s AI to navigate a database, the generative AI model crafts embeddings for all data warehouse metadata and user-provided instructional data. This ensures swift retrieval within what the company calls its ReAct (reason + act) AI workflow. Additionally, it autonomously maintains and refreshes these embeddings whenever new tables emerge, schemas evolve, or fresh user instructions are incorporated.
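A minimal sketch of the metadata-embedding step described here: index the warehouse metadata (and user-provided instructions) once, then at question time retrieve the closest tables before asking the model to write SQL against them. The embedding routine below is a deliberately crude stand-in so the example runs; the table names, descriptions and ranking are hypothetical, not Rasgo's API.

```python
# Illustrative sketch (not Rasgo's code) of metadata embeddings used to pick the
# gold-standard table before SQL generation; embed() is a toy stand-in for a real model.
import math

def embed(text: str) -> list[float]:
    """Toy embedding: normalized letter-frequency vector."""
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))

# Pretend warehouse metadata: table name -> description (the kind of thing that gets indexed).
METADATA = {
    "fct_sales": "daily sales amounts by sales rep, region and product",
    "dim_rep": "sales representative attributes: name, team, hire date",
}
INDEX = {table: embed(desc) for table, desc in METADATA.items()}

def pick_tables(question: str, k: int = 1) -> list[str]:
    """Retrieve the k tables whose metadata is closest to the question; the chosen tables
    (plus any user-provided guardrails) would then go into the SQL-generation prompt."""
    q = embed(question)
    return sorted(INDEX, key=lambda t: -cosine(q, INDEX[t]))[:k]

print(pick_tables("analyze year-over-year sales growth by sales rep"))
```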
Ensuring responsible AI development
Parker asserted that responsible functioning of generative AI and achieving desired outcomes from the technology hinge on collaborative efforts between humans and AI. This entails setting explicit rules, instructions and guardrails to ensure trust and safety, particularly in the context of enterprise data.
He explained that to counter the risks of hallucination and data disparities, the company has formulated an “AI Manager” capability. This suite of tools empowers users to establish definitive guardrails and constraints on the large language model (LLM) , ensuring its selection of the gold-standard table, column and metric when addressing user prompts.
The platform’s AI automates the documentation of table metadata sourced from the data warehouse environment. Simultaneously, it assigns an “AI Readiness” score to each table. This score aids data teams in distinguishing datasets primed for secure AI applications from those requiring further human intervention.
The company has built its solution around Microsoft as Rasgo’s AI API provider, integrating directly with Microsoft’s security framework.
Democratizing trusted intelligence
“LLMs like GPT are widely used for text-to-SQL translation. However, incorrect SQL can lead to flawed decisions based on inaccurate data. Our platform democratizes trusted intelligence by teaching GPT about a user’s schemas and teaching it to respect user-provided instructional data so that it can be instantly retrieved to produce accurate SQL and trusted insights,” Parker told VentureBeat. “In terms of data privacy and security, we have implemented ‘push down compute’ capabilities. This means that the SQL generated by the LLM is sent directly to the organization’s cloud data warehouse environment, ensuring no raw data leaves their warehouse.” The company recently announced its collaboration with Snowflake’s Partner Network, aiming to enhance the benefits of the Snowflake Data Cloud for mutual customers. Through this partnership, Rasgo says it is able to harness GPT for intelligent reasoning, streamlining self-service analytics.
“Going forward, we plan to continue the momentum with this partnership and others similar, by enhancing the accessibilities it provides to customers and overall making sure the product itself can meet the needs of all organizations at all stages of their data analytics and AI journeys,” said Parker.
"
|
2519 | 2023 | "Ramp introduces Ramp Plus and bags Shopify | VentureBeat" | "https://venturebeat.com/enterprise-analytics/fast-growing-expense-and-credit-card-startup-ramp-introduces-ramp-plus-bags-shopify" |
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Fast-growing expense and credit card startup Ramp introduces Ramp Plus, bags Shopify Share on Facebook Share on X Share on LinkedIn Credit: VentureBeat made with Midjourney Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Ah, the 1960s — the decade of counterculture, free love, Mad Men and the first corporate credit card.
Ever since then, as more businesses have adopted corporate cards and sought to manage employee expenses, the number of tools and services to assist them has proliferated.
But today, nearly 60 years later, one of the newest — the corporate card and expense software startup Ramp — is introducing the latest evolution of business expense tracking and management technology for companies with complex finances: Ramp Plus.
In addition, the company is announcing that it has secured the exclusive expense management business of none other than Shopify, the Canadian e-commerce platform giant that has been on a tough road of late, retreating from its costly logistics business.
With Ramp, the hope is that Shopify can rein in costs even further.
“We are the most comprehensive financial operations solution available in the market,” said Ramp cofounder and CEO Eric Glyman in an email to VentureBeat. “Our unified platform offers customers a corporate card and seamless expense management, bill payments, vendor management and price intelligence, working capital support, and now a fully automated procure-to-pay solution.”
What Ramp Plus offers
Ramp Plus is a new software-as-a-service (SaaS) offering available to Ramp’s business customers beginning in September, which builds on Ramp’s existing expense management and tracking solutions. Ramp is granting complimentary access to Ramp Plus for its small-to-medium business (SMB) and mid-market customers for one year as a token of gratitude for their loyalty.
Ramp Plus is “purpose-built for businesses with complex financial needs,” according to Ramp, and its features include: An automated and customizable procurement solution to keep tabs on “shadow spend” as businesses scale.
Consolidated global spend management for domestic and international entities, facilitating effortless scaling and global operations.
User-friendly custom workflows to automate complex financial processes swiftly and without any need for coding.
Advanced roles, permissions and policy enforcement to prevent overspending.
Seamless integration with existing business systems, from HRIS to ERP and data warehousing, supported by flexible integrations and industry-leading APIs.
What is “shadow spend?” And why is Ramp focused on helping businesses and CFOs/accounting departments reduce it? Shadow spend is a term used to describe unauthorized expenses made by employees of a company and charged to the business.
“Gaining visibility into employee purchases (aka ‘shadow’ or ‘indirect’ spend) is a top priority for CFOs, and the varied nature of these expenses make it hard to quantify,” Glyman told VentureBeat. “We do know that cost control is top of mind for CFOs, with vendor/supplier costs and tech investments as the top two expense areas to manage.” While there are few reliable statistics for how much shadow spend costs enterprises on average or in total, Ramp believes it is a significant enough amount to warrant developing new features to track and manage, namely, the new Ramp Plus automated procurement solution.
Using Ramp Plus, “employees have a centralized place to request spend and maintain visibility throughout the buying process,” Glyman explained to VentureBeat via email. He added that it “gives finance teams the tighter purchasing controls needed to get more spend under management and prevent out-of-policy spend before it even happens.” Specifically, using Ramp Plus, employee spending requests “are routed through highly configurable approval workflows, and upon approval, generate a virtual card for the employee to go buy what they need, or a purchase order for finance teams to track and match to invoices in Ramp’s Accounts Payable tab.” Moreover, recognizing the global footprint of both its own operations and those of its customers, Ramp says Ramp Plus supports “international debiting and the ability to manage multiple entities on Ramp.”
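As a rough illustration of the request-to-instrument flow Glyman describes (a spend request passes through configurable approval rules, and approval yields either a virtual card or a purchase order), here is a small sketch. The threshold, the routing rule and the choice of which instrument to issue are hypothetical assumptions, not Ramp's actual product logic.

```python
# Illustrative spend-request workflow (hypothetical rules, not Ramp's product logic).
from dataclasses import dataclass

@dataclass
class SpendRequest:
    employee: str
    amount: float
    category: str  # e.g. "software", "travel"

# Hypothetical policy: anything over this amount also needs finance approval and becomes a PO.
APPROVAL_THRESHOLD = 1_000.00

def route_request(req: SpendRequest, manager_approves, finance_approves) -> dict:
    """Returns either a virtual card for the employee or a purchase order for finance,
    mirroring the approve-then-issue flow described above; approvers are injected callables."""
    approvers = [manager_approves]
    if req.amount > APPROVAL_THRESHOLD:
        approvers.append(finance_approves)
    if not all(approve(req) for approve in approvers):
        return {"status": "rejected"}
    if req.amount > APPROVAL_THRESHOLD:
        return {"status": "approved", "instrument": "purchase_order", "amount": req.amount}
    return {"status": "approved", "instrument": "virtual_card", "limit": req.amount}
```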
Ultimately, the goal of Ramp Plus is to make it much easier for businesses to be able to track and manage employee spending without getting in the way of their actual legitimate business purchases.
“For most businesses without a procurement team, employee purchases are disjointedly tracked and managed through a combination of intake forms and email chains,” Glyman told VentureBeat. “As companies scale, so does the volume of unmanaged employee purchases dispersed across teams, leaving it to finance departments to establish an efficient buying process that gives complete visibility on upcoming purchases and clear policy controls to optimize costs without slowing down the business to buy what they need.” Ramp Plus seeks to do away with all of that — or at least, to streamline it all so that neither businesses nor their employees are held up by accounting issues.
Why Shopify chose Ramp as its exclusive business expense management provider
Ramp was founded in 2019 by a trio of co-founders: Glyman (CEO), Karim Atiyeh (CTO) and Gene Lee (CPO). It has since grown quickly into a leading enterprise technology vendor with 15,000 business customers across 70 countries and hundreds of thousands of individual cardholders, and it has saved its customers $600 million and more than 8.5 million hours of expense tracking and processing, according to the company.
Among those customers are Sierra Nevada Brewing Company, Waymo, Classpass, Glossier, Poshmark, Eventbrite and Virgin Voyages. At Ramp’s last disclosed valuation in early 2022, the corporate card/expense management software startup was worth more than $8 billion.
In an endorsement of Ramp’s unique financial solutions, Phil Whitham, Director and International Controller at Shopify, said, “Our needs are incredibly complex. We tried to build a platform ourselves but found Ramp to be the perfect fit with the features we needed, now and in the future. Ramp has shown the commitment we need from a long-term partner, supporting our decade-long hypergrowth.” Hypergrowth could also describe Ramp’s own trajectory, with the company reporting 100% growth in global customers and 83% growth in enterprise customers in the last six months alone. Now with Ramp Plus and Shopify under its belt, the startup is poised to continue its expansion, challenging corporate card stalwarts like American Express and SAP Concur.
“We also are the most comprehensive financial operations solution available in the market,” Glyman told VentureBeat. “Ramp is the only company aligning our bottom line with our customers spending less. That’s why we’re seeing industry-leading growth, with the majority of Ramp’s enterprise and midmarket customers in the last 6 months coming from AmEx, Bill, Concur, or Expensify.”
"
|
2520 | 2023 | "Clari acquires Groove to build a comprehensive revenue platform with AI | VentureBeat" | "https://venturebeat.com/enterprise-analytics/clari-acquires-groove-build-comprehensive-revenue-platform-ai" |
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Clari acquires Groove to build a comprehensive revenue platform with AI Share on Facebook Share on X Share on LinkedIn Image Credit: VentureBeat made with Midjourney Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Clari, a revenue operations platform, announced today that it has acquired Groove, a sales engagement platform, for an undisclosed amount. The deal will enable Clari to offer what it describes as the first end-to-end revenue platform that executes all internal and external revenue workflows, from prospecting to closing to renewing.
“This is the most transformative day of our history,” Andy Byrne, CEO of Clari, told VentureBeat in an exclusive interview. “It’s a big acquisition for us.” “When we started this company, our thesis was grounded in the belief that AI would revolutionize how businesses manage revenue. More specifically, our vision was to assist CEOs in answering the critical question of whether they would meet or miss revenue targets,” he explained. “We aimed to offer a predictive solution to address the common issue of revenue leaks that many companies face. Our goal was to help them achieve what we call revenue precision.” Optimizing sales and revenue operations The acquisition comes at a time when businesses are increasingly looking for ways to optimize their sales and revenue operations. According to Seth Marrs, principal analyst at Forrester, “This acquisition will create a sense of urgency in the market that leads all key players to step up their focus to create their own version of a revenue orchestration platform. It’s a significant win for all companies looking to improve sales performance.” The timing of the acquisition coincides with shifting market conditions and buyer demands, explained Dan Gottlieb, senior director analyst at Gartner. As growth models become challenging, revenue teams want to simplify tech stacks while still leveraging AI.
“B2B sales tech buyers want to take advantage of AI for sales and simultaneously reduce complexity in their tech stack,” Gottlieb said. “Clari with Groove now has the pieces to deliver a complete revenue hub with interconnected workflows and deep data integrations.” The move also responds to a significant shift in the industry, wherein growth-focused companies are struggling to secure funding to transition to profit-focused models. Marrs added, “Consolidating to create a more comprehensive platform is the best option in this environment. Those that don’t, face the less appealing prospect of a down round or going out of business.” Rapid shift in market conditions With the acquisition of Groove, Clari will gain an advantage over its rivals in the sales technology market, such as Gong, SalesLoft, Outreach and ZoomInfo. Clari will be able to provide a broader set of capabilities across various sales technology categories, such as revenue intelligence, conversation intelligence, sales engagement and data workflow.
Clari will also stand out with its generative AI technology. Clari uses generative AI to correlate and attribute different signals from external and internal workflows, and to help revenue teams reduce their time to revenue.
Founded in 2012, Clari is one of the fastest-growing companies in the sales tech market, with a valuation of more than $2.6 billion and a total funding of $375 million in the last 12 months. The company has powered many successful IPOs in recent years, such as Confluent, UiPath, Procore and Hashicorp. Clari’s revenue operations platform is used by more than 1,000 companies across different industries, such as Adobe, Dell Technologies, Dropbox, Qualtrics, Splunk, Twilio, Workday and Zoom.
Clari plans to integrate Groove’s data and capabilities into its platform in six to nine months. Clari will be able to use Groove’s sales engagement features to execute personalized and scalable outreach campaigns, while Groove will be able to benefit from Clari’s revenue intelligence features to forecast more accurately, identify risks and opportunities, and optimize sales performance. This will create a powerful and efficient solution for revenue teams to run their entire revenue process in one unified system.
"
|
2,521 | 2,023 |
"Vibes announces new AI engine for mobile marketing campaigns | VentureBeat"
|
"https://venturebeat.com/ai/vibes-announces-new-ai-engine-for-mobile-marketing-campaigns"
|
Vibes announces new AI engine for mobile marketing campaigns
Vibes, a 25-year-old marketing tech company founded in Chicago that was an early leader in using mobile messaging to reach and engage customers on behalf of major global brands, is now expanding its signature web-based campaign planning tool, the Vibes Platform, with artificial intelligence (AI) models trained on its vast trove of data.
The Vibes Platform has been upgraded with the Nexus engine, a machine learning (ML) model developed by Vibes.
It provides marketers with insights and recommendations on the best types of mobile marketing messages to send to different demographics of users, the best times to send, the right frequency to avoid spamming consumers, and the best promotions to convert their attention into actions — and ultimately, dollars — for leading brands.
The Nexus engine is “the next evolutionary version of our platform,” said Vibes vice president of product management Joseph Catrambone III in a video call interview with VentureBeat.
Among Vibes’ customers are such diverse yet recognizable brands as Chipotle, The Container Store and Polo Ralph Lauren.
Using the Vibes Platform with Nexus, “we can streamline their activities, the things that are really the most critical for them, instead of them having to come in and hunt and peck for things,” Catrambone said.
Achieving better marketing results with lower effort Catrambone likened the new Vibes with Nexus to the homescreen of your favorite video streaming service — where video titles are presented not in alphabetical order or even necessarily by broad genre as they were in the physical video rental stores of yore. Instead, they are micro-targeted to your viewing preferences and prior viewing habits.
“We’re doing the same thing for our customers,” Catrambone told VentureBeat. “Instead of us expecting them to come in [to our platform] and understand how to execute the most perfect timing and most perfect media with the most perfect content, we want to bring all that to them.” In the same fashion, Vibes with Nexus provides brand marketers with recommended marketing campaign messages, assets and ideas tailored to fit the demographics of the users who have signed up to receive their marketing messages — and the marketers’ campaign goals. This tailoring is based both on previous end-user engagement data for different audience segments, and on the general trends observed on the platform.
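To make that mechanism concrete, here is a minimal, hypothetical sketch of engagement-based recommendation. It is not Vibes’ Nexus engine; the audience segments, message templates and engagement counts are invented for illustration.

```python
# Minimal sketch of segment-based message recommendation (illustrative only,
# not Vibes' Nexus engine). The engagement history below is invented.
history = {
    # (audience segment, message template): (clicks, sends) from past campaigns
    ("loyalty_members", "flash_sale"): (420, 5000),
    ("loyalty_members", "new_arrivals"): (310, 5000),
    ("lapsed_buyers", "flash_sale"): (95, 4000),
    ("lapsed_buyers", "win_back_coupon"): (260, 4000),
}

def ranked_templates(segment, history, prior_clicks=5, prior_sends=100):
    """Rank message templates for one segment by smoothed click-through rate."""
    scores = {}
    for (seg, template), (clicks, sends) in history.items():
        if seg == segment:
            # Additive smoothing keeps low-volume templates from being over-ranked.
            scores[template] = (clicks + prior_clicks) / (sends + prior_sends)
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

for segment in ("loyalty_members", "lapsed_buyers"):
    print(segment, ranked_templates(segment, history))
```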
One of the biggest new trends to emerge in mobile marketing in recent years is the use of “mobile wallets” — not only the system defaults such as Apple Wallet or Google Pay, but the brand-specific digital mobile loyalty cards consumers can pay with or collect points on, motivating them to continue engaging and spending time and money with the brand. The Vibes Platform already supported this feature, but now has ML-powered insights to bring to it to help choose messaging language and visual assets that are most effective with end users.
As for end users who are least engaged with a brand’s mobile messaging, Vibes with Nexus will suggest new strategies and tactics for reaching them.
In fact, as a brand marketer goes through the Vibes Platform to create new messages and campaigns, the Nexus engine will suggest improvements based on what the marketer inputs.
This not only saves marketers time, but empowers them to achieve better results, driving up the return on investment — every time they send a mobile message, marketers pay a small amount, which adds up quickly at scale. Brands using Vibes see 25% less attrition in their SMS subscriber base compared to those using rival platforms, according to data provided by the company.
The ML models that Vibes uses are proprietary, but the company has not ruled out integrating with outside models such as consumer generative AI leader OpenAI’s GPT-3.5 or GPT-4.
Early mover advantage One of Vibes’ biggest assets came about from its founding: The company was created by childhood friends Jack Philbin and Alex Campbell, who foresaw the potential of SMS as it emerged in North America. From a humble beginning in their apartment, they established strong ties with over 65 major mobile carriers, including Verizon and AT&T.
Because of Vibes’ early and longstanding integration with mobile carriers, the company can identify the times of “peak” mobile messaging traffic across them and determine the best times to send messages to avoid delays and improve engagement.
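A toy sketch of that send-time idea follows. The hourly traffic figures and the allowed sending window are assumptions made up for illustration; real figures would come from carrier integrations, which are not modeled here.

```python
# Illustrative sketch: choose send hours that avoid "peak" carrier traffic.
# Hourly volumes (index = hour of day) are invented placeholder numbers.
hourly_traffic = [2, 1, 1, 1, 2, 4, 9, 14, 12, 10, 9, 11,
                  13, 12, 11, 10, 12, 15, 18, 17, 14, 10, 6, 3]

allowed_hours = range(9, 21)  # assumption: only message during waking hours
best_hours = sorted(allowed_hours, key=lambda hour: hourly_traffic[hour])[:3]
print("Least congested send hours in the allowed window:", best_hours)
```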
When it comes to privacy and security, the company says it is compliant with the GDPR and with all rules and regulations in the jurisdictions in which it operates. It sends messages only to those who have opted in to a marketer’s promotions.
"
|
2,522 | 2,023 |
"Tromzo secures $8M to lead the charge in AI-powered application security posture management | VentureBeat"
|
"https://venturebeat.com/ai/tromzo-secures-8m-to-lead-the-charge-in-ai-powered-cloud-security-solutions"
|
Tromzo secures $8M to lead the charge in AI-powered application security posture management
Image Credit: VentureBeat made with Midjourney
Tromzo, a Mountain View, Calif.-based cybersecurity startup, today announced an additional $8 million in an oversubscribed seed round led by Venture Guides, with new investors Alumni Ventures and Uncorrelated Ventures, and strong participation from existing investors.
The company previously announced funding from Innovation Endeavors and more than 25 leading CISOs, including Caleb Sima (Robinhood), Adam Glick (SimpliSafe) and Steve Pugh (ICE/NYSE), who participated through Silicon Valley CISO Investments.
Tromzo, co-founded by CEO Harshil Parikh and CTO Harshit Chitalia, was born out of a vision to make enterprise security more efficient and actionable.
“We aim to empower the ‘good guys’ in the battle for cybersecurity,” Chitalia said in an exclusive interview with VentureBeat. “We wish to provide them with the tools needed to protect everyone effectively.” Prioritizing and remediating critical vulnerabilities Tromzo’s mission is to help eliminate the friction between developers and security teams by providing end-to-end visibility, reducing noise, eliminating manual work and driving security ownership. Tromzo’s platform integrates with existing security tools, source code systems, and cloud platforms, and leverages its Intelligence Graph and AI capabilities to prioritize and remediate the most critical vulnerabilities across the software environment.
The company was founded in 2021 by Parikh and Chitalia, who both experienced the pain point of security waste and inefficiency in their previous roles as security and engineering leaders. They decided to start Tromzo to solve this problem, which they believe is becoming more urgent and complex with the adoption of cloud native architectures and DevOps pipelines.
“We truly believe that this is going to be the next generation of software security platforms that we are building,” said Parikh. “We have a unique proposition in the market to lead this category of solutions.” Deep environmental context and intelligence graph Chitalia told VentureBeat that his personal agenda is to protect the good guys from the bad guys, and to help the security folks make the best and most efficient use of their time and work.
“We’re trying to come up with a solution that works as the next generation of ASPM [Application Security Posture Management] with AI,” he said.
Tromzo’s unique approach to ASPM leverages AI to manage the complexity of security data. “AI today can be embedded in pretty much everything,” said Chitalia. “But as with everything else, security is complex. What we’re trying to do is come up with a solution that is working as the next generation of ASPM with embedded AI.” Parikh added: “The intelligence graph is sort of like a graphical view of how the different aspects of things are connected to each other. So how a code repository is connected to a deployed artifact in the cloud, whether it’s an AWS asset or whatnot. Making that connection is realistically what happens in a real security environment.” This interconnected mapping allows Tromzo to provide context around vulnerabilities and prioritize the most critical risks. According to Parikh, the startup’s Fortune 500 customers have used the platform to gain “100% visibility” into their environments and cut down remediation time.
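As a rough illustration of how such a graph can drive prioritization, the sketch below uses the open-source networkx library to connect repositories to build artifacts and cloud assets, then boosts findings whose code reaches an internet-facing asset. The nodes, findings and scoring rule are invented for illustration and do not represent Tromzo's implementation.

```python
# Hypothetical "intelligence graph" sketch (not Tromzo's implementation):
# link code repos to build artifacts and cloud assets, then prioritize
# vulnerability findings by whether they reach an internet-facing asset.
import networkx as nx

graph = nx.DiGraph()
graph.add_edge("repo:payments-api", "artifact:payments-api:1.4.2", kind="builds")
graph.add_edge("artifact:payments-api:1.4.2", "aws:ecs/payments-prod", kind="deployed_to")
graph.add_edge("repo:internal-tool", "artifact:internal-tool:0.9.0", kind="builds")
graph.add_edge("artifact:internal-tool:0.9.0", "aws:ecs/tools-staging", kind="deployed_to")

internet_facing = {"aws:ecs/payments-prod": True, "aws:ecs/tools-staging": False}

# Vulnerability findings keyed by repo: (CVE id, CVSS score) -- invented data.
findings = [("repo:payments-api", "CVE-2023-0001", 9.8),
            ("repo:internal-tool", "CVE-2023-0002", 9.1)]

def priority(repo, cvss):
    """Boost findings whose code ultimately runs on an internet-facing asset."""
    reachable = nx.descendants(graph, repo)
    exposed = any(internet_facing.get(node, False) for node in reachable)
    return cvss * (2.0 if exposed else 1.0)

for repo, cve, cvss in sorted(findings, key=lambda f: -priority(f[0], f[2])):
    print(repo, cve, "priority:", round(priority(repo, cvss), 1))
```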
Rapid growth backed by industry titans Tromzo is backed by a cohort of Chief Information Security Officers (CISOs) from various industries, providing a wealth of knowledge and validation for its roadmap. “We have champions on our side, which are looking out for us,” said Parikh. “We are building the right product for the right user and also talking about it in the right way.” With the new funding, Tromzo plans to accelerate its growth, further expand its product offerings and solidify its position in the market. The company’s vision and innovative approach have already garnered industry analyst recognition, validating the founders’ efforts and propelling the startup onto an exponential growth trajectory.
“We’re just about getting on that hockey stick and going exponential from here because the market is there, the product is there, and we aim to win this market,” Chitalia confidently concluded.
ASPM named top technology by Gartner Tromzo’s focus on ASPM for cloud services is also gaining recognition from industry analysts. Gartner recently named ASPM as one of the top technologies in its Hype Cycle for Application Security report.
According to Gartner, ASPM analyzes security signals across software development, deployment and operation to improve visibility, better manage vulnerabilities and enforce controls.
This funding round positions Tromzo to effectively capitalize on the rapidly growing ASPM market, offering a unique, AI-powered solution to enterprise security challenges. As the market continues to expand, Tromzo is well-placed to lead the way in delivering efficient, effective and actionable security solutions.
"
|
2,523 | 2,023 |
"MindsDB raises funding from Nvidia to democratize AI application development | VentureBeat"
|
"https://venturebeat.com/ai/mindsdb-raises-funding-from-nvidia-to-democratize-ai-application-development"
|
MindsDB raises funding from Nvidia to democratize AI application development
Image Credit: VentureBeat made with Midjourney
MindsDB, a database platform for building AI-centric applications, today announced it has closed a $5 million investment round led by NVentures, Nvidia's venture capital arm, with participation from additional investors. The new funding brings MindsDB's cumulative seed funding to $46.5 million, supporting the company's goal of democratizing access to artificial intelligence (AI) for enterprises worldwide.
MindsDB stated that this capital infusion will expedite the company’s mission to integrate AI capabilities into products aimed at the expansive cohort of approximately 30 million software developers spanning diverse industries.
The company has highlighted its platform’s array of over 130 AI integrations, allowing developers to oversee AI models originating from diverse advanced machine learning frameworks like OpenAI, Anthropic, Hugging Face, Langchain and Nixtla.
By connecting these models with data residing in platforms such as Amazon Redshift, Google BigQuery, MySQL, Postgres, MongoDB and Snowflake, the platform acts as a bridge for integrating AI directly into existing data infrastructure.
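Because MindsDB exposes model training and inference through SQL-like statements, the typical pattern looks roughly like the sketch below: create a model over data in an existing source, then query it alongside that data. The run_sql helper is a hypothetical stand-in for a real client connection, the integration, table and column names are invented, and the exact statement syntax should be checked against MindsDB's documentation.

```python
# Rough sketch of the SQL-first pattern described above. `run_sql` is a
# hypothetical placeholder for a real MindsDB client/connection; the
# integration, table and column names are invented for illustration.
def run_sql(statement: str) -> None:
    print("Would execute:\n" + statement.strip() + "\n")

# Train a model over data already sitting in a connected source.
run_sql("""
CREATE MODEL churn_predictor
FROM my_postgres (SELECT * FROM customers)
PREDICT churned;
""")

# Query the trained model alongside the source data it was trained on.
run_sql("""
SELECT c.customer_id, m.churned AS predicted_churn
FROM my_postgres.customers AS c
JOIN churn_predictor AS m;
""")
```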
“This new backing from Nvidia signals that the AI revolution will not be limited to companies with fully staffed data science teams and expertise. Every developer worldwide, regardless of their AI knowledge, should be capable of producing, managing and plugging AI models into existing software infrastructure,” Jorge Torres, CEO and cofounder of MindsDB, told VentureBeat. “Our goal is to help solve, enable and inspire the world’s 30 million-plus developers to leverage their data to build AI applications, no matter their data source or [the] machine learning model/framework they want to use.” Torres claims that AI proficiency is a rarity among current software developers, with fewer than 1% possessing this skill. Most of these few proficient individuals are nestled within the ranks of the largest market leaders. This scarcity, he asserts, erects barriers that hinder burgeoning startups as well as small and medium-sized enterprises from harnessing the advantages of generative AI.
In response to these challenges, Torres elucidated that MindsDB’s mission revolves around democratizing AI development, rendering the journey from prototype to production accessible to all stripes of developers without requiring specialized AI training.
The platform aims to empower developers to fashion AI applications directly from existing corporate data reservoirs, erasing the barriers to entry and fostering the adoption of an AI-centric paradigm across companies of varying dimensions.
“Our mission to increase AI accessibility within organizations will only grow in importance as AI fundamentally changes the world,” Torres told VentureBeat. “The new funding will enable us to evolve our product to empower even more developers to build the next generation of AI applications.” Streamlining AI application development for citizen developers The company announced the availability of MindsDB Pro, a service offering dedicated GPU-equipped instances for experimentation and deployment of AI/ML projects via the cloud. With over 150,000 open-source installations, MindsDB said that notable companies, including Bytes, Dumuso, JourneyFoods, Progressify, Precise Finance, and Rize have already employed its services to streamline their product development and internal operations.
MindsDB’s Torres emphasized data’s pivotal role in AI/ML and underscored developers’ need to access the most pertinent AI models to catalyze transformative business applications.
Furnishing users with a comprehensive array of ML/AI frameworks, Torres says, eases user success.
“Our partnerships across the database and AI ecosystems enable users to take advantage of MindsDB’s advanced ML from within these platforms, turning their databases into powerful predictive engines,” explained Torres. “We enable integration of all of the different elements of a company’s data stack to be easily input into AI models and then the output of that AI to be put back into a data source. Our platform is the central hub connecting data sources to the most relevant AI models, enabling the creation of useful AI-powered solutions.” Torres said that the open-source developer community has significantly contributed to advancing the company’s mission of democratizing AI development. Initially, the platform incorporated just a handful of data sources. However, in the past year, the potential has exponentially grown, driven by the power of the open-source community to independently construct integrations.
“We’ve taken a bottoms-up approach because many of our new customers of the managed version of MindsDB — MindsDB Pro — discovered us through our partners or from starting with us through our open-source product,” he said. “Now, we are focused on how to provide reliability and stability when scaling our cloud. For SMBs that often lack dedicated ML engineering teams, our managed services offer a user-friendly interface that allows non-experts to leverage machine learning effectively.” Financial services platform Domuso, for example, used MindsDB to create and implement a reliable ML model using MindsDB’s AutoML solution, supported by machine learning experts.
Domuso engineered predictions and transitioned them into live operations, accomplishing this with its existing team and technological resources. MindsDB claims that the move resulted in a $95,000 reduction in chargebacks over a span of two months, potentially yielding savings of approximately $500,000 annually.
“MindsDB seeks to bring AI development closer to data, simplifying the generation of AI while bridging the gap between AI and the necessary data to unlock its potential,” Torres told VentureBeat. “We’re dedicated to eliminating the complexities of managing multiple AI frameworks. With our unified platform, organizations can seamlessly execute and automate a variety of AI frameworks through our platform’s extensive array of integrations.” Lead investor NVentures joins a consortium of existing investors, including Benchmark, Mayfield, Y Combinator, OpenOcean, and Walden Catalyst in this new funding endeavor.
"
|
2,524 | 2,023 |
"Microsoft unveils next-gen AI solutions to boost frontline productivity amid labor challenges | VentureBeat"
|
"https://venturebeat.com/ai/microsoft-unveils-next-gen-ai-solutions-to-boost-frontline-productivity-amid-labor-challenges"
|
Microsoft unveils next-gen AI solutions to boost frontline productivity amid labor challenges
View of a Microsoft logo on March 10, 2021, in New York.
Microsoft today unveiled a suite of tools and integrations designed to empower frontline workers across the globe. Central to this release is an innovative Copilot offering, which harnesses the capabilities of generative AI to enhance the efficiency and effectiveness of service professionals on the frontline.
The tech giant underscores the considerable magnitude of this workforce, estimating their global count at 2.7 billion, more than twice the number of desk-based workers. These individuals perform diverse roles, from customer-facing associates to dedicated healthcare providers and operational stalwarts who navigate on-site tasks.
Microsoft says that over 60% of these workers grapple with monotonous tasks that detract from more meaningful endeavors. Confronted by mounting challenges stemming from labor shortages, skill gaps and supply chain disruptions, frontline workers have been increasingly tackling complex work demands.
To address these concerns, Microsoft aims to equip frontline workers with the necessary technological support and resources.
An AI-driven frontline Copilot Key among the new tools is the Copilot integrated into Dynamics 365 Field Service to assist frontline service managers and technicians. Microsoft says the generative AI-driven tool optimizes workflow by automating repetitive tasks — creating work orders, for example.
Other integrations within Microsoft 365 further enhance these capabilities. Microsoft said that service managers will gain the ability to generate, schedule and oversee work orders directly within their workflow in Microsoft Outlook and Microsoft Teams. Simultaneously, frontline technicians will be able to access vital work order information through Teams.
The company also unveiled a new Dynamics 365 Field Service “mobile experience” enabling frontline technicians to cut down on the number of taps for key tasks. This includes Dynamics 365 Guides integration, to provide technicians with step-by-step instructions for tasks, and access to Dynamics 365 Remote Assist, to problem-solve with remote experts in real time using 3D spatial annotations.
“We believe investment in technology for frontline workers will drive positive outcomes for employees, customers and their businesses. Technology can relieve pressures on the frontline that are causing burnout as well as help organizations drive engagement and a sense of belonging that can help increase retention,” Charles Lamanna, CVP of business applications and platform at Microsoft, told VentureBeat.
“Today’s announcements,” he added, “are the first steps we are taking to infuse next-gen AI and data with productivity tools like Dynamics 365 Field Service to help address the challenge of repetitive tasks and burnout. The new AI-powered Copilots use generative AI to automate the repetitive and taxing digital overhead that burdens frontline workers.” Aiding frontline productivity with generative AI Lamanna contends that AI and process automation can alleviate the burden of essential yet exhaustive procedures for frontline workers, enabling them to render swifter, well-informed choices.
He says the new Copilot within Dynamics 365 Field Service allows frontline managers, who receive service inquiries via email, to use the assistant to streamline work order creation directly within Outlook.
Copilot will auto-populate pertinent data, including customer escalation summaries, into draft work orders within their workflow. Once saved, these work orders can be synchronized with Dynamics 365 Field Service.
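The general pattern behind drafting a work order from an inbound email can be sketched as follows. The complete helper is a hypothetical stand-in for whatever model endpoint is used and is not Microsoft's Copilot API; the email, field names and canned output are invented for illustration.

```python
# Hypothetical sketch of email-to-work-order drafting; not Microsoft's API.
import json

def complete(prompt: str) -> str:
    # Placeholder for a call to a large language model endpoint.
    return json.dumps({"customer": "Contoso Ltd.", "asset": "HVAC unit 7",
                       "issue_summary": "Unit not cooling; error code E4",
                       "priority": "high"})

email_body = ("Hi team, our HVAC unit 7 stopped cooling this morning and is "
              "showing error code E4. We need someone on site ASAP. - Contoso")

prompt = ("Extract a draft field-service work order from the email below. "
          "Return JSON with keys: customer, asset, issue_summary, priority.\n\n"
          + email_body)

draft_work_order = json.loads(complete(prompt))
print(draft_work_order)  # a manager would review the draft before syncing it
```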
“With updates coming soon, Copilot will streamline technician scheduling by offering data-driven recommendations based on travel time, availability, skillset and other factors as well as accelerate responses to customer messages by summarizing key details and next steps in email drafts,” explained Lamanna. “Copilot will also become available to assist frontline managers in their flow of work within Microsoft Teams.” Through Copilot in Dynamics 365 Field Service, augmented by Teams collaboration and Dynamics 365 Remote Assist’s mixed reality, Lamanna said that frontline workers will be able to maintain contact with their entire team, ensuring punctual project completion in adherence to timelines.
Teamwork Within Microsoft Teams, technicians can now receive and share updates and engage with specialists for remote assistance. The newly introduced Dynamics 365 Field Service (Preview) application within Teams will surface essential work order information to frontline technicians and make it easily accessible from their home screen.
“Frontline technicians will be able to see upcoming work orders as Tasks, click in to see key details such as location, and easily make updates that sync to Dynamics 365 Field Service,” Lamanna told VentureBeat. “Our new Dynamics 365 Remote Assist app in Teams mobile will enable problem-solving in real time with remote experts using 3D spatial annotations that lock to the physical world.” Microsoft said that the new 365 Copilot enhancements will harness data sourced from an upcoming Shifts plugin within Teams. This will include user and company data, enlisting information from Teams chat, SharePoint, emails and other sources to extract insights.
Frontline managers will be able to source the most recent corporate resources for facilitating the onboarding of new personnel through SharePoint. Additionally, data harnessed from the Shifts plugin within Teams will empower workers to assess available shifts and gain greater visibility into outstanding tasks relevant to their team and location. This visibility is informed by the history of Teams chat and emails.
The company noted that Microsoft 365 E3 and E5, as well as business standard/premium subscriptions, are prerequisites for customers to avail themselves of the benefits of Copilot.
"
|
2,525 | 2,023 |
"Itseez3D launches Avatar SDK Deep Fake Detector to bolster user identity protection | VentureBeat"
|
"https://venturebeat.com/ai/itseez3d-launches-avatar-sdk-deep-fake-detector-to-bolster-user-identity-protection"
|
Itseez3D launches Avatar SDK Deep Fake Detector to bolster user identity protection
Digital identity and 3D graphics company Itseez3D today announced the launch of the Avatar SDK Deep Fake Detector, a platform that aims to enable businesses to fortify user security and bolster application integrity.
In response to the increasing prevalence of synthetic avatars and deepfake technology, verifying user identities has become paramount for developing secure and trustworthy applications. Itseez3D said that it recognized this urgency and developed the Avatar SDK Deep Fake Detector to uphold the authenticity of user identities.
The platform empowers facial verification systems and digital identity management platforms with tools to combat the escalating number of fraudulent attempts. By leveraging machine learning (ML) algorithms, the platform can analyze facial features, distinguishing between genuine photos and synthetic 3D avatars.
Trained on real photos and avatar renderings The company claims that traditional deepfake detectors focused solely on detecting images created through neural rendering, or when neural networks directly synthesize images. However, these detectors were insufficient in identifying 3D models rendered with a traditional 3D graphics toolchain.
“Our machine learning approach trains on real photos and avatar renderings to detect deepfakes as opposed to traditional facial recognition systems that mostly look at the inner part of the face (the area surrounding the eyes, nose, and mouth),” Victor Erukhimov, founder and CEO of Itseez3D, told VentureBeat. “We analyze the whole head image, including hair and neck.” Avatar SDK aims to ensure that only legitimate users gain access to a given platform by identifying inconsistencies and markers characteristic of deepfakes.
Compatible with diverse platforms Erukhimov stated that the company has packaged Avatar SDK as a Docker container, facilitating its integration into enterprise applications and deployment on organizational servers. Moreover, the SDK is compatible with diverse platforms, including social networking apps, e-commerce platforms and immersive gaming platforms.
“We have deployed our tech as a Docker container, as it enables our customers to inculcate the solution in the same cloud where data is processed, addressing the privacy concerns,” Erukhimov told VentureBeat. “The Deep Fake Detector analyzes all data in the customer’s cloud, ensuring data doesn’t leave the customer’s storage environment.” Tackling the rise of deepfakes through machine learning Erukhimov revealed that an incident in early January 2023 triggered the product’s inception. The company noticed an unusual surge in traffic to their avatar creation demo from Bangladesh.
Malicious operators were exploiting the demo via YouTube videos to bypass Bangladesh’s facial verification system for the National Identity Card (NID). Although the avatars were not hyper-realistic, they managed to deceive detection systems, raising concerns about potential voter fraud during the upcoming presidential elections.
In response, Itseez3D took proactive measures by blocking Bangladesh’s IPs, notifying the government, and offering a free avatar deepfake detector.
Recognizing the significance of this solution for other organizations, Itseez3D subsequently developed the Avatar SDK.
“We believe that digital identity is very important in today’s world where so much is based on digital verification, including voting and online payments,” said Erukhimov. “Reconstruction of avatars from multiple images or videos helps create a geometrically accurate avatar that malicious actors use today to access bank accounts. Our Deep Fake Detector provides a detection accuracy of over 99% with a false alarm rate under 2% to mitigate such issues and protect confidential data.” What’s next for Itseez3D? Itseez3D stated that it is presently working on creating human-like game-ready avatars from selfies. Additionally, it has formed partnerships with VR developers including Reallusion and Spatial and VR games like Drunkn Bar Fight, to incorporate these avatars into their products.
“We are now working on the next generation of avatars created from selfies that we call MetaPerson,” Erukhimov told VentureBeat. “It takes under a minute to create an avatar that looks like you through a selfie. We are already working with a few existing customers on the integration and believe that these avatars will enable many use cases, including being yourself in an AR/VR game/experience, Metaverse, e-commerce and more.”
"
|
2,526 | 2,023 |
"Insilico Medicine’s generative AI tool inClinico achieves high accuracy in predicting clinical trial outcomes | VentureBeat"
|
"https://venturebeat.com/ai/insilico-medicines-generative-ai-tool-inclinico-demonstrates-high-accuracy-in-predicting-clinical-trial-outcomes"
|
Insilico Medicine’s generative AI tool inClinico achieves high accuracy in predicting clinical trial outcomes
Insilico Medicine, a clinical-stage AI drug-discovery company, announced a notable milestone: It successfully predicted Phase II to Phase III clinical trial outcomes using its proprietary generative transformer-based AI tool, inClinico.
Approximately 90% of drug candidates that enter clinical development ultimately fail, owing to issues such as lack of efficacy, safety concerns and the intricacies of diseases and data. These failures lead to trillions of dollars lost and years of effort wasted. In response to this immense failure rate, Insilico developed the generative AI software platform inClinico to forecast the outcomes of Phase II clinical trials.
The platform incorporates various engines that harness the power of gen AI and multimodal data, encompassing text, omics, clinical trial design and small molecule properties. Its training data includes more than 55,600 unique Phase II clinical trials from the past seven years.
The subsequent clinical trial probability model, developed by Insilico researchers, demonstrated an impressive 79% accuracy when validated against real-world trials in the prospective validation set where measurable outcomes were available.
AI revolutionizing drug development The research, published in the Clinical Pharmacology and Therapeutics journal, showcases the potential of AI to revolutionize drug development and investment decision-making.
The company said that AI engines used in this study have been integrated into the inClinico system, designed to predict clinical trial outcomes. This integration is a key component of the Medicine42 clinical trials analysis and planning platform.
“AI offers an enormous advantage when it comes to processing and analyzing complex data and recognizing patterns,” Alex Zhavoronkov, founder and CEO of Insilico Medicine, told VentureBeat. “Using machine learning and AI, we built models based on various data points related to successfully launched and failed drugs. We then combined these models into our prediction engine inClinico. For every evaluated Phase II trial, inClinico generates a probability of success for proceeding to Phase III.” Zhavoronkov said the validation studies were conducted internally and in collaboration with pharmaceutical companies and financial institutions, demonstrating the robustness of the inClinico platform. On a quasi-prospective validation dataset, the platform achieved an impressive ROC AUC score of 0.88, a measure of its capability to discriminate between success and failure in clinical trial transitions.
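For readers unfamiliar with the metric, ROC AUC summarizes how well a score separates trials that advanced from those that did not, and it can be computed with a standard library call. The labels and probabilities below are synthetic, chosen only so the toy example evaluates to 0.88; they are not Insilico's data.

```python
# Toy illustration of the ROC AUC metric with synthetic data (not Insilico's).
from sklearn.metrics import roc_auc_score

# 1 = trial advanced from Phase II to Phase III, 0 = it did not (synthetic labels)
actual_outcomes = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]
predicted_probabilities = [0.81, 0.30, 0.66, 0.72, 0.45, 0.12, 0.33, 0.35, 0.50, 0.90]

print(roc_auc_score(actual_outcomes, predicted_probabilities))  # 0.88 for this toy data
```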
The company claims that the platform’s accurate predictions were tested with a date-stamped virtual trading portfolio, resulting in a 35% return on investment (ROI) over nine months, making it a valuable tool for investors seeking critical technical due diligence insights.
Leveraging generative AI for drug development and discovery Insilico’s Zhavoronkov said that his research group created the starting dataset of Phase II clinical trial data from 55,653 trials pulled from clinicaltrials.gov and various other public sources, including pharma press releases and publications.
This data had to be properly labeled, annotated and linked together, a task performed by biomedical experts, a discriminative transformer and a generative large language model.
A transformer system then mapped these trials to drugs and diseases using a natural language processing (NLP) pipeline based on the state-of-the-art Drug and Disease Interpretation Learning with Biomedical Entity Representation Transformer (DILBERT), which was published at the ECIR 2021 conference.
Zhavoronkov stated that the pharma industry traditionally relied on fundamental academic research and serendipity to generate new ideas and hypotheses. However, the high failure rate indicates that the complexity of diseases and biological mechanisms make it exceedingly challenging to identify successful targets for treating diseases, especially novel targets.
Revealing insights, potential treatments Zhavoronkov asserts that incorporating AI into analyzing large, diverse datasets can reveal insights about disease mechanisms and potential treatments that may not be evident to humans. PandaOmics is part of inClinico and assimilates vast amounts of data from clinical trials, drugs and disease information to predict the likelihood of success or failure during the Phase II to Phase III transition.
PandaOmics utilizes various data types such as omics data, grants, clinical trials, compounds and publications to analyze and produce a ranked list of potential targets specific to a disease of interest.
“PandaOmics is a knowledge graph for target identification through which our generative AI platform can find connections between clinical trial success or failure, disease conditions and drug attributes that might elude human scientists,” Zhavoronkov told VentureBeat. “Using this data, we built our model for predicting the Phase II clinical trial probability of success, defined as the transition of drug-condition pair from Phase II to Phase III.” Enhanced predictive capabilities Insilico Medicine has been training inClinico on clinical trials, drugs and diseases since 2014, said Zhavoronkov, who emphasized that by combining multimodal LLMs and other gen AI technologies , the company has significantly enhanced its predictive capabilities.
As a result, inClinico now serves as a tool to guide companies in directing their research funds and expertise toward programs with the highest likelihood of success while enabling them to capture and utilize valuable information from programs that have faced setbacks.
“The ability of inClinico to predict the successful Phase II to Phase III transition drugs, even without prior information related to the clinical relevance of the drug’s action of disease, validates the generative AI models and their ability to build on existing data to predict outcomes for diseases where fewer data is available,” Zhavoronkov explained. “The more data it has, and the more successful outcomes, the better AI becomes at accurate prediction.” What’s next for Insilico? Zhavoronkov expressed strong encouragement regarding the findings, while also acknowledging their basis within a limited dataset. He firmly believes that the system’s sophistication and precision will continuously improve over time, driven by a surge in data and reinforcement, including insights from Insilico’s internal pipeline programs — three of which (for idiopathic pulmonary fibrosis, cancer and COVID-19) have successfully advanced to clinical trials.
Insilico projects that approximately 20 to 25% of trials can be predictably assessed using the inClinico tool with meaningful accuracy. The company aspires to expand its capabilities further, leveraging new laboratory robotics advancements to predict success rates for combination therapies and facilitate the selection of the most effective combinations for targeted therapies.
“We integrate cutting-edge technological breakthroughs into our platform, incorporating AI-powered robotics, AlphaFold and quantum computing,” Zhavoronkov explained. “My grand goal is to see this tool deployed extensively because broader usage will drive further improvement. We employ an approach called Reinforcement Learning from Expert Feedback (RLEF), where the tool’s accuracy improves with the insights we receive from analysts using it for predictions. Currently, we can only predict small molecule first-in-class single-agent targeted therapeutics.”
"
|
2,527 | 2,023 |
"Bud Financial launches Bud.ai, a generative AI platform for hyper-personalized banking | VentureBeat"
|
"https://venturebeat.com/ai/bud-financial-launches-bud-ai-generative-ai-platform-hyper-personalized-banking"
|
Bud Financial launches Bud.ai, a generative AI platform for hyper-personalized banking
AI-driven financial platform Bud Financial (Bud) today announced the launch of Bud.ai, a generative AI platform that aims to empower banks and financial services organizations to enhance their customer engagement.
According to the company, the latest advancement is an improved logic core that generates real-time insights for consumer and corporate users. Bud.ai aims to enable developers, marketers and risk professionals to integrate profound financial insights into their workflows, enhancing customer engagement through meaningful hyper-personalization and matching consumers with the right financial products.
The company asserts that by integrating its large language model (LLM) technology, financial services organizations can unlock the potential of their vast amounts of unstructured data, gaining clearer insights into individual financial positions or a detailed view of a bank’s portfolio, all through a single integration.
Bud also announced that it is expanding its product suite with Jas, a product built on the Bud.ai core that will offer a personalized generative chat interface, giving consumers access to a fully trained AI assistant.
Bud claims the assistant is adept at aiding various facets of the financial journey, from pinpointing suitable credit products to offering financial planning guidance.
“Bud has developed foundational language models using banks’ first-party transaction data and consumer-permissioned, third-party open banking data over the years. [With] the new logic core on top of the existing language models, Bud.ai generates real-time insights for consumer and corporate users,” Ed Maslaveckas, founder and CEO of Bud, told VentureBeat. “Our new generative AI assistant, Jas, helps consumers and banks make sense of their financial data through conversation.” Bud said that the enhanced AI core is now the backbone of its two flagship products: Assess, a lending and affordability solution, and Engage, a money management and personalization solution.
“Clients using Engage have seen a 20% increase in overall engagement within their applications and, in some instances, a 20% increase in the likelihood to take out a new product on their platform,” Bud’s Maslaveckas told VentureBeat. “Likewise, when utilizing Assess, our clients have seen up to an 80% increase in operational efficiency when processing new loan applications, along with a 20% reduction in missed loan payments.” Leveraging generative AI for transactional data intelligence Maslaveckas claims that Bud’s LLM technology will greatly aid financial institutions in converting extensive unstructured data into a transparent portfolio overview, allowing for granular analysis of individual financial positions.
“It enables truly data-driven decision-making and helps to communicate the right product, to the right customer, at the right time. Moreover, it enables financial services leaders to see signs of potential delinquency before they happen so that they can protect against lost revenue,” he explained. “We have integrated Bud.ai over our existing product sets so our clients can begin using it easily through a set of APIs. In the next six months, our clients will be able to also access these products through a no-code solution.” The company said its new generative chat interface, Jas, uses Google’s PaLM 2 LLM, allowing the platform to give consumers transactional data intelligence and intuitive insights through conversational AI programs.
“Through Jas, users gain personalized recommendations and guidance regarding credit products, financial planning and other monetary concerns. The integration of Google’s PaLM 2 language model ensures that the chat interface adeptly comprehends and processes natural language queries, resulting in an intuitive user experience,” said Maslaveckas. “By merging advanced AI technology with transactional data intelligence, our goal is to provide invaluable assistance to users, enabling them to navigate their finances confidently and lucidly.” Maslaveckas added that Jas could function as an “action-bot,” fostering customer engagement by extending recommendations and carrying out tasks on customers’ behalf.
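The pattern described here, grounding a conversational answer in a user's transaction data, can be sketched roughly as follows. The ask_llm helper is a hypothetical stand-in for the model behind Jas (such as PaLM 2), and the transactions and canned answer are invented for illustration.

```python
# Hypothetical sketch of grounding a chat answer in transaction data;
# `ask_llm` stands in for a real model call and the data is invented.
transactions = [
    {"merchant": "Grocer & Co", "amount": -54.20, "category": "groceries"},
    {"merchant": "City Transit", "amount": -3.10, "category": "transport"},
    {"merchant": "Grocer & Co", "amount": -61.75, "category": "groceries"},
]

def ask_llm(prompt: str) -> str:
    # Placeholder for a call to the underlying language model.
    return "You spent 115.95 on groceries this week, mostly at Grocer & Co."

spend_by_category = {}
for txn in transactions:
    spend_by_category[txn["category"]] = spend_by_category.get(txn["category"], 0.0) - txn["amount"]

question = "How much did I spend on groceries this week?"
prompt = (f"User question: {question}\n"
          f"Spending summary by category: {spend_by_category}\n"
          "Answer briefly and accurately.")
print(ask_llm(prompt))
```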
“The challenge today is making sure the actions that go live protect the customer, so setting and adjusting simple budgets and savings goals will most likely be first out of the gate,” he said. “In addition, we see great opportunities to improve overall operational efficiency. By partnering with Bud, financial institutions can deliver proven front-of-the-market solutions without the risk of expensive internal R&D expenditure that lacks guaranteed results.” What’s next for Bud? Bud said that it ensures the reliability and security of its generative AI capabilities through dedicated product and technical efforts that prioritize data security and ethics.
The company said it ensures its commitment to responsible AI development by furnishing dependable and secure generative AI capabilities that uphold the integrity of sensitive financial data. It achieves this by implementing measures and protocols designed to safeguard customer information.
Maslaveckas highlighted the company’s forthcoming objectives, which include developing potential expansion avenues for Bud.ai and associated AI products. He said the focus will remain on continually refining Bud’s AI to cater to lenders and customers, augmenting financial wellbeing, and broadening credit accessibility.
“We are exploring the development of specialized AI solutions tailored for specific areas such as risk management, fraud detection and customer support, with the ultimate goal of driving operational efficiency and customer/client outcomes,” Maslaveckas told VentureBeat. “We also look forward to the continued growth and evolution of our partnership with Google, a world-class partner to pave the future of financial services with.”
"
|
2,528 | 2,023 |
"Baidu's ERNIE 4.0 turns one person into 'an AI marketing team' | VentureBeat"
|
"https://venturebeat.com/ai/baidu-launches-new-llm-ernie-4-0-that-turns-one-person-into-an-ai-marketing-team"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Baidu launches new LLM ERNIE 4.0 that turns one person into ‘an AI marketing team’ Share on Facebook Share on X Share on LinkedIn Credit: VentureBeat made with Midjourney Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
The generative AI race is not letting up. This week, Chinese search and web portal giant Baidu announced the release of ERNIE 4.0 , its new large language foundation model, and previewed a whole host of new software applications built atop it, including one — Qingduo, a creative platform that aims to rival Canva and Adobe Creative Cloud , both of which have recently also added new AI features.
Taking the stage at the company’s annual Baidu World 2023 conference in Beijing, Baidu CEO Robin Li said Qingduo allows “one person to become an AI marketing team.” Li demonstrated the generative capabilities of ERNIE 4.0 on-site: within a few minutes, ERNIE Bot rapidly generated a set of advertising posters, five lines of advertising copy, and a marketing video. Li said ERNIE 4.0 and the user-facing text chatbot, ERNIE Bot — similar to how rival OpenAI’s ChatGPT is the chatbot built atop the company’s GPT foundation model — were built to deliver four core capabilities: understanding, generating, reasoning, and memorizing.
In live demos, Li showcased ERNIE Bot’s prowess in understanding complicated and jumbled human requests and responding to them.
Beyond that, it generated a range of content, such as text, images, and videos, in just a few minutes. It even solved complex geometry problems and adapted to new information in real-time while writing a story. “These four core capabilities form the foundation of AI-native applications,” Li said.
Return to the stage The announcement of ERNIE 4.0 was the centerpiece of Baidu World 2023, the search giant’s first in-person gathering after a four-year hiatus during which the event ran online-only due to the COVID-19 pandemic.
In press materials provided to VentureBeat and other journalists, Dr. Haifeng Wang, Baidu’s Chief Technology Officer, reported that ERNIE 4.0 had improved its overall performance by nearly 30% since its beta testing in September. The new model is available in an invitation-only beta at the moment.
Infusing AI across the product suite Taking a page from Microsoft’s playbook, Baidu further revealed how ERNIE and other AI technology was coming to all its apps.
Baidu Search provides more in-depth information and offers an interactive chat interface for complicated queries.
Like Microsoft Bing Chat and Google Bard, it will “aggregate and summarize information from diverse web sources and present a consolidated answer.” However, Baidu Search also seeks to go further, displaying “text, image and dynamic graph, ensuring answers that are both vivid and concise.” Baidu GBI, the fruit of a recent healthcare data acquisition, is aimed at business intelligence, and can rapidly generate custom business analytics. Baidu’s press materials describe GBI as “an all-knowing, fast-reacting assistant in the business world” that provides fast summaries of information and “reduces the heavy lifting of having to go through multiple spreadsheets to retrieve and analyze data.” Infoflow, a workplace app, uses AI to complete office tasks, from scheduling meetings to booking travel, with awareness of people’s calendars and availability. It also summarizes meetings and chat conversations with key points.
Baidu Wenku , previously a straightforward document-sharing platform, can now be used to conduct research and provide summaries, as well as generate content including documents and slideshows.
Baidu Maps has been upgraded with an “AI guide [that] will take the initiative to suggest next step of action to users during their trip.” And not to be left out, Baidu Drive, the company’s cloud file storage solution, is getting its own GenAI assistant, YunYiduo, which Baidu claims is the “world’s first cloud drive intelligent assistant.” It sounds impressive, allowing users to interact with text or voice to pull up specific files or even information within them, summarizing multiple files or segments of a video, for example.
"
|
2,529 | 2,023 |
"Arthur unveils Bench, an open-source AI model evaluator | VentureBeat"
|
"https://venturebeat.com/ai/arthur-unveils-bench-an-open-source-ai-model-evaluator"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Arthur unveils Bench, an open-source AI model evaluator Share on Facebook Share on X Share on LinkedIn Credit: VentureBeat made with Midjourney Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
New York City-based artificial intelligence (AI) startup Arthur has announced the launch of Arthur Bench, an open-source tool for evaluating and comparing the performance of large language models (LLMs) such as OpenAI ‘s GPT-3.5 Turbo and Meta’s LLaMA 2.
“With Bench, we’ve created an open-source tool to help teams deeply understand the differences between LLM providers, different prompting and augmentation strategies and custom training regimes,” said Adam Wenchel, CEO and cofounder of Arthur, in a press statement.
How Arthur Bench works Arthur Bench allows companies to test the performance of different language models on their specific use cases. It provides metrics to compare models on accuracy, readability, hedging and other criteria.
For those who have used LLMs on more than a few occasions, “hedging” is an especially noticeable issue — that’s where an LLM provides extraneous language summarizing or alluding to its terms of service, or programming constraints, such as saying “as an AI language model…,” which is typically not germane to a user’s desired response.
“Those are some of the subtle differences of behaviors that may be relevant for your particular application,” Wenchel said in an exclusive video interview with VentureBeat.
Arthur has included a number of starter criteria upon which to compare LLM performance, but because the tool is open source, enterprises using it may add their own criteria to fit their needs.
“You can grab the last 100 questions your users asked and run them against all models. Then Arthur Bench will highlight where answers were wildly different so you can manually review those,” explained Wenchel, adding that the goal is to help enterprises make informed decisions when adopting AI.
Arthur Bench accelerates benchmarking and translates academic measures into real-world business impact. The company uses a combination of statistical measures and scores, as well as assessments by other LLMs, to grade the responses of candidate LLMs side by side.
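As a rough illustration of that kind of side-by-side evaluation (this is not the actual Arthur Bench API, whose interfaces differ), the sketch below runs the same prompts through two model callables and flags prompts where the answers diverge sharply, using a crude token-overlap score as a stand-in for Bench's statistical and LLM-based graders.

```python
# Conceptual harness only -- not the arthur-bench library API.
# `model_a` and `model_b` stand in for any callables that take a prompt
# string and return a response string (e.g., wrappers around LLM providers).

def token_overlap(a: str, b: str) -> float:
    """Crude similarity: fraction of shared lowercase tokens (0..1)."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / max(len(ta | tb), 1)

def compare(prompts, model_a, model_b, threshold=0.3):
    flagged = []
    for p in prompts:
        ra, rb = model_a(p), model_b(p)
        score = token_overlap(ra, rb)
        if score < threshold:  # answers are "wildly different" -- review by hand
            flagged.append({"prompt": p, "a": ra, "b": rb, "overlap": score})
    return flagged

# Toy stand-ins so the sketch runs end to end.
model_a = lambda p: "The refund window is 30 days from purchase."
model_b = lambda p: "As an AI language model, I cannot access your order."

for item in compare(["What is the refund window?"], model_a, model_b):
    print(f"REVIEW ({item['overlap']:.2f}): {item['prompt']}")
```

A real setup would swap the toy similarity function for task-appropriate scorers (exact match, semantic similarity, or an LLM judge), which is the gap Bench itself is meant to fill.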
Arthur Bench in action Wenchel said financial-services firms have already been using Arthur Bench to generate investment theses and analyses more quickly.
Vehicle manufacturers have taken their equipment manuals with many pages of highly specific technical guidance and used Arthur Bench to create LLMs that are capable of answering customer queries while sourcing information from said manuals quickly and accurately, all while reducing hallucinations.
Another customer, the enterprise media and publishing platform Axios HQ, is also using Arthur Bench on its product-development side.
“Arthur Bench helped us develop an internal framework to scale and standardize LLM evaluation across features, and to describe performance to the Product team with meaningful and interpretable metrics,” said Priyanka Oberoi, staff data scientist at Axios HQ, in a statement to VentureBeat.
Arthur is open-sourcing Bench so anyone can use and contribute to it for free. The startup believes an open-source approach leads to the best products, with opportunities to monetize through team dashboards.
Collaborations with AWS and Cohere Arthur also announced a hackathon with Amazon Web Services (AWS) and Cohere to encourage developers to build new metrics for Arthur Bench.
Wenchel said AWS’s Bedrock environment for choosing between and deploying a variety of LLMs was “very philosophically aligned” with Arthur Bench.
“How do you rationally decide which LLMs are right for you?” Wenchel said. “This complements the AWS strategy very well.” The company launched Arthur Shield earlier this year to monitor large language models for hallucinations and other issues.
Correction, Aug. 17: The author mistakenly stated that Arthur was based in San Francisco. The story has been updated and corrected. We regret the error.
"
|
2,530 | 2,023 |
"Amplitude taps AI to improve data quality, accelerate product analytics | VentureBeat"
|
"https://venturebeat.com/ai/amplitude-taps-ai-to-improve-data-quality-accelerate-product-analytics"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Amplitude taps AI to improve data quality, accelerate product analytics Share on Facebook Share on X Share on LinkedIn Amplitude Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
San Francisco-based Amplitude , the product analytics player that helps companies study consumer behavior to make necessary optimizations and drive better returns, today expanded its core platform with new AI smarts, namely new features called Data Assistant and Ask Amplitude.
The capabilities use large language models (LLMs) to help enterprises consistently measure their data quality and more quickly go from business questions to insights. It marks the latest generative AI effort in the data and analytics space, following similar moves from players like Databricks and Akkio.
“At Amplitude, we want to bring the power of AI and LLMs to everyone who builds products. First in the form of features like Ask Amplitude that help our customers ask questions and learn more quickly from their product data, but eventually in the form of a whole suite of product infrastructure and tools that make building AI products easy,” Joseph Reeve, software engineering manager at Amplitude, said in a blog post.
The offerings are currently being tested and can be tried out by all customers starting today, the company said.
Measuring data quality with Data Assistant As the company explained, the new capabilities target two specific areas within Amplitude: data governance and chart preparation.
The Data Assistant looks at different events being tracked within Amplitude and considers a combination of factors — including the number of queries on each data point and the event volume — to determine an overall data quality score.
Once the score is ready, it evaluates each event against certain pre-defined best practices to come up with automatic bite-sized suggestions — prioritized in terms of impact — to improve quality and make the datasets clearer and more consistent.
For instance, the assistant could suggest grouping similar events into categories, helping other Amplitude Analytics users find the event they’re looking for more easily. Or it could recommend adding descriptions powered by OpenAI’s LLMs to high-priority events.
The user just has to accept the suggestions to have them applied and improve their data governance posture.
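A toy version of that kind of scoring-and-suggestion logic might look like the following. The weights, thresholds and field names here are invented for illustration; Amplitude has not published how Data Assistant actually computes its score.

```python
# Illustrative heuristic only -- not Amplitude's actual scoring model.
events = [
    {"name": "checkout_completed", "queries_30d": 420, "volume_30d": 90_000, "has_description": True},
    {"name": "ClickedBtn2", "queries_30d": 1, "volume_30d": 1_200_000, "has_description": False},
]

def quality_score(e):
    """Reward events that are both queried and documented; scale 0-100."""
    usage = min(e["queries_30d"] / 100, 1.0)        # is anyone actually using it?
    documented = 1.0 if e["has_description"] else 0.0
    return round(100 * (0.6 * usage + 0.4 * documented))

def suggestions(e):
    out = []
    if not e["has_description"]:
        out.append("Add a description so analysts can find and trust this event.")
    if e["queries_30d"] < 5 and e["volume_30d"] > 1_000_000:
        out.append("High volume but rarely queried: consider archiving or renaming.")
    return out

for e in events:
    print(e["name"], quality_score(e), suggestions(e))
```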
Ask Amplitude accelerates product analytics With Ask Amplitude, on the other hand, the platform is simplifying analytics consumption for its users. Until now, people using Amplitude had to construct charts step-by-step in the UI. The process was faster than writing SQL queries but still took some time.
Ask Amplitude is addressing this gap by giving users the ability to go from questions to insights right away. The user just has to ask their question in plain natural language.
“When you ask a question like, ‘Which of my videos has the highest conversion rate from watching to subscribing on iOS?’, the goal of Ask Amplitude is not just to tell you what the latest viral videos are. It’s to teach you how to build a funnel analysis; which events in your taxonomy represent watching videos and subscribing; and which properties contain the video and platform information. The resulting chart is a foundation of knowledge for you to build on and answer all sorts of follow-up questions on your own,” Reeve explained in the blog post.
As the user puts in a query, the offering first uses semantic search to check if an existing chart within the platform answers the question. If not, it uses a series of LLM prompts to convert the question into a JSON definition that can be passed to the company’s custom query engine and render a chart.
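In rough Python, that routing step could be sketched as below. The `embed` and `ask_llm_for_chart_json` functions are hypothetical placeholders and the JSON chart definition is invented; Amplitude's real schema and query engine are internal.

```python
# Sketch of the "reuse a chart or generate one" routing, under assumptions noted above.
import json
import numpy as np

def embed(text: str) -> np.ndarray:
    """Hypothetical placeholder for an embedding-model call."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.random(8)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

saved_charts = {"iOS video funnel": embed("conversion from watch to subscribe on iOS")}

def ask(question: str, threshold: float = 0.9):
    q = embed(question)
    # 1) Semantic search: reuse an existing chart if one is close enough.
    best = max(saved_charts, key=lambda name: cosine(q, saved_charts[name]))
    if cosine(q, saved_charts[best]) >= threshold:
        return {"existing_chart": best}
    # 2) Otherwise ask an LLM to emit a chart definition for the query engine.
    return ask_llm_for_chart_json(question)

def ask_llm_for_chart_json(question):  # hypothetical placeholder
    return json.loads('{"type": "funnel", "events": ["watch_video", "subscribe"], "segment": "iOS"}')

print(ask("Which videos convert best from watching to subscribing on iOS?"))
```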
More to come While Amplitude customers can test out both these features, it is important to note that this is just the beginning. The company plans to build on this effort and launch more AI-powered capabilities to transform how large enterprise teams drive insights from their product data and make business-critical decisions.
“It’s incredible what advancements in AI and LLMs have enabled in such a short time, but we know Ask Amplitude is only the beginning of how this technology will impact the way we do analytics. We’re rethinking the entire experience of how our customers understand their product data from the ground up — and we could not be more excited about the possibilities,” Reeve said.
So far, Amplitude’s product analytics platform has been adopted by more than 2,300 companies, including Atlassian , Instacart, NBC Universal, Shopify and Under Armour.
"
|
2,531 | 2,023 |
"Akkio raises $15M to advance no-code AI platform for businesses | VentureBeat"
|
"https://venturebeat.com/ai/akkio-raises-15m-to-advance-no-code-ai-platform-for-enterprises"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Akkio raises $15M to advance no-code AI platform for businesses Share on Facebook Share on X Share on LinkedIn Credit: VentureBeat made with Midjourney Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Massachusetts-headquartered Akkio, a company offering a no-code platform to help businesses deploy artificial intelligence (AI) in minutes, today announced $15 million in series A funding. The company said it will use the capital to accelerate the commercialization of its platform and deliver an easy-to-use AI assistant to anyone working with data.
“This investment allows us to scale and advance our platform so business analysts can leverage AI technologies to work faster, unlock new insights, and make a bigger impact for their organization,” Abe Parangi, cofounder and CEO of Akkio, said.
The round, which was led by Bain Capital Ventures and Pandome, Inc., takes the total capital raised by Akkio to $18 million. It comes at a time when companies of all sizes are bullish on data and AI and looking for ways to drive maximum business value from them.
How does Akkio help? AI is the need of the hour today, but taking advantage of its capabilities is difficult owing to the engineering complexities involved. Akkio tackles this challenge by providing enterprises with no-code tools to quickly build and deploy AI for tasks like churn reduction, attrition prediction, fraud detection and sales funnel optimization.
Users simply have to choose their dataset and train the neural network. Within minutes, the solution prepares the model, allowing users to ship ML-enabled workflows and features to power internal applications and tools critical to streamlining day-to-day operations and improving business outcomes.
As part of the effort to simplify AI development and deployment, Akkio’s platform also offers AI-driven capabilities that handle certain crucial aspects of working with data. For example, its chat data prep feature allows users to automate data cleaning and preparation (combining columns, summarizing records and performing complex calculations) through natural language chat. This is a critical step before using the data for AI.
Then, with the new Chat Explore capability, data analysts can tap GPT-4 to directly chat with their data, identify patterns and instantly build live charts and visualizations. There’s also a Forecasting model that understands patterns in live data and creates forecasts predicting things like inventory availability and sales and marketing performance.
“From the beginning, we have been laser-focused on…building out an end-to-end solution for working with data. We go from extracting data from the systems in which it is generated or stored, by transforming and analyzing it, to building and deploying ML models. Building the end-to-end system is critical to enabling a self-serve data assistant – our vision is that we are the only tool you need to make better data-driven decisions,” Jon Reilly, COO at Akkio, told VentureBeat.
Strong customer and competition base Currently, Akkio claims that “hundreds of customers,” including Ellipsis Marketing, AngioDynamics and Standard Industries, use its offerings to use AI with their data and improve internal processes.
“We’ve been developing the platform for three years and just started signing up customers in 2022. Our customers range from a 2-person marketing shop to a multi-billion freight management company. This stuff is hard and we’re focused on ease of use. It takes a long time to make complex tech. Now we’re already in the hundreds of customers and this thing is moving,” Reilly added.
The no-code AI development space has been growing gradually, particularly in light of the pandemic and the shortage of data science talent. Other players operating in the same segment are Datarobot, Google AutoML , Obviously AI and Fritz AI.
“While developing, we admired companies like DataRobot who pioneered the space; however, we’re targeting a different segment within it. We’re focused on any small business that wants more actionable value from their data but doesn’t have the data scientist on staff and/or can’t afford hefty consulting fees,” Reilly said.
According to Gartner’s Magic Quadrant report , 65% of application development will take place on no-code and low-code machine learning platforms by 2024.
"
|
2,532 | 2,023 |
"Proton debuts VPN for Business: here's what it includes | VentureBeat"
|
"https://venturebeat.com/security/proton-debuts-vpn-for-business-enabling-it-departments-to-control-access-to-content-and-apps"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Proton debuts VPN for Business, enabling IT departments to control access to content and apps Share on Facebook Share on X Share on LinkedIn Credit: VentureBeat made with Midjourney Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Proton VPN, the virtual private network company created in Switzerland by former scientists at CERN (European Organization for Nuclear Research), today announced a new offering geared toward businesses and organizations.
The aptly named Proton VPN for Business starts at $8.99 per month (on a two-year contract) and enables IT departments to deploy virtual private gateways to control access to content and apps, as well as network segmentation. It also comes with built-in protections against malware , phishing and man-in-the-middle attacks. And it is open-source , allowing experts to independently verify its security claims.
Furthermore, it is designed to support remote and hybrid teams who may need to securely access company databases from around the globe, including on public networks like those at coffee shops, cafes, restaurants, hotels and convention centers, which may on their own leave business users open to malicious activity.
Building upon initial grassroots success Though Proton is already used by more than 50,000 businesses internationally, including Fortune 500 companies, Proton VPN for Business is designed to build upon this initial success (the company was founded in 2017) and create a more standardized, yet flexible, product category for other companies looking to safeguard access to their data.
“Proton VPN’s focus on transparency and security, backed by our open source philosophy, has made Proton VPN one of the world’s most popular and trusted VPN services and in recent years, businesses looking for a VPN have increasingly taken note,” said Andy Yen, founder and CEO of Proton. “As a result, even without a dedicated business offering, we have onboarded thousands of business users in the past couple years, so making the offering official today is a natural next step forward for Proton.” Among the features included with Proton VPN for Business are Proton VPN Accelerator, Stealth protocol, and Alternative Routing.
Private gateways and security compliance Proton says organizations using the new Proton VPN for Business can instantly deploy “private gateways” that are “only accessible to specifically authorized members and groups within an organization, making it easy for businesses to limit and segment access.” No hardware is needed — the company’s solution is entirely software-based for customers.
“This also makes it easier for businesses to meet the requirements of security certifications such as ISO 27001, SOC2, and more,” the company wrote in a press release.
In addition, Proton notes that its home base of Switzerland is advantageous for the company and its customers as it abides by strict privacy laws and a tradition of neutrality in its treatment of data.
Proton also offers a more customizable “Proton VPN Enterprise” tier for organizations with specific needs. This tier has variable pricing depending on the requirements of the VPN setup.
With nearly one third of the web’s 5 billion users using a VPN as of 2023, and the total market size estimated at $44.6 billion, according to DataProt , Proton is savvy to debut a new offering in the space.
"
|
2,533 | 2,023 |
"The Ripple Effect of Regulations: How Policy Is Reshaping the Data Center Landscape | VentureBeat"
|
"https://venturebeat.com/data-infrastructure/the-ripple-effect-of-regulations-how-policy-is-reshaping-the-data-center-landscape"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages The Ripple Effect of Regulations: How Policy Is Reshaping the Data Center Landscape Share on Facebook Share on X Share on LinkedIn Illustration by: Leandro Stavorengo Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Data centers play a significant role in energy consumption, water use and carbon emissions, contributing to the sustainability problem caused by the exponential growth of data. As the demand for accelerated digital transformation increases, the environmental impact of data collection, storage, cloud compute, and artificial intelligence (AI) deployment becomes a concern.
However, with different rules in various jurisdictions, what does this mean for the future of the industry? In a recent discussion, data center expert John Booth, managing director of consultancy Carbon3IT , provided insights on emerging regulations and how they may reshape where workloads are located.
Europe Leads the Way Europe has long been at the forefront of efforts to reduce energy use and emissions. To accomplish this, the European Union (E.U.) passed an updated Energy Efficiency Directive (EED) in the summer of 2023. Under the recast directive, data centers in the E.U. with installed IT power of more than 100 kilowatts will need to publicly report energy performance.
Booth notes the E.U. had implemented a voluntary Code of Conduct for data center efficiency for more than 14 years, but uptake remained low. Only around 550 facilities representing an estimated 140 organizations have participated to date. “Part of the problem was lack of oversight and no real enforcement,” said Booth.
Seeing this as a market failure, the E.U. passed the EED to mandate reporting of energy consumption from large data centers. Facilities must provide metrics like total energy use, renewable energy percentage, water consumption and waste heat reuse to a central registry.
“Part of the overall registration piece will not just be the metrics that are required, but also some information on how you’re actually doing your energy efficiency and energy optimization projects,” says Booth. This will give regulators visibility into current performance and opportunities for improvement.
Once baseline data is collected, Booth expects the E.U. will then incentivize further reductions through subsidies or penalties. Data centers with a Power Usage Effectiveness (PUE) over 1.5, for example, may face fines until that metric is lowered. Others may receive funds to install more efficient equipment.
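For reference, PUE is simply the ratio of total facility energy to the energy delivered to IT equipment, so a facility drawing 1.8 MWh to power 1.2 MWh of IT load sits exactly at the 1.5 mark cited above. The numbers below are made up for illustration.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy."""
    return total_facility_kwh / it_equipment_kwh

print(pue(1800, 1200))        # 1.5 -- right at the threshold Booth mentions
print(pue(1800, 1200) > 1.5)  # False: only facilities above 1.5 would face fines
```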
Regulations Spread Across Borders Booth believes regulations will inevitably become global as concerns over AI’s energy demands grow. The U.S. has taken initial steps.
Several U.S. states are also implementing stricter regulations. In Virginia, pending laws focus on carbon reduction and sustainability through more stringent requirements. Similarly, Oregon has proposed reducing carbon emissions 60% by 2027 for data centers and cryptocurrency mining, imposing fines for noncompliance.
The White House Office of Science and Technology Policy published a report in 2022 on the energy impacts of cryptocurrency that found consumption comparable to conventional data centers. This raises the question of also regulating traditional colocation facilities. In response, Senator Sheldon Whitehouse has proposed draft legislation addressing both crypto and conventional data centers modeled after the EU directive.
In 2016 Lawrence Berkeley National Lab was tasked by the U.S. Senate to study American data center energy use through consultations with stakeholders.
Rather than reinventing the wheel, Booth advised Berkeley during those consultations to adopt the metrics already established in the original EU Directive. With the bulk of data centers located in Europe and North America, aligned regulations between the two regions could help establish consistent sustainability standards worldwide. This would prevent a regulatory race to the bottom as workloads shift between unregulated jurisdictions.
China is also tightening regulations, with strict design PUE limits that must be met to obtain operating permits. As the two largest markets, the E.U. and China are demonstrating regulations can drive sustainability where voluntary programs have fallen short.
Impact on Data Center Siting Differing state and local rules in the U.S. have allowed data centers more flexibility to locate in areas with tax incentives and lax oversight, at least for now. But long term, regulations may reshape site decisions.
Booth notes facilities have gravitated to places like Arizona for short-term benefits like subsidies. However, “I can’t personally believe why you’d want to put a data center in Arizona, to be honest, I just can’t see why you’d think that way.” The hot, dry climate is poorly suited from an efficiency and water usage perspective.
As regulations tighten, locations like Spain also face challenges as water-stressed countries. Booth expects the E.U. will formally designate approved “data center zones” with guaranteed access to renewable energy and without causing water or grid stress.
Rather than viewing regulations solely as constraints, forward-thinking companies see them as drivers of innovation. The groups that proactively reduce emissions and invest in more suitable geographic locations will gain a first-mover advantage over late adopters.
As Booth notes, “If I was a betting man, I’ll be buying land in Canada. I’ll be buying land in Iceland. And the Nordics” for the long-term sustainability and data center location opportunity regulations will foster in those regions. Early implementation of best practices could also influence policymakers as rules continue taking shape globally.
Sustainability will increasingly factor into both public policy and client/investor decisions, shifting the data center landscape in unprecedented ways.
Standards will drive the industry While federal laws may be years away, proactive preparation is key. Data center experts recommend creating a reporting compliance strategy, collecting necessary operational data and increasing workload efficiency. Additionally, engaging with industry groups and developing standardized metrics will smooth future regulatory transitions.
Growing regulatory focus on data center energy use and emissions signals the need for enterprise leaders to carefully track policy developments. Strategically optimizing infrastructure and partnering with experts ensures ongoing compliance and competitive advantage amid this transition. As the digital economy expands reliance on data centers, balancing efficiency, transparency and business objectives becomes ever more important.
"
|
2,534 | 2,023 |
"Lenovo launches new TruScale edge service to bring AI and ML to brick-and-mortar businesses | VentureBeat"
|
"https://venturebeat.com/data-infrastructure/lenovo-launches-new-truscale-edge-service-to-bring-ai-and-ml-to-brick-and-mortar-businesses"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Lenovo launches new TruScale edge service to bring AI and ML to brick-and-mortar businesses Share on Facebook Share on X Share on LinkedIn Credit: VentureBeat made with Midjourney Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
PC giant Lenovo isn’t content with just selling you or your company laptops — it also wants to be the provider of choice for enterprises looking to run machine learning (ML) and AI models right out in the field where they are already doing business, such as at grocery and retail self-checkout kiosks, or even out on fishing trawlers in the ocean.
That’s why today the company is announcing its new “TruScale for Edge and AI,” a new business offering that bundles Lenovo hardware — the ThinkEdge SE455 V3 server with AMD’s EPYC 8004 series processors — with software, specifically “150+ turnkey AI solutions” provided by Lenovo, enabling customers to get up and running with AI/ML no matter their sector.
“When you’re collecting so much data, you can’t drive everything to the cloud,” said Kirk Skaugen, President of Lenovo Infrastructure Solutions, in an exclusive video call interview with VentureBeat. “This is really about bringing AI to the compute where the data is created,” out in the field.
This new “infrastructure-as-a-service,” as Lenovo terms it, is available on a monthly subscription pricing model to businesses that varies depending on their usage — so they only pay for what they use.
The announcement comes amid fast growing competition in the edge AI market: just last week, startup Sima.ai announced its own no-code AI deployment software and hardware for edge devices exclusively in VentureBeat as well.
Examples of the power of Lenovo’s edge AI platform The real-world applications of Lenovo’s new TruScale for Edge and AI are already apparent.
Pesca Azteca, Latin America’s largest tuna fishing fleet, is already using the platform. By deploying ThinkEdge SE455 V3 servers on its commercial fishing boats and running specialized software atop them, the company has gained real-time insights into various operational aspects such as fuel and food consumption.
“Our crews can stock up with the exact supplies needed for every voyage, avoiding shortages and reducing unnecessary expenses while obtaining accurate, real-time information on the amount of fish caught, propelling more efficient, precise and profitable operations,” said Sergio Alcaraz Pérez, IT Infrastructure Manager at Pesca Azteca, in a statement published as part of Lenovo’s press release on the service announcement.
Another positive case study comes from grocery giant Kroger, which is using TruScale for Edge and AI to deploy computer vision ML models on cameras at self-checkout to catch thieves and erroneous scans in progress, cutting down on lost revenue.
“We’ve won close to 3,000 stores in North America across all of the Kroger brands,” Skaugen said.
A server built to be as quiet as it is powerful A big part of Lenovo’s pitch to businesses is that the ThinkEdge SE455 V3 server was designed not only to handle demanding AI workloads on premises, but to do so quietly, making it a versatile solution for a variety of business settings, from noisy heavier industries and retail to more sensitive settings in healthcare and telecom, where noise can be literally detrimental to the success of the business offerings.
“When you talk to healthcare providers, they’re going to put AI next to their MRI machines and their CT scanners and their X-ray machines,” Skaugen related. “You don’t want a patient in that stressful environment to be hearing some screaming noise” from an overworked server.
Skaugen said that the noise mitigation features were carefully designed by Lenovo engineers in Barcelona, Spain, following a deployment of on-street server cabinets for smart city applications. The cabinets emitted very loud noises originally, annoying residents. But Lenovo’s engineers developed proprietary fans and other acoustic masking tech to keep them much quieter, to the point that they are even quieter than competing servers now.
Most importantly, the server aims to unlock data intelligence by enabling businesses to process and analyze data right where it’s generated, especially important in places like the middle of the ocean — or the factory floor — where sending data to the cloud is too slow and unreliable to provide real-time analysis, intelligence, and insights.
Partnering for specialized AI deployments Lenovo is broadening its horizons by entering into strategic partnerships with industry leaders such as AMD, Intel, NVIDIA, and Qualcomm. These collaborations aim to produce specialized AI solutions that can be tailored for various sectors, including manufacturing, healthcare, retail, and public safety. The goal is to leverage cutting-edge technologies like computer vision, audio recognition, and predictive analytics to improve operations and drive business growth.
By focusing on ease of deployment, scalability, and real-world impact, Lenovo is not just selling technology; it’s offering a pathway for businesses to innovate and transform in a data-driven world. With these new services and hardware, Lenovo is taking a giant leap in helping businesses everywhere turn their data into actionable, AI-powered insights, positioning itself as a leader in the next chapter of the AI and edge computing revolution.
"
|
2,535 | 2,023 |
"Data optimization is a must for maximum efficiency | VentureBeat"
|
"https://venturebeat.com/data-infrastructure/data-optimization-is-a-must-for-maximum-efficiency"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages VB Lab Insights Data optimization is a must for maximum efficiency Share on Facebook Share on X Share on LinkedIn This article is part of a VB Lab Insights series paid for by Capital One.
For cloud-based companies, the ability to leverage nearly unlimited amounts of data can unlock possibilities that lead to more innovative products and experiences for customers. But more data coming from more sources can also lead to challenges.
Maybe you can’t find the right data when you need it. Maybe you’re having trouble accessing the data when you do find it. Maybe unlocking valuable insights from your data is requiring far too much processing. And when cloud data platforms charge for compute based on consumption, as most do, inefficiencies can result in unnecessary expenses that make you deprioritize unlocking value from your data.
When data was stored on-premises, you worried less about managing these inefficiencies. There were restraints like limited compute or processing power that prevented you from overspending. These shackles come off in the cloud, and you’re able to scale infinitely, on demand, anytime you need it. This requires companies to focus on data optimization to manage inefficiencies that are a result of this power or risk an increase in costs.
A quick solution would be to tightly control how employees access and use data. But that can limit the speed at which a business needs data to generate insights and make informed decisions. The smart answer is to give employees access to relevant, high-quality data to help fuel innovation, and focus on strategic optimization to efficiently manage that data.
It’s about balancing cost and performance Prior to optimizing, you need to evaluate what cloud data platform is the right fit for your use case. Once you have decided, think about potential cost implications for the use case (i.e., storage costs, loading costs, computing power) and balance this against the potential value you can derive from that use case. If it’s a fit, you can then focus on managing the inefficiencies.
There will be inefficiencies even when you have made the right decision for your use case. Not everyone in your cloud data platform will be an expert in executing a use case most efficiently, so it’s important to make teachers out of a teaching moment. Focus on visibility, alerting and recommendations. This means providing users with visibility into potential inefficiencies, alerting users to inefficiencies quickly and generating recommendations for optimizing the inefficiency — all while making it a teaching moment.
On the optimization front, there are four areas of data optimization that will help you operate more efficiently, scale faster and get the most value out of your data:
1. Compute optimization: You don’t need the same compute size at all times. Workload varies depending on day of the week and time of day. You can schedule different warehouse sizes based on how queries are expected to run then, potentially saving a lot of resources (a minimal scheduling sketch appears after this list).
2. Query optimization: A badly written query can consume excess compute or scan far more data than needed. Users can also run the same query more times than needed. These inefficiencies can waste money and slow down potential insights. Educating users on query writing can help avoid unwanted behaviors, and alerting mechanisms can give notice when a query is running longer than expected or when it shouldn’t be running at all.
3. Dataset optimization: Traditional data modeling techniques impact how you optimize your cloud data platform, so you should pay attention to decisions around star schemas, aggregate tables and materialized views. When managing petabytes of data in the cloud, you need a retention strategy in place to move, archive or permanently purge data after a certain period of time, and how you load data is also a consideration. Are you loading data that people aren’t accessing? Are you loading data in near real-time that is accessed only infrequently? Make sure your dataset management matches your actual use case to avoid unnecessary expenditures.
4. Environment optimization: Be vigilant about controlling costs in the development environment, which can add up. Enforce policies such as not exceeding a small compute in lower environments or shutting down compute immediately when not in use.
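As a minimal sketch of the compute-optimization idea in item 1, the snippet below picks a warehouse size from a simple schedule and emits a Snowflake-style resize statement. The schedule, sizes and warehouse name are invented for illustration; verify the exact SQL against your own platform's documentation before using anything like it.

```python
from datetime import datetime

# Invented schedule: business hours on weekdays get a larger warehouse.
def target_size(now: datetime) -> str:
    weekday = now.weekday() < 5            # Mon-Fri
    business_hours = 8 <= now.hour < 18
    if weekday and business_hours:
        return "MEDIUM"
    return "XSMALL"                        # nights and weekends idle down

def resize_statement(warehouse: str, now: datetime) -> str:
    # Snowflake-style DDL; check syntax and valid sizes for your platform.
    return f"ALTER WAREHOUSE {warehouse} SET WAREHOUSE_SIZE = '{target_size(now)}';"

print(resize_statement("REPORTING_WH", datetime(2023, 10, 18, 9, 30)))
# ALTER WAREHOUSE REPORTING_WH SET WAREHOUSE_SIZE = 'MEDIUM';
```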
There are also times when it makes sense to spend for compute. Certain month-end reports need to run fast. You may need extra compute to meet an SLA. Efficiency matters then, too — because you want to make sure those priority processes aren’t slowed down by something that doesn’t need to be running.
Efficiency can mean different things to different companies. For some, it means running at the lowest possible cost. For others, it means making sure that the most important jobs finish when they need to. For most, it means striking the perfect balance between cost and performance. When companies optimize their cloud data architectures for maximum efficiency, they spend less time managing their data and more time managing their business.
Good tools can help make the job easier Centralized tooling adds another layer of accountability for optimization. Tooling enables users to monitor for unexpected spikes in usage, or to be sure that when extra compute comes online for that big month-end report, it then gets shut down as soon as that report finishes running. Tools can provide broad visibility into data usage, identify new usage patterns and even provide recommendations to address problems quickly and proactively. We built Capital One Slingshot to help us optimize our costs, reduce waste and inefficiencies and accelerate time-to-value of Snowflake, while adhering to governance requirements.
Data will always be changing, and there will always be a need to focus on optimizing your data to reach maximum efficiency. When you’re operating efficiently, you’re able to cut through the noise and get insights from your data. In turn, that helps drive business results like hitting on the right pricing strategy faster than anyone else, being prepared to respond to natural disasters without disrupting business continuity, or seeing patterns that help you catch fraud before it becomes a problem. Ultimately, optimizing your data can help improve performance and drive tangible business value.
Salim Syed is VP and Head of Slingshot Engineering at Capital One Software.
"
|
2,536 | 2,022 |
"Is Intel Labs’ brain-inspired AI approach the future of robot learning? | VentureBeat"
|
"https://venturebeat.com/ai/is-intel-labs-brain-inspired-ai-approach-the-future-of-robot-learning"
|
"Is Intel Labs’ brain-inspired AI approach the future of robot learning?
Can computer systems develop to the point where they can think creatively, identify people or items they have never seen before, and adjust accordingly — all while working more efficiently, with less power? Intel Labs is betting on it, with a new hardware and software approach using neuromorphic computing, which, according to a recent blog post, “uses new algorithmic approaches that emulate how the human brain interacts with the world to deliver capabilities closer to human cognition.” While this may sound futuristic, Intel’s neuromorphic computing research is already fostering interesting use cases, including how to add new voice interaction commands to Mercedes-Benz vehicles ; create a robotic hand that delivers medications to patients; or develop chips that recognize hazardous chemicals.
A new approach in the face of capacity limits Machine learning-driven systems, such as autonomous cars, robotics, drones and other self-sufficient technologies, have relied on ever-smaller, more powerful, energy-efficient processing chips. Traditional semiconductors, however, are now reaching their miniaturization and power capacity limits, compelling experts to believe that a new approach to semiconductor design is required.
One intriguing option that has piqued tech companies’ curiosity is neuromorphic computing. According to Gartner , traditional computing technologies based on legacy semiconductor architecture will reach a digital wall by 2025. This will force changes to new paradigms such as neuromorphic computing, which mimics the physics of the human brain and nervous system by utilizing spiking neural networks (SNNs) – that is, the spikes from individual electronic neurons activate other neurons in a cascading chain.
“Neuromorphic computing will enable fast vision and motion planning at low power,” Yulia Sandamirskaya, a research scientist at Intel Labs in Munich, told VentureBeat via email. “These are the key bottlenecks to enable safe and agile robots, capable to direct their actions at objects in dynamic real-world environments.” In addition, neuromorphic computing “expands the space of neural network-based algorithms,” she explained. By co-locating memory and compute in one chip, it allows for energy-efficient processing of signals and enables on-chip continual, lifelong learning.
One size does not fit all in AI computing As the AI space becomes increasingly complex, a one-size-fits-all solution cannot optimally address the unique constraints of each environment across the spectrum of AI computing.
“Neuromorphic computing could offer a compelling alternative to traditional AI accelerators by significantly improving power and data efficiency for more complex AI use cases, spanning data centers to extreme edge applications,” Sandamirskaya said.
Neuromorphic computing is quite similar to how the brain transmits and receives signals from biological neurons that spark or identify movements and sensations in our bodies. However, compared to traditional approaches, where systems orchestrate computation in strict binary terms, neuromorphic chips compute more flexibly and broadly. In addition, by constantly re-mapping neural networks, the SNNs replicate natural learning, allowing the neuromorphic architecture to make decisions in response to learned patterns over time.
These asynchronous, event-based SNNs enable neuromorphic computers to achieve orders of magnitude power and performance advantages over traditional designs. Sandamirskaya explained that neuromorphic computing will be especially advantageous for applications that must operate under power and latency constraints and adapt in real time to unforeseen circumstances.
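To make the spiking mechanism concrete, here is a tiny leaky integrate-and-fire simulation in plain Python. It only illustrates the cascading, event-driven behavior described above; it is not Loihi code, and the weights and thresholds are arbitrary.

```python
import numpy as np

def simulate_snn(weights, input_spikes, steps=5, leak=0.9, threshold=1.0):
    """Tiny leaky integrate-and-fire network: a spike in one neuron injects weighted
    current into downstream neurons, which may spike in turn (a cascading chain)."""
    n = weights.shape[0]
    v = np.zeros(n)                                   # membrane potentials
    spike_log = []
    for t in range(steps):
        v = leak * v + input_spikes.get(t, np.zeros(n))
        fired = v >= threshold                        # event-driven: only firing neurons propagate
        spike_log.append(np.where(fired)[0].tolist())
        v[fired] = 0.0                                # reset neurons that just fired
        v = v + weights.T @ fired.astype(float)       # cascade spikes to connected neurons
    return spike_log

# Three neurons in a chain: stimulate neuron 0 and watch the activity cascade.
w = np.array([[0, 1.2, 0], [0, 0, 1.2], [0, 0, 0]], dtype=float)
print(simulate_snn(w, {0: np.array([1.5, 0.0, 0.0])}))  # -> [[0], [1], [2], [], []]
```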
A study by Emergen Research predicts that the worldwide neuromorphic processing industry will reach $11.29 billion by 2027.
Intel’s real-time learning solution Neuromorphic computing will be especially advantageous for applications that must operate under power and latency constraints and must adapt in real-time to unforeseen circumstances, said Sandamirskaya.
One particular challenge is that intelligent robots require object recognition to substantially comprehend working environments. Intel Labs’ new neuromorphic computing approach to neural network-based object learning — in partnership with the Italian Institute of Technology and the Technical University of Munich — is aimed at future applications like robotic assistants interacting with unconstrained environments, including those used in logistics, healthcare, or elderly care.
In a simulated setup, a robot actively senses objects by moving its eyes through an event-based camera or dynamic vision sensor. The events collected are used to drive a spiking neural network (SNN) on Intel’s neuromorphic research chip, called Loihi. If an object or view is new to the model, its SNN representation is either learned or modified. The network recognizes the object and provides feedback to the user, if the object is known. This neuromorphic computing technology allows robots to continuously learn about every nuance in their environment.
Intel and its collaborators successfully demonstrated continual interactive learning on the Loihi neuromorphic research chip, measuring about 175 times lower energy use to learn a new object instance, with similar or better speed and accuracy compared to conventional methods running on a central processing unit (CPU).
Computation is more energy-efficient Sandamirskaya said computation is more energy efficient because it uses clockless, asynchronous circuits that naturally exploit sparse, event-driven analysis.
“Loihi is the most versatile neuromorphic computing platform that can be used to explore many different types of novel bio-inspired neural-network algorithms,” she said, including deep learning to attractor networks, optimization, or search algorithms, sparse coding, or symbolic vector architectures.
Loihi’s power efficiency also shows promise for making assistive technologies more valuable and effective in real-world situations. Since Loihi is up to 1,000 times more energy efficient than general-purpose processors, a Loihi-based device could require less frequent charging, making it ideal for use in daily life.
Intel Labs’ work contributes to neuronal network-based machine learning for robots with a small power footprint and interactive learning capability. According to Intel, such research is a crucial step in improving the capabilities of future assistive or manufacturing robots.
“On-chip learning will enable ongoing self-calibration of future robotic systems, which will be soft and thus less rigid and stable, as well as fast learning on the job or in an interactive training session with the user,” Sandamirskaya said.
Intel Labs: The future is bright for neuromorphic computing Neuromorphic computing isn’t yet available as a commercially viable technology.
While Sandamirskaya says the neuromorphic computing movement is “gaining steam at an amazing pace,” commercial applications will require improvement of neuromorphic hardware in response to application and algorithmic research — as well as the development of a common cross-platform software framework and deep collaborations across industry, academia and governments.
Still, she is hopeful about the future of neuromorphic computing.
“We’re incredibly excited to see how neuromorphic computing could offer a compelling alternative to traditional AI accelerators,” she said, “by significantly improving power and data efficiency for more complex AI use cases spanning data center to extreme edge applications.”
"
|
2,537 | 2,019 |
"Explorium raises $19 million to unify AI model training and deployment | VentureBeat"
|
"https://venturebeat.com/ai/explorium-raises-19-million-to-unify-ai-model-training-and-deployment"
|
"Explorium raises $19 million to unify AI model training and deployment
Explorium, a Tel Aviv-based startup developing what it describes as an automated data and feature discovery platform, today announced that it’s raised $19 million total across several funding rounds. Emerge and F2 Capital contributed $3.6 million in a seed raise, and Zeev Ventures led a $15.5 million series A.
The influx of capital comes after a banner year for Explorium, during which it says it nabbed Fortune 100 customers in industries ranging from financial services to consumer packaged goods, retail, and ecommerce. “We are doing for machine learning data what search engines did for the web,” said Explorium CEO Maor Shlomo, who together with cofounders Or Tamir and Omer Har previously led large-scale data mining and organization platforms for IronSource, Natural Intelligence, and the Israel Defense Forces’ 8200 intelligence unit.
Explorium’s platform acts like a repository for all of an organization’s information, connecting siloed internal data to thousands of external sources on the fly. Using machine learning, it automatically extracts, engineers, aggregates, and integrates the most relevant features from data to power sophisticated predictive algorithms, evaluating hundreds before scoring, ranking, and deploying the top performers.
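Explorium’s pipeline is proprietary, so the sketch below only illustrates the general idea of scoring and ranking candidate features by predictive value. The use of scikit-learn’s mutual information estimator and the synthetic features are assumptions for illustration, not the company’s actual method.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)
n = 500
# Candidate features: one genuinely predictive signal plus two noise columns.
signal = rng.normal(size=n)
X = np.column_stack([signal, rng.normal(size=n), rng.normal(size=n)])
y = (signal + 0.3 * rng.normal(size=n) > 0).astype(int)   # target driven by the signal

# Score each candidate feature against the target, then rank the top performers.
scores = mutual_info_classif(X, y, random_state=0)
ranking = sorted(zip(["external_signal", "noise_a", "noise_b"], scores),
                 key=lambda kv: kv[1], reverse=True)
for name, score in ranking:
    print(f"{name}: {score:.3f}")
```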
Lenders and insurers can use Explorium to discover predictive variables from thousands of data sources, Shlomo explains, while retailers can tap it to forecast which customers are likely to buy each product. “Just as a search engine scours the web and pulls in the most relevant answers for your need, Explorium scours data sources inside and outside your organization to generate the features that drive accurate models,” he added.
Within Explorium, data scientists can add custom code to incorporate domain knowledge and fine-tune AI models. Additionally, they’re afforded access to tools designed to help uncover optimization-informing patterns from large corpora.
“Explorium’s vision of empowering data scientists by finding relevant data from every possible source in scale and thus making models more robust is creating a paradigm shift in data science,” said Emerge founding partner Dovi Ollech. “Working with the team from the very early days made it clear that they have the deep expertise and ability required to deliver such a revolutionary data science platform.” Explorium joins a raft of other startups and incumbents in the burgeoning “auto ML” segment. Databricks just last month launched a toolkit for model building and deployment, which can automate things like hyperparameter tuning, batch prediction, and model search. IBM’s Watson Studio AutoAI — which debuted in June — promises to automate enterprise AI model development, as does Microsoft’s recently enhanced Azure Machine Learning cloud service and Google’s AutoML suite.
IDC predicts that worldwide spending on cognitive and AI systems will reach $77.6 billion in 2022, up from $24 billion in revenue last year. Gartner agrees: In a recent survey of thousands of business executives worldwide, it found that AI implementation grew a whopping 270% in the past four years and 37% in the past year alone.
"
|
2,538 | 2,022 |
"Report: 55% of crypto developers say bear market has increased their desire to build | VentureBeat"
|
"https://venturebeat.com/virtual/report-55-crypto-developers-bear-market-increased-desire-build"
|
"Report: 55% of crypto developers say bear market has increased their desire to build
Crypto winters truly are for building. That’s according to a survey of crypto developers conducted by Hiro, a Web3 developer toolkit.
The H2 2022 Developer Survey confirms that when speculative activity around crypto cools off, it’s time to get to work. More than half of respondents (55%) said that the bear market has increased their desire to work in Web3 , with most focused on NFTs and DAOs.
Many developers want to see innovation thrive on Bitcoin, with 38% of respondents citing it as their primary motivation to develop on Stacks, a programmable layer on top of Bitcoin.
What’s the most important feature of a blockchain ? According to 60% of the respondents, decentralization is the key feature of the blockchain trilemma of security, decentralization and scalability. Another interesting finding from the survey is that 59% of Stacks developers consider the term “Bitcoin maximalism” a neutral one.
The going gets easier for crypto developers Stacks developers also reported in the Q3 2022 survey that every stage of development has gotten easier since Q1, especially deploying contracts to the network, integrating contracts into applications, and maintaining and monitoring apps and contracts.
While Hiro’s developer community is focused on Stacks, they’re also involved with other blockchain ecosystems. Ethereum, which recently switched to proof-of-stake after the Merge, remains a clear favorite. Cosmos and Lightning saw the most growth among the Stacks developer community, with 85% and 64% increases in activity respectively. Other chains have lost ground among the respondents, with Polygon, Binance Smart Chain, Avalanche and Polkadot seeing drops in popularity.
Methodology Developers from more than 50 countries responded to the survey, with the largest concentrations from Asia-Pacific (34.5%) and North America (30.3%). The survey respondents were mostly (85%) men. Three-quarters were age 25 to 44. A third are employed full-time, part-time or are self-employed in Web3.
Read the full report from Hiro.
"
|
2,539 | 2,012 |
"Q&A site Quora releases its highly anticipated Android app | VentureBeat"
|
"https://venturebeat.com/social/quora-android"
|
"Q&A site Quora releases its highly anticipated Android app
The question of when Quora will finally release its Android app has finally been answered. It’s today.
Quora for Android landed in Google’s Play store and the Amazon appstore.
The new Android app allows you to do almost everything you can on the site (such as answer questions, comment, or rate answers), but it’s tailored for the reading experience on mobile. On the new app, you’ll see the top answers pop up on your home screen, and you can flip into landscape mode to browse content in the way that’s comfortable for you — a particularly cool feature for Android tablet users.
It’s a major step for the company, which sees about a quarter of its traffic coming through mobile devices. As you can see from this Quora post, users have been asking for the app for quite some time. As product designer Anne K. Halsall explained in this thread, “Ramping up development takes time,” adding, “We would like to iterate and learn from the iPhone app first.” The app is the handiwork of the startup’s new mobile team, which has been plugging away for the better part of the summer.
Quora, a Silicon Valley-based knowledge-sharing site, is the brainchild of two high-ranking Facebookers: the social network’s first chief technology officer, Adam D’Angelo, and Charlie Cheever, who oversaw the creation of Facebook Connect. In May, it raised $50 million in a round led by Facebook board member Peter Thiel.
Photo credit: milos milosevic/Flickr
"
|
2,540 | 2,022 |
"Why API security is a fast-growing threat to data-driven enterprises | VentureBeat"
|
"https://venturebeat.com/security/why-api-security-is-a-fast-growing-threat-to-data-driven-enterprises"
|
"Why API security is a fast-growing threat to data-driven enterprises
As data-driven enterprises rely heavily on their software application architecture, application programming interfaces (APIs) occupy a significant position. APIs have revolutionized the way web applications are used, as they aid communication pipelines between multiple services. Developers can integrate any modern technology with their architecture by using APIs, which is highly useful for adding features that a customer needs.
By nature, APIs are vulnerable to exposing application logic and sensitive data such as personally identifiable information (PII), which makes them an easy target for attackers. Often available over public networks (accessible from anywhere), APIs are typically well-documented and can be quickly reverse-engineered by malicious actors. They are also susceptible to distributed denial-of-service (DDoS) incidents.
The most significant data leaks are due to faulty, vulnerable or hacked APIs, which can reveal medical, financial and personal data to the general public. In addition, various attacks can occur if an API is not secured correctly, making API security a vital aspect for data-driven businesses today.
Why API security is essential API development has astronomically increased over the past few years, fueled by digital transformation and its central role in mobile apps and IoT development. Such growth and a variety of possible attacks make API security highly essential.
As microservices and serverless architectures have become more widespread, attacks include bypassing the client-side application to disrupt the functioning of an application for other users or to breach private information. Furthermore, broken, exposed or hacked APIs can also lead to breaches of the backend system.
In its API Security and Management report [subscription required], Gartner predicts that by 2023, API abuses will move from infrequent to the most frequent attack vector, resulting in data breaches for enterprise web applications, and by 2025, more than 50% of data theft will be due to unsecure APIs.
“At Gartner, we regularly speak with organizations which have suffered breaches of their APIs,” Mark O’Neill, VP analyst at Gartner, told VentureBeat. “APIs are particularly vulnerable because many security teams are less skilled in API protection. This is particularly concerning for newer API types such as GraphQL.” Given the critical role they play in digital transformation and the access to sensitive data and systems they provide, APIs now demand a dedicated approach to security and compliance.
API security vs. application security API security focuses on securing the application layer and addressing what can happen if a malicious hacker interacts with the API directly. API security also involves implementing strategies and procedures to mitigate vulnerabilities and security threats.
When sensitive data is transferred through API, a protected API can guarantee the message’s secrecy by making it available to apps, users and servers with appropriate permissions. It also ensures content integrity by verifying that the information was not altered after delivery.
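One common way to verify that a payload was not altered after delivery is to attach an HMAC signature that the receiver recomputes. The sketch below uses only Python’s standard library; the shared secret and payload fields are placeholders.

```python
import hashlib
import hmac
import json

SECRET = b"rotate-me"  # placeholder shared secret; store and rotate real keys in a vault

def sign(payload: dict) -> str:
    body = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(SECRET, body, hashlib.sha256).hexdigest()

def verify(payload: dict, signature: str) -> bool:
    # Constant-time comparison avoids leaking information through timing.
    return hmac.compare_digest(sign(payload), signature)

msg = {"account": "12345", "amount": 250}
sig = sign(msg)
print(verify(msg, sig))                        # True: content intact
print(verify({**msg, "amount": 9999}, sig))    # False: content altered in transit
```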
“Any organization looking forward to digital transformation must leverage APIs to decentralize applications and simultaneously provide integrated services. Therefore, API security should be one of the key focus areas,” said Muralidharan Palanisamy, chief solutions officer at AppViewX.
Talking about how API security differs from general application security, Palanisamy said that application security is similar to securing the main door, which needs robust controls to prevent intruders. At the same time, API security is all about securing windows and the backyard.
“A weak point in such areas will affect the application. API security, in essence, is a subset of the complete application security without which the application as a whole cannot be secured,” he said.
Erez Yalon, VP of security research at Checkmarx , says that API security is not different from traditional appsec, but it adds more areas that organizations need to pay attention to.
“API-centric architecture has more endpoints that a potential attacker can try to abuse; we call this ‘growth of attack surface,’” he said. “In addition, the way that data is transferred and shared through APIs makes it easy to unintentionally expose sensitive data to prying eyes.” Yalon said that APIs could be made more secure when security is considered from the first step and the first line of code written, instead of added as an additional layer later in the game.
“Every API endpoint needs to be documented, and organizations must have clear guidelines on deprecating old and unused APIs. Making sure an updated SBOM [software bill of materials] exists makes it simpler,” said Yalon.
Critical API vulnerabilities and attacks APIs have quickly established themselves as the preferred method of building modern applications, especially for mobile devices and the internet of things (IoT). However, in the face of constantly changing application-development methods and pressures for innovation, some companies have yet to fully grasp the potential risks associated with making their APIs available to the public. Before public deployment, businesses must be wary of these common security mistakes (a minimal authentication check for the first of them is sketched after this list): Authentication flaws: Many APIs fail to properly verify the authentication status of a request or whether it comes from a genuine user. An attacker can exploit such deficiencies to replicate API requests in various ways, including session hijacking and account aggregation.
Lack of encryption: Many APIs lack robust encryption layers between the API client and server. Due to such flaws, attackers can intercept unencrypted or poorly protected API transactions, steal sensitive data or alter the transaction data.
Flawed endpoint security: As most IoT devices and microservice tools are designed to communicate with the server through an API channel, hackers attempt to gain control over them through IoT endpoints. Doing so can often resequence the API order, resulting in a data breach.
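As a minimal illustration of the authentication point above, the sketch below rejects any request that does not carry a valid bearer token. The in-memory token store and header handling are hypothetical simplifications; a real service would validate signed tokens against an identity provider.

```python
import hmac
import secrets

# Hypothetical in-memory token store; production systems would verify signed tokens
# (for example JWTs) against an identity provider rather than a hard-coded map.
ACTIVE_TOKENS = {"alice": secrets.token_hex(16)}

def authenticate(headers: dict) -> str:
    """Reject any request whose bearer token is missing or does not match a known user."""
    scheme, _, token = headers.get("Authorization", "").partition(" ")
    if scheme != "Bearer" or not token:
        raise PermissionError("missing or malformed bearer token")
    for user, expected in ACTIVE_TOKENS.items():
        if hmac.compare_digest(token, expected):
            return user
    raise PermissionError("invalid or expired token")

# Example: a request carrying Alice's token is accepted; anything else is rejected.
print(authenticate({"Authorization": f"Bearer {ACTIVE_TOKENS['alice']}"}))
```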
Current challenges in API security According to Yannick Bedard, head of penetration testing at IBM Security X-Force Red, one of the current challenges in API security is testing APIs for safety, as intended logic flows may be challenging to understand and test for if not clearly defined.
“In a web application, these logical flows are intuitive through the use of the web UI, but in an API, it can be more difficult to detail these workflows,” Bedard told VentureBeat. “This can lead to security testing missing vulnerabilities that may, in turn, be exploited by attackers.” Bedard said that as pipelining of APIs becomes more and more complex, questions often arise about which service is responsible for what aspect of security and at what point the data is considered “clean.” “It is common for services to inherently trust data coming from other APIs as clean, only for it to turn out to not be properly sanitized,” he said.
Bedard said that an example of this was the initial discovery of the Log4j vulnerability, where most companies focused primarily on what they had directly internet-facing.
“Malicious data would eventually flow to backend APIs, sometimes behind many other services. These APIs would, in turn, be vulnerable and could provide the attacker an initial foothold into the organization,” he said.
“The top challenge is discovery, as many security teams just aren’t sure how many APIs they have,” said Sandy Carielli, principal analyst at Forrester.
Carielli said that many teams unknowingly deploy rogue APIs or there may be unmaintained APIs that are still publicly accessible, which can lead to several security hazards.
“API specifications could be outdated, and you can’t protect what you don’t know you have,” she said. “Start by understanding what controls you already have in your environment to secure APIs, and then identify and address the gaps. Critically, make sure to address API discovery and inventory.” Best practices to enhance API security The strength of API security depends entirely upon how one’s data architecture enforces authentication and authorization policies. Thanks to technological advances like cloud services, API gateways and integration platforms now allow API providers to secure their APIs in unique ways. The technology stack on which you choose to build your APIs affects how you secure them.
Several approaches may be used to effectively defend your system against API intruders: API gateway: An API gateway is the foundation of an API security framework since it makes it simple to develop, maintain, monitor and secure APIs. The API gateway can defend against various threats and provide API monitoring, logging and rate limitation. It can also automate security token validation and traffic restriction based on IP addresses and other data.
Web application firewalls: A web application firewall, or WAF, acts as a middle layer between public traffic and the API gateway or application. WAFs can offer additional protection against threat actors, such as bots, by providing malicious bot detection, the ability to identify attack signatures, and additional IP intelligence. WAFs can be beneficial for blocking bad traffic before it even reaches your gateway.
Security applications: Standalone security products that support features such as real-time protection, static code and vulnerability scanning, build-time checking, and security fuzzing can also be incorporated within the security architecture.
Security in code: Security code is a form of protection implemented directly within the API or application itself (a small example is sketched after this list). However, ensuring that all the necessary security measures are implemented correctly in your API code, and applied consistently across your entire API portfolio, can require significant resources.
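As one small example of security in code, the sketch below enforces a per-client rate limit inside the application layer itself. The limits are illustrative, and a production deployment would typically keep the counters in shared storage rather than in-process memory.

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS, MAX_REQUESTS = 60, 100        # illustrative limits; tune per endpoint
_history = defaultdict(deque)

def allow_request(client_id, now=None):
    """Sliding-window rate limit enforced directly in application code."""
    now = time.monotonic() if now is None else now
    window = _history[client_id]
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()                      # drop requests that fell out of the window
    if len(window) >= MAX_REQUESTS:
        return False                          # over the limit: reject or throttle the call
    window.append(now)
    return True

# Example: the 101st request inside one minute is rejected.
print(all(allow_request("client-a", now=t * 0.1) for t in range(100)))   # True
print(allow_request("client-a", now=10.1))                               # False
```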
The future of API security Roy Liebermann, head of customer success at Surf Security , believes that zero trust can be another alternative to defend against internal and external threats.
“When it comes to APIs, zero trust is relevant for both clients and servers,” he said. “An API-driven application can have an enormous number of microservices, making it difficult for security leaders to track their development and security impact. Adopting zero-trust principles ensures that each microservice communicates with the least privilege, preventing the use of open ports and enabling authentication and authorization across each API.” Liebermann recommends that CISOs extend zero trust to APIs to reduce the risk of hackers exploiting API communication to steal data.
Likewise, Palanisamy says that as zero-trust security and zero-trust architectures gain momentum, API security will be one of the main focus areas, especially with SaaS and other cloud services used today.
“The key is to look at this with an enterprise-wide approach. API security cannot be solved by just focusing on a few applications,” he said.
“We’re most likely going to see a different software paradigm shift in the next five years that combines features from REST and SOAP security. I believe there will be a software development paradigm where features from each method are used to create a combined superior method,” Nabil Hannan, managing director at NetSPI , told VentureBeat. “This combination will take security out of the hands of the developers and allow for better ‘secure by design’ adoption.” Hannan said that the concept of identity and authentication is changing, and we need to move away from usernames and passwords and two-factor authentication, which relies on humans not making any errors.
“The authentication workflow will shift to what companies like Apple are doing around identity management with innovations like the iOS16 keychain. This will be developed through APIs in the near future,” he said.
"
|
2,541 | 2,022 |
"Tidelift raises $27M to secure open-source supply chain | VentureBeat"
|
"https://venturebeat.com/security/tidelift-open-source-supply-chain"
|
"Tidelift raises $27M to secure open-source supply chain
Today, open-source supply chain security provider Tidelift announced it has raised $27 million as part of a series C funding round led by Dorilton Ventures. The funding will enable the organization to help mitigate health and security issues in open-source software.
Tidelift’s open-source management solution, the Tidelift subscription, provides enterprises with a tool to create, track and manage catalogs of approved open-source components so they can avoid using insecure components in their environments.
The organization also partners with the maintainers of thousands of open-source projects to evaluate the security of components, and gather advice on vulnerabilities.
It’s an approach designed to enable application development teams to quickly identify secure open-source tools while avoiding implementing any vulnerabilities in the environment that unscrupulous attackers could exploit.
Cracking down on open-source vulnerabilities The announcement comes amid an industry-wide crackdown on open-source threats, with the White House Open Source Security Summit II taking place earlier this month, and companies including Amazon, Meta, Google, Microsoft, Ericsson, Red Hat and Oracle pledging $10 million annually to help improve open-source security.
Tidelift is one of the providers in the community playing a direct role in securing the open-source supply chain, partnering with the maintainers of open-source projects, and paying them to improve the health and security of their solutions, while providing development teams with a solution for adding new components into the workflow.
“We help developers move fast by streamlining the development process to remove obstacles that slow down application development. Development teams can improve decision making with contextually relevant, maintainer-originated data made available directly in the software development lifecycle,” said cofounder and CEO of Tidelift, Donald Fischer.
“They can also create a catalog of prevetted, approved open-source components that reduces duplicative work and accelerates development,” Fischer said.
The providers addressing open-source supply chain security Tidelift’s investment also coincides with the wider growth of the global security and vulnerability management market, which researchers project will grow from $13.8 billion in 2021 to $18.7 billion by 2026, as more organizations look to secure their environments and the software supply chain against threat actors.
The organization is competing against a range of providers including FOSSA , which raised $23.2 million in funding as part of a series B funding round in 2020, and provides an open-source management platform with zero-configuration scanning for application vulnerabilities, end-to-end third-party code management, and license compliance.
Another key competitor is Snyk , a solution that can automatically identify and remediate vulnerabilities in code, dependencies or containers with security intelligence.
Snyk most recently raised $530 million and achieved an $8.5 billion valuation in September last year, making it one of the biggest providers focusing on securing the software supply chain.
However, one of the key differentiators of Tidelift as a solution in the market is the organization’s partnership with the maintainers of open-source projects.
“We partner with them to ensure projects are enterprise-ready, meeting clearly defined security, licensing and maintenance standards. And we pay them for the additional value they create by maintaining their projects to enterprise standards,” Fischer said.
"
|
2,542 | 2,023 |
"Shift left critical to app security; Build38 raises €13M for trust development kit | VentureBeat"
|
"https://venturebeat.com/security/shifting-left-is-critical-to-application-security-build38-raises-e13m"
|
"Shift left critical to app security; Build38 raises €13M for trust development kit
A photo of the Build38 team
Apps are perhaps the weakest link in enterprise security. The era of cloud computing and hybrid work has created an environment where apps are a core target for attackers.
Research has found that 42% of organizations have experienced a security incident related to unpatched mobile apps or devices.
However, more and more providers are emerging to harden app defenses against modern threat actors. One is mobile security and app monitoring provider Build38 , which today announced it has raised €13 million as part of a Series A funding round led by Tikehau Capital.
One of the organization’s main solutions, the Trusted Application Kit (TAK), is a software development kit designed to integrate with Android and iOS apps during the development phase to embed threat detection capabilities into the app.
This funding suggests that shifting security left and embedding in-app protections early in the development process could hold the key to reducing the chance of threat actors exploiting end-user devices at the network’s edge.
Security by design is the way forward Build38’s announcement comes not just as cloud adoption continues to grow, but as more organizations confront the reality of threats that target not just endpoints at the network’s perimeter, but also users’ personal mobile devices in remote working environments.
“In the cloud era, organizations are now aware of and vulnerable to attacks and threats originating from anywhere,” said Christian Schläger, Build38’s CEO. “Whether you’re a car manufacturer in China, a traditional commercial bank in Ghana or a fintech company in the UK, all companies are facing similar threats from attackers taking advantage of the fact that the ‘perimeter’ to penetrate an organization’s network no longer ends where the service provider decides.” He continued: “Attackers can now reach any individual user right in their ‘pocket,’ simply from downloading an app from an app store.” Build38’s in-app protection mitigates these threats by using AI to identify modifications, reverse engineering and code manipulation within the app. This can complement existing controls offered by Google Play and the App Store.
TAK also has the ability to generate events and forward the data to an external SIEM when an app acts suspiciously so that security teams can identify potential breach attempts.
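Build38 has not published its SDK internals, so the sketch below is only a generic illustration of forwarding a tamper signal to a SIEM’s HTTP collector. The endpoint URL, field names and detection signal are placeholders, not part of the TAK product.

```python
import json
import time
import urllib.request

SIEM_ENDPOINT = "https://siem.example.com/collect"   # placeholder collector URL

def report_tamper_event(app_id: str, signal: str, details: dict) -> None:
    """Forward a suspicious in-app signal (e.g. a debugger attached or a repackaged
    binary) to an external SIEM so the security team can investigate."""
    event = {
        "app_id": app_id,
        "signal": signal,
        "details": details,
        "timestamp": int(time.time()),
    }
    req = urllib.request.Request(
        SIEM_ENDPOINT,
        data=json.dumps(event).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req, timeout=5)   # fire-and-forget for brevity

# Example (not executed here):
# report_tamper_event("com.example.bank", "code_integrity_failure", {"checksum": "mismatch"})
```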
The mobile security market Build38’s tool falls within the mobile security market , which researchers valued at $3.3 billion in 2020 and project will reach $22.1 billion by 2030. This represents a compound annual growth rate (CAGR) of 21.1%.
One of Build38’s main competitors in the space is mobile endpoint and application security provider Zimperium , which Liberty Strategic Capital acquired last year for $525 million.
Zimperium’s tool provides run-time threat visibility, security, and compliance scanning, and can identify when a hacker attempts to tamper with an app. It also integrates with external UEM and XDR platforms.
AppDome is another competitor, designed to secure Android and iOS apps with runtime protection, anti-debugging and anti-tampering capabilities.
The main difference between Build38 and these competitors, Schläger says, is that it combines “a client-based solution (which makes the app ‘self defending’) with a backend component for observability, allowing organizations to monitor and fight against threats in real time, and provides threat reporting for their teams.”
"
|
2,543 | 2,022 |
"Report: Devops teams have higher satisfaction, less burnout with positive security practices | VentureBeat"
|
"https://venturebeat.com/security/report-devops-teams-have-higher-satisfaction-less-burnout-with-positive-security-practices"
|
"Report: Devops teams have higher satisfaction, less burnout with positive security practices
For the last eight years, Google Cloud and DORA have produced the Accelerate State of DevOps report, hearing from 33,000 professionals along the way. The research focuses on examining how certain capabilities and practices predict the outcomes that we consider central to devops: software delivery performance, operational performance and organizational performance. It also focuses on the factors that underlie other outcomes like burnout and satisfaction with one’s team.
In 2021, Google Cloud found that securing the software supply chain is essential to reaching many important outcomes. With this year’s report, the research dug deeper on software supply chain security, making it a primary theme of the survey and report.
Overall, the report found that the biggest predictor of an organization’s application-development security practices was cultural, not technical: high-trust, low-blame cultures focused on performance were significantly more likely to adopt emerging security practices than low-trust, high-blame cultures focused on power or rules.
Another key finding of the report is that cloud usage is predictive of organizational performance. Companies with software initially built on and for the cloud tend to have higher organizational performance. Those who use multiple public clouds are 1.4x more likely to have above-average organizational performance than those who don’t.
It also found early evidence suggesting that security scanning is effective at finding vulnerable dependencies, resulting in fewer vulnerabilities in production code.
With these findings in mind, the report concludes that the adoption of good application-development security practices was also correlated with additional benefits.
Devops teams that focus on establishing these security practices have reduced developer burnout; teams with low levels of security practices have 1.4x greater odds of having high levels of burnout than teams with high levels of security.
Teams that focus on establishing security practices are significantly more likely to recommend their team to someone else.
Supply-chain Levels for Software Artifacts (SLSA)-related security practices positively predict both organizational performance and software delivery performance, but this effect needs strong continuous integration capabilities in place to fully emerge.
Methodology The target population for this survey was practitioners and leaders working in, or closely with, technology and transformations, especially those familiar with devops. The survey was promoted via email lists, online promotions, an online panel, social media, and by asking people to share the survey with their networks (that is, snowball sampling).
Read the full report from Google Cloud.
"
|
2,544 | 2,022 |
"Report: 62% of SREs and devops specialists say their biggest challenge is unclear ownership boundaries | VentureBeat"
|
"https://venturebeat.com/programming-development/report-62-of-sres-and-devops-specialists-say-their-biggest-challenge-is-unclear-ownership-boundaries"
|
"Report: 62% of SREs and devops specialists say their biggest challenge is unclear ownership boundaries
When an application crashes or a page doesn’t load, who is on the hook for fixes? In many organizations, the responsibility is often put on operations or reliability teams — but is that how it should be? Increasingly, modern organizations are moving towards a decentralized culture in their engineering teams, a “you build it, you own it” mentality. Newly released data from Sentry and SlashData reveals this is the preferred path forward from site reliability engineers (SREs) and ops professionals. Of the 140 SREs and devops practitioners surveyed, 62% say they want clearer delineation of application and infrastructure ownership following code deployment.
And why is that? A whopping 74% said the one activity they spend too much time on is chasing down others to resolve application issues. Conversely, 64% said their organization should spend more time analyzing and fixing infrastructure vulnerabilities.
The stakes are high: Loss of customers, reduced productivity It’s vital that organizations get this right, as taking reactive measures to address application errors can become wildly costly to the bottom line. Customers are also quick to leave for competing companies when they are inconvenienced by recurring issues.
Almost half of the SRE/devops practitioners surveyed in Sentry’s Infrastructure vs. Applications Report indicated that application issues result in more time spent on customer service, which significantly impacts SRE/devops teams’ productivity. From a resourcing perspective, 30% of SRE/devops practitioners estimate that their teams lose more than one person-month in productivity per year fixing application issues.
For modern organizations to operate even more efficiently, decentralization is the way to go. While 52% of respondents are currently working in a decentralized environment, Sentry discovered that more than half of those surveyed prefer each department to have more autonomy and control over their environments, and the tooling that’s right for their team.
Methodology Sentry and SlashData surveyed 140 site reliability engineers (SREs) and devops practitioners at organizations with more than 50 employees and live software applications in production.
Read the full report from Sentry.
"
|
2,545 | 2,023 |
"Highlight launches full-stack application monitoring platform, raises $8M | VentureBeat"
|
"https://venturebeat.com/programming-development/highlight-launches-full-stack-application-monitoring-platform-raises-8m"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Highlight launches full-stack application monitoring platform, raises $8M Share on Facebook Share on X Share on LinkedIn San Francisco-based Highlight , a startup looking to streamline web application observability for enterprises, today announced the public launch of its flagship product: An open-source monitoring platform that provides teams with a comprehensive look into the entire application stack and the problems associated with it in one place. The company also said it has raised $8 million in seed funding.
The investment has been led by Afore Capital and Craft Ventures, with participation from Y Combinator (W23), Neo, Day One Ventures, Worklife Ventures, Fuel Capital and prominent angels including Siqi Chen, Scott Banister, Sahil Bloom, Jordan Segall and Calvin French.
Highlight will invest this capital to build out its product’s functionality and compatibility and to drive community engagement initiatives.
“Our open-source community allows us to unite full-stack development with full-stack observability, empowering engineers to deliver top-notch user experiences with unmatched efficiency,” Jay Khatri, company cofounder and CEO, said in a statement.
Highlight’s front-end and back-end monitoring With the rise of digital-savvy consumers, businesses have become bullish on streamlining their development efforts. However, in today’s dynamic environment, just building a high-quality application is not enough — you also have to maintain and improve it from time to time. This is where the idea of monitoring comes in.
In the last few years, a number of monitoring platforms like New Relic and Datadog have come to the fore. Yet, most of these continue to be limited in scope , focusing on either user-side issues or back-end errors. This forces engineers to manually piece together front-end and back-end data to identify root causes and slows down the entire process of resolution.
Khatri, who previously worked at Google DeepMind, and Vadim Korolik, who was a technical staff member at Pure Storage, witnessed these challenges firsthand in their professional careers and decided to fix the problem by starting Highlight in 2022.
The company provides enterprises with a unified platform that gives a comprehensive view of the entire application stack, covering both front-end and back-end issues (if any). It does this by stitching together three key capabilities: full-stack error monitoring, front-end session replay and logging.
Error monitoring issues real-time alerts about glitches before they escalate, session replay gives developers insight into why the bugs are occurring in their web applications, while logging allows for error search and tracking across the stack. When all three come together, developers can quickly identify and resolve issues — without going through the tiring step of manual collation of data.
“Highlight is built on the OpenTelemetry (OTel) observability framework, making it incredibly easy to integrate,” Khatri told VentureBeat. “Enterprises can either opt for using Highlight’s off-the-shelf SDKs or choose to send OTel directly to Highlight. The adaptability ensures that regardless of their existing systems, integrating Highlight is smooth and hassle-free.” Before the public launch, the company open-sourced the codebase for the platform and saw significant traction, with more than 5,000 GitHub stars and more than 5,000 developers. Early customers of the company included large enterprises and high-growth startups Beeper , Aurora , Focal , WeRecover , FlowClub , Influexer , Puzzle , Cabal , HotPlate , CommandBar , Potion and Levro.
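To make the OpenTelemetry-based integration described above concrete, here is a minimal Python sketch that sets up OTLP trace export. The collector endpoint, service name and span attributes are illustrative assumptions, not Highlight's documented configuration.

```python
# Minimal OpenTelemetry tracing sketch (endpoint and names are assumptions).
# Requires: pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Hypothetical collector endpoint: substitute whatever OTLP endpoint your
# observability backend actually exposes.
exporter = OTLPSpanExporter(endpoint="https://example-otel-collector.invalid/v1/traces")

provider = TracerProvider(
    resource=Resource.create({"service.name": "checkout-service"})  # hypothetical service name
)
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)

def handle_checkout(order_id: str) -> None:
    # Each request is wrapped in a span so latency and errors can be
    # inspected alongside logs and session replays in the backend.
    with tracer.start_as_current_span("handle_checkout") as span:
        span.set_attribute("order.id", order_id)
        # ... application logic goes here ...

handle_checkout("order-123")
```

The idea is that traces emitted this way can then be correlated with the session replays and logs described above.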
“Our customers save endless hours reproducing issues by being able to visualize their observability data in Highlight and we’ve already had several open-source contributors build out full SDKs for our product,” Khatri added.
Road ahead Following the latest round of funding, Khatri and team plan to build out Highlight by expanding its functionality and compatibility across different areas.
“First, we aim to expand our support for more environments and languages across our SDKs,” he said. “Secondly, we have several products on the horizon, including a tracing and metrics product. And lastly, we’re investing in community engagement, ensuring our open-source roots remain robust and thriving.” According to MarketsAndMarkets , the market for observability tools and platforms is expected to grow from $2.4 billion in 2023 to $4.1 billion by 2028, with a CAGR of nearly 12%.
"
|
2546 | 2023 |
"From concept to reality: Unraveling the ideal features of an event-driven microservice platform | VentureBeat"
|
"https://venturebeat.com/programming-development/from-concept-to-reality-unraveling-the-ideal-features-of-an-event-driven-microservice-platform"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Sponsored From concept to reality: Unraveling the ideal features of an event-driven microservice platform Share on Facebook Share on X Share on LinkedIn Presented by Lightbend In today’s dynamic business environment, traditional applications pose significant challenges; hampering agility, scalability, and efficiency — which are all key attributes that organizations strive for to gain competitive advantage.
These applications, typically designed as large, monolithic structures, are complex to manage and update, often leading to increased downtime and hindered innovation. Moreover, their interconnected nature makes it difficult to scale or modify individual components independently, causing substantial bottlenecks and impacting overall performance.
Developers are increasingly pressured to deliver fast, reliable and scalable solutions to keep up with evolving business demands, and traditional applications are proving to be a hurdle in achieving these objectives. Microservices, particularly event-driven ones, offer a promising alternative.
The fundamentals of event-driven microservices Event-driven microservices are a powerful architectural pattern that combines the modularity and flexibility of microservices with the real-time responsiveness and efficiency of event-driven architectures. At their core, event-driven microservices rely on three fundamental principles: loose coupling, message-driven communication and asynchronous processing. These principles combine to create scalable, resilient and highly performant distributed systems.
By embracing loose coupling, message-driven communication and asynchronous processing, event-driven microservices can efficiently handle complex, dynamic workloads and adapt to the ever-changing requirements of modern applications.
Embracing loose coupling: The key to scalable and resilient event-driven microservices Loose coupling is an essential feature of event-driven microservices that facilitates the separation of concerns and modularity in a distributed system. This design principle helps to minimize the dependencies between individual services, allowing them to evolve and scale independently without impacting the overall system.
In a loosely coupled architecture, services are designed to react only to incoming commands, process them, and emit events.
Developers can create more robust, maintainable and scalable event-driven microservices by embracing loose coupling and designing services that react only to incoming commands, process them and emit events, as sketched below. This isolation allows for greater flexibility and adaptability in the face of changing requirements and growing workloads, ensuring the system remains responsive and resilient.
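Here is the minimal sketch referenced above: a framework-agnostic Python service that reacts only to an incoming command, updates its own state and emits an event. The command and event names and the in-memory event log are invented for illustration; this is not Kalix API code.

```python
# Framework-agnostic sketch of a loosely coupled service:
# it reacts only to incoming commands, updates its own state,
# and emits events describing what happened.
from dataclasses import dataclass

@dataclass
class ReserveStock:          # incoming command (hypothetical)
    order_id: str
    sku: str
    quantity: int

@dataclass
class StockReserved:         # emitted event (hypothetical)
    order_id: str
    sku: str
    quantity: int

class InventoryService:
    def __init__(self) -> None:
        self._stock = {"widget": 10}               # the service owns its own state
        self.event_log: list[StockReserved] = []   # stand-in for an event journal

    def handle(self, cmd: ReserveStock) -> StockReserved | None:
        # The service knows nothing about who sent the command or who
        # consumes the event; that is what keeps the coupling loose.
        available = self._stock.get(cmd.sku, 0)
        if available < cmd.quantity:
            return None
        self._stock[cmd.sku] = available - cmd.quantity
        event = StockReserved(cmd.order_id, cmd.sku, cmd.quantity)
        self.event_log.append(event)  # a platform would persist and publish this
        return event

service = InventoryService()
print(service.handle(ReserveStock("order-1", "widget", 2)))
```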
Leveraging event journals in Kalix: Streamlining event processing and communication In an event-driven microservice architecture, events are crucial in conveying information about changes occurring within the system. Event journals serve as a persistent storage mechanism for these events and as topics for message-driven communication. By writing events to an event journal, services can ensure that the events are durably stored and can be reliably consumed by other interested services.
Kalix, by Lightbend , is a Platform as a Service that simplifies this process by handling the persistence of events to event journals and publishing them to consumers.
By leveraging the capabilities of Kalix, developers can focus on building the core functionality of their event-driven microservices and let Kalix take care of persisting events in journals and publishing them to consumers.
Harnessing message-driven communication in event-driven systems: Events, commands and downstream services Message-driven communication is fundamental to event-driven systems, enabling services to communicate asynchronously and maintain loose coupling. This process involves the interaction between upstream services, events, commands and downstream services in a coordinated manner.
Message-driven communication in event-driven systems is essential for promoting loose coupling, asynchronous processing and scalability. By publishing events from upstream services, transforming them into commands and publishing those commands to downstream services, event-driven systems can efficiently handle complex workloads and adapt to the ever-changing requirements of modern applications.
Streamlining message processing with Kalix actions: From events to commands Kalix simplifies message processing and routing in event-driven systems with a component called Kalix actions: stateless functions designed to manage the communication flow between services by subscribing to event producers, transforming incoming events into outgoing commands, and sending those commands to downstream services. With actions handling message processing and routing, developers can focus on building the core functionality of their services.
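The general shape of such a stateless event-to-command transformation can be illustrated in plain Python. This is a sketch of the pattern only, not Kalix SDK code, and all event, command and function names are hypothetical.

```python
# Generic, stateless event-to-command transformer, in the spirit of the
# "action" component described above (not Kalix SDK code).
from dataclasses import dataclass
from typing import Callable

@dataclass
class OrderPlaced:           # upstream event (hypothetical)
    order_id: str
    customer_email: str

@dataclass
class SendConfirmationEmail: # downstream command (hypothetical)
    recipient: str
    subject: str

def order_placed_to_email(event: OrderPlaced) -> SendConfirmationEmail:
    # Pure function: no state, no side effects, easy to scale and test.
    return SendConfirmationEmail(
        recipient=event.customer_email,
        subject=f"Order {event.order_id} confirmed",
    )

def subscribe(transform: Callable[[OrderPlaced], SendConfirmationEmail],
              publish: Callable[[SendConfirmationEmail], None]) -> Callable[[OrderPlaced], None]:
    # In a real platform the subscription and delivery are wired for you;
    # here a closure stands in for that plumbing.
    def on_event(event: OrderPlaced) -> None:
        publish(transform(event))
    return on_event

outbox: list[SendConfirmationEmail] = []
on_order_placed = subscribe(order_placed_to_email, outbox.append)
on_order_placed(OrderPlaced("order-1", "ada@example.com"))
print(outbox)
```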
Transitioning from synchronous to asynchronous event-driven architectures: Learning from experience Developers and teams are often accustomed to synchronous communication patterns, as they are familiar and intuitive from their experience with object-oriented or functional programming. In these paradigms, objects invoke methods on other objects or functions that synchronously call other functions. This familiarity often leads to adopting synchronous communication patterns between microservices in distributed systems. However, as developers encounter production stability issues and recognize the limitations of brittle synchronous processing patterns, they begin to appreciate the merits of asynchronous event-driven architectures: loose coupling, improved responsiveness, enhanced fault tolerance and scalability.
By embracing asynchronous event-driven architectures, developers can address the limitations of synchronous communication patterns and build more scalable, resilient and efficient distributed systems.
Embracing asynchronous communication with Kalix: Focusing on core application logic Asynchronous communication is a core feature of Kalix, designed to streamline the development of event-driven microservices and simplify message delivery. By handling the complexities related to message delivery, Kalix enables developers to focus on the core aspects of their application: event producers, transformers that translate events into commands, and consumers that process these commands. Kalix provides several key features to support asynchronous event-driven processing flows: message delivery management, guaranteed at-least-once message delivery, scalability and performance, streamlined development, and flexibility and extensibility.
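One practical consequence of at-least-once delivery is that a consumer may receive the same message more than once, so consumers are commonly written to be idempotent. The generic sketch below is not Kalix code; the command type and the deduplication-by-id strategy are assumptions made for illustration.

```python
# Idempotent consumer sketch: at-least-once delivery can redeliver the same
# message, so processing is keyed by a message id and applied only once.
from dataclasses import dataclass

@dataclass
class ChargeCard:            # hypothetical command
    message_id: str
    order_id: str
    amount_cents: int

class PaymentConsumer:
    def __init__(self) -> None:
        self._processed_ids: set[str] = set()
        self.charges: list[ChargeCard] = []

    def consume(self, cmd: ChargeCard) -> None:
        if cmd.message_id in self._processed_ids:
            return                      # duplicate delivery: safely ignored
        self._processed_ids.add(cmd.message_id)
        self.charges.append(cmd)        # side effect happens only once

consumer = PaymentConsumer()
cmd = ChargeCard("msg-42", "order-1", 1999)
consumer.consume(cmd)
consumer.consume(cmd)                   # redelivery is a no-op
print(len(consumer.charges))            # -> 1
```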
By focusing on the core application processing logic and letting the platform handle the complexities of message delivery, development teams can create robust and responsive applications that can adapt to the ever-changing requirements of modern software development.
Summary The adoption of event-driven microservices is a strategic move transforming how businesses and developers approach software design and management.
Consider, in the healthcare sector, how event-driven architectures enable hospital networks to monitor patient health data in real time and trigger alerts to healthcare professionals when anomalies are detected. This could save lives by ensuring immediate action in critical situations.
These examples demonstrate how the principles of event-driven microservices, with the aid of platforms like Kalix, can revolutionize a wide range of industries by delivering robust, adaptable and responsive applications.
Read the deeper dive, full version of the blog here.
Hugh McKee is Developer Advocate at Lightbend.
"
|
2547 | 2022 |
"Fermyon brings WebAssembly to the cloud — looks to disrupt container-based app development | VentureBeat"
|
"https://venturebeat.com/programming-development/fermyon-brings-webassembly-to-the-cloud-looks-to-disrupt-container-based-app-development"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Fermyon brings WebAssembly to the cloud — looks to disrupt container-based app development Share on Facebook Share on X Share on LinkedIn Generations of vendors and developers have attempted to create technology that enables organizations to build an application that can run anywhere.
The promise of WebAssembly, which is a nascent open-source technology, is just that. With WebAssembly, developers can potentially write code in the programming language of their choice and then have it run, in a highly optimized approach, in any environment. The promise of WebAssembly has the potential to upend multiple areas of the technology market — including the cloud — which recently has increasingly shifted to a container model that doesn’t always serve every organization’s needs.
The challenges of containers and the promise of WebAssembly attracted Matt Butcher and his cofounders to start Fermyon.
“People were frustrated with the way that containers have to be built specific to the operating system and specific to the architecture,” Butcher told VentureBeat. “My team at Microsoft started accumulating a list of problems like that, that we were being told repeatedly and we tried to come up with solutions in a container ecosystem and we just couldn’t.” Today, Fermyon announced that it has raised $20 million in a series A round of funding. The company also announced the launch of a new cloud service designed to help the development and deployment of applications built with WebAssembly.
The Promise of WebAssembly WebAssembly technology has been iterating at a rapid pace over the last five years.
The technology first emerged in 2017 as a way for web browsers to run a low-level, assembly-like binary format that many programming languages can compile to. Browser vendor Mozilla introduced the concept of the WebAssembly System Interface (WASI) in 2019, which transformed the technology from something that only runs in the browser to something that can run anywhere. In 2021, the Bytecode Alliance was formed as a new multi-stakeholder body that includes the participation of Mozilla, Intel, Microsoft, Google and others, all looking to advance WebAssembly technology and adoption.
Butcher said that as he sat down with his team to try and find a solution to the challenges that container users face in the enterprise, the path led to WebAssembly.
“WebAssembly is really interesting,” Butcher said. “If you look at what it does in the browser, it’s designed to be a secure sandbox environment to execute untrusted code and all kinds of different languages can compile to that code.” Fermyon spins a new cloud platform Fermyon’s first product, Spin, was launched in March. Fermyon Spin is an open-source developer tool designed to help organizations build applications with WebAssembly.
A few months later in June, the open-source Fermyon Platform was announced as an infrastructure technology to enable WebAssembly to run in the cloud. While Butcher had a lot of experience with Kubernetes as a cloud-native deployment technology, the Fermyon Platform instead uses the Hashicorp Nomad cloud orchestration technology.
The new Fermyon Cloud service, out today, is the next step in the company’s product portfolio. Fermyon Cloud is a managed service for the deployment of WebAssembly applications. Butcher explained that Fermyon Cloud is not a general-purpose application cloud service but rather has a specific focus on microservices, which are often used as building blocks for cloud-native applications.
“The developer never has to know what the server architecture is behind the Fermyon cloud or the Fermyon platform,” Butcher said. “All they need to know is if it works on their own machine and compiled fine to WebAssembly, they can be confident they can push that code to the cloud and that it’s going to run.” WebAssembly modules and the future of cloud app development Butcher has high hopes for WebAssembly in the cloud and is even cautiously optimistic that it could one day replace the current container-based approach.
A more realistic outcome, in the short term at least, is that WebAssembly modules will be able to run alongside containers, enabling developers to rapidly build applications with both new and existing components.
“I think where the promise will unfold over time on the cloud side and in general for WebAssembly is you can just achieve much better performance and much higher density,” Butcher said.
"
|
2548 | 2022 |
"How a skilled advocate can improve SaaS technology transformation | VentureBeat"
|
"https://venturebeat.com/datadecisionmakers/how-a-skilled-advocate-can-improve-saas-technology-transformation"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Community How a skilled advocate can improve SaaS technology transformation Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
This article was contributed by Thomas Donnelly, chief information officer of BetterCloud.
Technology progress is hobbled in the enterprise. It’s time to un-hobble it.
Technology transformation is fundamental to every business that aims to run well and compete successfully. Our organization focuses on helping companies manage technology change and use it to transform — for the better — how departments do business. This gives us a continuous view of what works well in deploying new software-as-a-service (SaaS) apps.
We also experiment internally to improve our own business agility and expansion, and we’re here to explain what we found works best for tech transformation.
While it’s ideal to plan and coordinate new technology imperatives strategically, many IT groups are trapped in reactive mode. As a result, they are off-balance, unprepared, and key considerations get overlooked. We’re going to recommend that you try a new method to leapfrog over the struggles of reactive mode. First, though, let’s look at what tends to go wrong with technology (SaaS, in particular) deployments.
Application deployment projects are often frustrated by waits for some other department to finish a particular task. While you wait, forward movement is blocked. An implementation team can help by organizing and aligning the participants, who are typically busy and don’t see SaaS deployments as their top priority.
Operating departments and their project participants typically have a very “local” perspective, so they are not aware of applications, data, or deployments in other departments. That leads to duplications and avoidable mismatches or incompatibilities.
Most companies are functional hierarchies. That can mean resistance to implementing applications across functions. Marketing is here; finance is separate over there. Sales is on another island. Departmental boundaries make it difficult to sync enterprise data with silos , so data integration takes inordinate effort. Implementers may take the easy route and create a silo for one workgroup. That immediately gives birth to problems that are tough to fix later.
Nearly every company wants to implement more effectively across the enterprise and also avoid data and process siloes. There are often tradeoffs between meeting today’s needs in one department and fulfilling enterprise objectives. The department may prefer a specific app and vendor based on past familiarity, but the company seeks to avoid adding new SaaS vendors when a suitable application is already deployed in another department.
The embedded technology advocate and business analyst for SaaS initiatives In our experience, people are the key to technology transformation. Our SaaS initiatives enjoyed better outcomes when we embedded business analysts in departments, where they act as business partners, aligning technology with departmental objectives and corporate strategy. Their job starts with understanding a department’s needs. These analysts, whom we dubbed “embedded technology advocates” or advocates, are good at building trust and managing projects.
An embedded advocate helps with SaaS projects in requirements definition, technology choices, implementation, and user onboarding/privilege assignment. They attend department meetings, understand the challenges, help propose future initiatives and build constructive relationships that bridge between departments, IT and senior management. They work daily to align a department’s apps, data and processes with the rest of the enterprise.
How we went about it We established a group of Advocates (feel free to label them tech guides or simply business analysts) who embed with each functional department. Each advocate typically represents two or three departments at once to the enterprise and IT. They stay busy; an embedded Advocate at a large company may take part in 10 to 30 deployments per year, or more. Some deployments will be multiple rollouts of the same app, but in different departments.
We quickly saw that with advocates, the technology choices and deployments worked better. It was easy to see why. On their own, functional departments like marketing, sales, and purchasing are challenged to plan and complete new SaaS projects, especially when it comes to taking a cross-functional, enterprise view. Advocates quickly became experts in the subject of SaaS selection and implementation.
To find embedded Advocates, you can both hire internally and recruit outside. Look for project management skills and operational experience. Sales operations and Finance experience have worked out well in this “analytic extrovert” role.
Putting embedded tech and SaaS advocates into action Once assigned, the technology advocates meet regularly with their constituent departments and IT, security, and deployment teams. They learn the department’s processes and which data is important.
The Advocates become their department’s primary contact for technology changes.
The Advocate pulls the necessary contributors into a project when needed, which cuts the workload significantly. An Advocate actually makes technology decisions on behalf of the department, or at least plays an influential role in them.
In this kind of structure, the advocate becomes trusted by the departments as well as by IT and security, and can be a trusted guide.
The results are encouraging The results are promising. SaaS implementations and onboarding go faster, without disrupting operational teams. Creation of silos stops. The advocate completes work that department stakeholders don’t have the time, focus, cross-functional knowledge or motivation to do.
We saw the embedded advocate approach reduced the time expended on SaaS implementations and deterred information silos. It helps make departmental data widely accessible and more valuable to the company. Local ‘invisible’ silos can hold information valuable to other departments; they shouldn’t be in the dark about what’s there and how it’s used. Advocates are committed to leveraging data; they work to synchronize and integrate silo data with enterprise resources.
Whether an advocate embeds with two or three departments at once depends on experience, on their creativity, and their problem-solving ability. Advocates smooth the path and accelerate technology transformation. That brings IT to the forefront with more deployment successes, better data utilization, and positive reviews.
The Embedded Technology Advocate approach has been highly effective here. We now recommend it to our customers who may deploy 20, 30, or even more applications yearly. We are excited to see the results it brings for them.
Thomas Donnelly is chief information officer of BetterCloud.
"
|
2549 | 2012 |
"With new release, Apprenda bets on the hybrid cloud | VentureBeat"
|
"https://venturebeat.com/business/with-new-release-apprenda-bets-on-the-hybrid-cloud"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages With new release, Apprenda bets on the hybrid cloud Share on Facebook Share on X Share on LinkedIn Apprenda, a startup that claims to transform legacy infrastructure into modern cloud-based architectures, has announced its new 4.0 release.
Apprenda, a startup that claims to transform legacy infrastructure into modern cloud-based architectures, has announced its latest release. The update allows a customer to run a hybrid platform as a service (PaaS) simultaneously between the data center and its existing cloud infrastructure.
Apprenda makes it easier for developers to deploy and scale their applications; the service itself is the brainchild of a former application developer. Prior to founding the company, CEO Sinclair Schuller was a software developer at global investment bank Morgan Stanley. He told VentureBeat in a phone conversation that his team would often receive requests from a colleague on the business side with a “big idea” for a custom app.
“It took us three months to write the new [app] out and another three months to deploy it,” said Schuller. “It seemed wrong that we were wasting half the time waiting for an app to be deployed.” To increase productivity for companies like Morgan Stanley, he set to work on an idea to streamline the process of building these “exotic software infrastructures.” Schuller claims Apprenda shaves about 18 months off development time. The original idea was to develop a private cloud product that big enterprises would run in-house.
After some internal wrangling over the merits of public versus private cloud, Schuller came to the realization that a hybrid cloud approach made most sense.
“Enterprise customers are saying, ‘We want a private cloud to feel like a public cloud,'” he said. Using Apprenda, CIOs can run apps in Windows Azure cloud and other cloud environments that use Windows Server 2012.
With the new release, users can install Apprenda in a data center and point it to a cloud infrastructure like Amazon or Azure to manage them all from one centralized dashboard.
Apprenda has established itself by providing on-premise PaaS solutions to large pharmaceutical companies and financial services firms; it competes with VMware’s Cloud Foundry and new players to the on-premise PaaS market such as AppFog.
To explain his grand vision for Apprenda, Schuller uses an analogy of an old house. Instead of demolishing it and building a new house from scratch, why not remodel it with modern fixtures? The outcome will be the same as a custom home, but you’ll save time and resources. “Companies can migrate an app to the cloud without having to worry about altering it or rebuilding it from scratch,” he said.
Schuller will speak about the growing demand for PaaS at CloudBeat, a conference that highlights the most innovative cloud companies. CloudBeat is unique in its emphasis on customer case studies. It’s not abstract theories and ideas — executives will reveal their hard-fought solutions to very real technology problems.
"
|
2550 | 2021 |
"Stepsize: Engineers waste 1 day a week on technical debt | VentureBeat"
|
"https://venturebeat.com/business/stepsize-engineers-waste-1-day-a-week-on-technical-debt"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Stepsize: Engineers waste 1 day a week on technical debt Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
The average engineer spends 6 hours per week — roughly one day — dealing with technical debt , according to the State of Technical Debt 2021 report from Stepsize , a developer of software development tools. The average time spent on overall maintenance work and legacy systems is 33% — of which more than 50% of the time is spent solely on technical debt. That is time the engineer is not working toward their key goals.
Above: A little less than two-thirds — 61% — of engineers said backends tend to contain the most technical debt in the codebase.
Technical debt causes bugs and outages, and slows down the pace of development , 60% of engineers said in Stepsize’s report. This results in productivity loss because the engineers are spending more time dealing with issues related to technical debt and not on issues related to development.
Technical debt is also bad for team morale, 52% of engineers said in the Stepsize report.
Most of the technical debt lives in the backend, specifically in web server endpoints, the engineers said. Company applications, websites, and general infrastructure are other parts of the codebase that accumulate a large amount of technical debt.
When engineers wind up having to deal with older technologies because of unaddressed technical debt, that impacts customer experience. Developers often feel like they are forced to choose between new features and vital maintenance work that could improve their experience, and this is taking a significant toll. In some cases, technical debt may make it harder to implement new features, forcing clunky workarounds or limited functionality.
Part of the problem is that many companies don’t have processes in place to manage technical debt.
In the survey, 58% of engineers said their companies lacked such a process, and 66% said they believed their team would ship up to 100% faster if they did. To underscore just how important the engineers thought a process would be, 15% said they thought they would be 200% more productive, Stepsize found. Only 2% of engineers believed that having technical debt under control would make no difference for their team velocity.
The data suggests that one way to boost productivity is by paying down technical debt in the application backend and general infrastructure areas of the codebase.
The survey included 200+ engineering team members including developers, engineering leads, and CTOs spanning enterprises, mid-size companies, and startups around the world.
Read the full State of Technical Debt 2021 from Stepsize.
"
|
2551 | 2012 |
"StartApp reaches 150M downloads, brings 10x more revenue to developers | VentureBeat"
|
"https://venturebeat.com/business/startapp-reaches-150m-downloads-brings-10x-more-revenue-to-developers"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages StartApp reaches 150M downloads, brings 10x more revenue to developers Share on Facebook Share on X Share on LinkedIn StartApp , an Israeli startup that promises to bring much-needed revenue to frustrated app developers, today announced it has been downloaded 150 million times.
The highly profitable, fast-growth company is a darling of the Israeli tech scene — this, I noticed, during a trip to Israel this summer where I was introduced to StartApp’s CEO, Gil Dudkiewicz, an entrepreneur-in-residence at local venture capital firm, The Cedar Fund.
Dudkiewicz told me then that the pain-point StartApp is trying to address is simple. Developers struggle to generate revenue from their applications. Even some of the most downloaded apps do not have a strong and sustainable business model, which acts as a deterrent to young developers building the next generation of apps.
As it stands, developers only have two options to make money: They can charge for an application or set up in-app ads.
“The latter option provides an inferior user experience, as most users will only click on the ads by mistake,” the company’s chief executive told VentureBeat.
StartApp offers a platform that allows Android app developers to monetize through downloads, an attractive model for the most popular apps on the marketplace.
As VentureBeat reported in March , StartApp will pay developers $50 for every 1,000 downloads.
The company’s critics have been quick to point out the catch. Isn’t there always a catch? Users who download a StartApp-integrated app will notice a curious search icon appear on their home screen. This is how StartApp makes its money. Whenever a user searches the Web via this search portal, StartApp generates revenue, and developers can take a cut.
They will have been notified and have the right to delete it immediately — but will this taint the user experience? With the sheer volume of downloads and a rapidly expanding customer base (StartApp now integrates with 3,500 Android apps), the solution seems to be working for now. To continue to thrive, the company will need to find an alternative to the search box, particularly before expanding to iOS. Apple would not welcome the intrusion on its slick user interface.
StartApp faces strong competition from global mobile ad networks, such as Admob and InMobi.
The founders remain optimistic that the pay-per-download model will be the preferred option. “We have introduced a new take on app monetization, one that enables a clean app for a better user experience,” Dudkiewicz told me in an email interview.
StartApp brings us this news a few short months after its first funding round of $4 million, led by the Cedar Fund and Ascent Venture Partners. Since March, adoption rates have skyrocketed; the app was downloaded a further 100 million times.
"
|
2552 | 2021 |
"Sonatype acquires MuseDev, expands Nexus code analysis platform | VentureBeat"
|
"https://venturebeat.com/business/sonatype-acquires-musedev-expands-nexus-code-analysis-platform"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Sonatype acquires MuseDev, expands Nexus code analysis platform Share on Facebook Share on X Share on LinkedIn MuseDev's code analysis provides information about bugs found in the code as a comment in code review.
Sonatype , which provides tools for developers to build better quality software, has acquired code analysis platform MuseDev. The acquisition adds developer-friendly code scanning to Sonatype’s platform to create a “full-spectrum” software supply chain management platform, company CEO Wayne Jackson said.
Modern software development is less about developers writing every single line of code and more about them assembling different components with their own code. This means third-party code is almost always present in an application, and there are multiple ways for bugs to be introduced into the code. Developers have to test their own code to make sure there are no bugs and regularly verify the building blocks don’t contain issues that could affect their applications.
Sonatype makes tools to help developers manage the various building blocks and alerts developers of potential issues that need to be fixed. Historically, Sonatype has focused on scanning open source software for security vulnerabilities and on keeping risky components out of the application, Jackson said. Sonatype’s tools have helped identify security vulnerabilities in code the developers didn’t write, but that could still impact their application.
“As developers take on more responsibility for containers, code, and infrastructure, our mission is to make their lives easier while they make great software,” Jackson said. The way to help “developers optimize the code they write is by delivering directly to the toolchain.” VB Event The AI Impact Tour Connect with the enterprise AI community at VentureBeat’s AI Impact Tour coming to a city near you! Tools where the developer lives MuseDev’s code analysis platform scans the source code for more than security vulnerabilities. The static analysis tool emphasizes code quality and can identify critical performance and reliability issues in the code, as well as whether there are style issues the could hamper the code’s maintainability.
Developers don’t want security vulnerabilities in their code, but “they also don’t want to get paged in the middle of the night because the application was failing” due to performance issues, MuseDev CEO Stephen Magill told VentureBeat.
Muse integrates its 24 preconfigured code analyzers into GitHub, GitLab, and Bitbucket. The analyzers automatically assess each developer pull request and report any bugs found as comments in code review. The comments include clear guidance on how to fix the bugs, and the analysis considers information flow and thread safety to give developers deeper insight into the code. Developers see all the feedback — from their teammates and from Muse — in one place and are able to fix the issues as part of their normal workflow. There is no need to wait for the security team to run its own assessment and inform developers of the issues that were uncovered.
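As a generic illustration of the kind of thread-safety problem such analyzers flag (not actual MuseDev output), the Python snippet below shows an unsynchronized shared counter and the lock-protected fix a review comment might suggest.

```python
# A classic data race that static analyzers flag: multiple threads perform a
# non-atomic read-modify-write on shared state, so updates can be lost.
import threading

counter = 0
lock = threading.Lock()

def unsafe_increment(n: int) -> None:
    global counter
    for _ in range(n):
        counter += 1          # not atomic: updates from other threads can be lost

def safe_increment(n: int) -> None:
    global counter
    for _ in range(n):
        with lock:            # the fix a review comment would typically suggest
            counter += 1

# Running the safe version gives a deterministic result; swapping in
# unsafe_increment frequently yields a total below 400000.
threads = [threading.Thread(target=safe_increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)                # -> 400000 with the lock held
```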
“Teams adopting this approach are 70 times more likely to fix code quality and security issues,” Magill said.
Muse is pretuned to minimize false-positive results to ensure developers are receiving information about issues that matter the most, which helps developers work more efficiently and write better quality code. “As enterprises look to push their development teams to work faster, it becomes imperative to find ways to help developers to move more quickly by automating crucial but time-consuming tasks like code analysis,” RedMonk principal analyst Stephen O’Grady told VentureBeat.
Full-spectrum software management The acquisition of MuseDev expands the breadth and depth of Sonatype’s Nexus platform because the combination of Muse — a cloud-native source code analysis tool — with Sonatype’s existing tools gives developers more control over their code.
Nexus Container is a developer-friendly container security solution that provides continuous visibility into the composition and management of containers from development to run time. The Infrastructure as Code Pack provides guidance to assist developers in configuring cloud infrastructure and ensuring they are compliant with privacy and security standards such as CIS Foundations Benchmarks, GDPR, and HIPAA.
The pack helps developers fix mistakes in configuration before they are applied to production infrastructure. Nexus Repository makes it easier to host and distribute build artifacts such as Docker containers and code components. The recently released Advanced Development Pack delivers a real-time rating system to help developers select the best open source component suppliers and avoid using multiple versions of the same code. The Advanced Legal Pack, which will be released in a few months, will improve visibility into open source licenses.
Developers will be able to use Sonatype’s expanded platform for all application building blocks, which include first-party source code, third-party open source code, infrastructure-as-code, and containerized code.
“With high-profile attacks on software supply chains making headlines the world over, enterprises are moving to harden their development infrastructure against attackers.
As important as the task is, however, technology leaders don’t want to solve this problem with a complicated patchwork quilt of services, solutions and providers — they want an integrated, end-to-end solution,” O’Grady said.
This kind of integrated code analysis is something enterprises are asking for as they adopt DevOps practices to build and release better quality code and accelerate their digital transformation efforts to improve speed and efficiency. This acquisition and platform expansion positions Sonatype very well among companies that offer various forms of code analysis and scanning, including Checkmarx, Contrast Security, Micro Focus Fortify, Snyk , Synopsys, Veracode, and WhiteSource.
The company has been growing tremendously over the past year. It now counts 70% of the Fortune 100 as customers, supporting more than 2,000 commercial engineering teams. And 12 of the 15 largest banks in the world use Sonatype’s tools, Jackson said. Other customers include various branches of the United States Armed Forces, credit card companies, and technology companies. There are more than 250,000 instances of Nexus Repositories, which translates to nearly 15 million developers using Sonatype’s commercial and open source tools. Private equity and venture capital firm Vista Equity Partners made a majority investment in Sonatype back in 2019 — acquiring more than 50%. Jackson suggested the company could see a potential IPO at the current pace of growth.
Most of the enterprises using Sonatype’s tools are not technology companies in the traditional sense. There are financial services organizations with more developers in-house working on internal applications and proprietary tools than companies such as Apple and eBay, Jackson said. Those enterprises are looking at the entire software development lifecycle, which means they care about things other than security vulnerabilities when considering the health of their applications, such as project and release hygiene, Jackson said.
“Why should [developers] pick a project that hasn’t been updated in years or has bad commit history?” Jackson said.
"
|
2553 | 2021 |
"OutSystems raises $150 million to expand the reach of its low-code development platform | VentureBeat"
|
"https://venturebeat.com/business/outsystems-raises-150-million-to-expand-the-reach-of-its-low-code-development-platform"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages OutSystems raises $150 million to expand the reach of its low-code development platform Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
OutSystems , an early developer of a platform for building applications using low-code tools, today announced it has raised a $150 million series E round that brings its valuation to $9.5 billion.
The latest round of funding, led by Abdiel Capital and Tiger Global, will be used to continue expanding the company’s overall go-to-market and product development strategy, OutSystems CEO Paulo Rosado said.
Headquartered in Lisbon, Portugal, OutSystems provides a platform that is employed by thousands of companies and more than 350 partners operating in over 90 countries. With more than 1,300 employees worldwide, the company is engaged in a battle for dominance over tools that abstract away much of the complexity traditionally associated with building applications using frameworks such as Java.
Instead, OutSystems presents developers of varying skill levels with a set of visual tools that enable them to build applications faster. Competitors range from Appian and Salesforce to Mendix, a unit of Siemens. OutSystems also has a somewhat complicated relationship with Microsoft, which in addition to having a long-standing partnership with OutSystems provides its own set of low-code tools.
In general, the adoption of both low-code tools and no-code tools for developing applications has increased since the start of the COVID-19 pandemic. Developers are in short supply, and these tools enable them to build more applications faster at a time when many digital business transformation initiatives have either been accelerated or newly launched. Low-code tools typically differ from no-code tools in that the former allow developers to drop down into a lower-level programming construct when required to enable an application to, for example, scale better.
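The distinction is easier to picture with a toy example. The sketch below is purely illustrative (it is not how OutSystems or any particular vendor implements its platform): most steps in a flow are prebuilt, configurable blocks, and one step drops down to a hand-written Python function where the blocks are not enough.

```python
# Toy "low-code" flow: prebuilt blocks cover the common steps, and any step
# can fall through to ordinary code when the blocks run out of road.
prebuilt_blocks = {
    "validate_email": lambda rec: "@" in rec.get("email", ""),
    "uppercase_name": lambda rec: {**rec, "name": rec["name"].upper()},
}

def dedupe_by_email(records):
    """The escape hatch: custom logic written by hand in a real language."""
    seen, unique = set(), []
    for rec in records:
        if rec["email"] not in seen:
            seen.add(rec["email"])
            unique.append(rec)
    return unique

records = [
    {"name": "ada", "email": "ada@example.com"},
    {"name": "ada l.", "email": "ada@example.com"},
]
records = [prebuilt_blocks["uppercase_name"](r)
           for r in records if prebuilt_blocks["validate_email"](r)]
records = dedupe_by_email(records)  # drop down to code for the custom step
print(records)  # [{'name': 'ADA', 'email': 'ada@example.com'}]
```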
OutSystems has gained traction among organizations that need to quickly build applications spanning multiple platforms, Rosado said. An application built using the OutSystems platform will span anywhere from three to 17 different systems, he noted. In contrast, low-code tools from, for example, a software-as-a-service (SaaS) application provider such as Salesforce only work on a single platform.
Most enterprise IT organizations are now routinely trying to build applications that span multiple platforms running on-premises and in the cloud, including a wide range of SaaS applications that have been deployed to address a specific requirement, Rosado said. Low-code tools make it possible to maximize investments in professional developers who can now build code that can be reused more easily by other developers irrespective of their skill level, he said.
Rosado noted that most organizations are always refining and extending applications. “Software cannot stay small,” he said. “It always grows.” The level of integration provided by a low-code platform is critical because most organizations don’t know precisely how software will need to be extended in the future, Rosado added. Modern software development is based on a set of Lego principles that allow organizations to mix and match components to create additional business value, he added.
Longer-term, Rosado said OutSystems expects AI to be more widely employed to not only build applications but also automate management of the infrastructure IT organizations still largely operate using manual processes. OutSystems is already making it possible to embed AI capabilities that are based on machine learning algorithms into custom applications.
Of course, the biggest challenge IT organizations may soon face is not the rate at which they can build applications. Rather, as organizations employ low-code tools to build applications faster, the backlog for deploying them continues to grow beyond the current level of DevOps maturity most enterprise IT organizations have been able to consistently achieve and maintain.
"
|
2,554 | 2,018 |
"Is citizen development a threat to business? | VentureBeat"
|
"https://venturebeat.com/business/is-citizen-development-a-threat-to-business"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Sponsored Is citizen development a threat to business? Share on Facebook Share on X Share on LinkedIn Presented by Smartsheet Low-code tools are making it easier for citizen developers to create custom business apps that improve productivity and agility. But do they put an organization at risk? Business users have traditionally relied on IT personnel who can write code or manage complex administration tools to build, configure, and modify applications for their specific needs. However, faced with an increasing number of projects and the shrinking number of available developers, IT departments are forced to establish a cut line. As a result, many valid project requests from business units never see the prioritization light of day.
Some companies have responded by teaching their non-technical personnel to code. This do-it-yourself development theme is being catered to by a slew of software vendors under the moniker of “citizen developer” — employees without formal programming training or experience who create apps outside of IT. With minimal coding skills, the thinking goes, non-technical knowledge workers can become citizen developers and (hopefully more quickly) design or configure new applications.
In theory, the citizen development movement can bring positive impact to businesses by:
Bringing new capabilities and custom apps online quickly
Empowering knowledge workers to build technical workarounds to problems, increasing organizational speed and agility
Taking pressure off of stretched-thin IT departments, freeing them up to focus on bigger projects
Yet, citizen development also complicates matters.
There’s a real threat of novice programmers, without formal training or certifications, hacking together solutions that leave vulnerabilities in their wake. This means more work for enterprise security teams, who need to come in and patch vulnerabilities when one of these hacked-together solutions fails.
Andrew Townley, CEO of Archistry, an IT, business, security, and management consultancy, notes: “Citizen developers are actually a double-edged sword. On one hand, they can support the enterprise IT department by developing business-critical applications that truly enable the business. On the other hand, they often do so in relative isolation of the enterprise IT strategy, meaning that prior oversight and management of applications that eventually become critical to the business is nearly impossible.”
The citizen developer: What can go wrong?
Consider one anecdotal case recently cited in CSO.
Security director John Britton of VMware was asked to clean up a shadow IT application that had been deployed at a business. Britton noted that the “citizen developer” who created the application tried to include security in the form of usernames and passwords properly hashed in a database. However, no “forgot-my-password” function was provided.
The result: Frequent requests for manual resetting of passwords flooded into the citizen developer, so he removed the password hash function and passwords were then stored in the clear. Anyone with access to this database potentially had access to employee passwords.
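The remedy for that anecdote is well understood: keep the salted hash and add a proper reset flow instead of storing passwords in the clear. Below is a minimal sketch using only Python's standard library; the function names and the iteration count are illustrative choices, not taken from the article.

```python
import hashlib
import hmac
import secrets

def hash_password(password: str, salt: bytes = None) -> tuple:
    """Store only the salt and the PBKDF2 digest, never the password itself."""
    salt = salt or secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    return hmac.compare_digest(hash_password(password, salt)[1], digest)

def issue_reset_token() -> str:
    """The missing 'forgot my password' piece: a single-use random token is
    emailed to the user, so nobody ever needs to read a stored password."""
    return secrets.token_urlsafe(32)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
print("reset token:", issue_reset_token())
```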
There can also be risks around sufficient support, enhancements, and knowledge transfer. Townley points out, “What happens when Sally finds a job at a new company and she’s the only one who understands the whole application that supports a key business process because she developed it? It’s highly likely that nobody will remember to get her to do a debrief of the application as well — assuming there’s someone on the team willing and able to take it over.” Phillip Dennis, founder and principal of Watkyn , pointed out the knowledge transfer risk when the citizen developer is the only person in the organization who understands the design and maintenance of the app. To mitigate that risk, Dennis suggests requiring code commenting and documentation.
The road ahead: No-code platforms
What business users need — and what organizations should focus on identifying and implementing — are high-value applications that are intuitive and enable business users to incorporate business logic without the need to write custom code or engage IT developers. The goal is to benefit from subject matter expertise without the need to overburden already stretched IT teams, and to reduce unnecessary risk.
Alan Lepofsky, VP and Principal Analyst at Constellation Research explains: “Traditionally, knowledge workers had to rely on their IT department to develop and deploy applications. This can often be a lengthy and expensive process. But now the rise of low-code and even no-code solutions, which enable ‘non-developers’ to use drag-and-drop to add fields, buttons, and basic programming logic to forms is enabling people to create applications to assist in their own business processes.” Lepofsky adds, “While these new solutions are easy to create, a level of control is still required, so that areas like corporate branding, compliance, and governance aren’t overlooked.” If your organization already has citizen developers you can work with them to evaluate platforms — some popular ones being Smartsheet , Quickbase , and AppSheet — that deliver ease-of-use and flexibility to business users, along with the necessary controls to reduce organizational risk.
Sponsored posts are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. Content produced by our editorial team is never influenced by advertisers or sponsors in any way. For more information, contact [email protected].
"
|
2,555 | 2,013 |
"How Docker turned intricate Linux code into developer pixie dust | VentureBeat"
|
"https://venturebeat.com/business/how-docker-turned-intricate-linux-code-into-developer-pixie-dust"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages How Docker turned intricate Linux code into developer pixie dust Share on Facebook Share on X Share on LinkedIn Docker characterizes its simplified Linux containers as a standard method for moving applications from machine to machine.
Every once in a while, a technology comes along that nabs attention. The next thing you know, it’s a mission-critical piece of infrastructure at companies big and small.
Hadoop, MongoDB, and Node.js have gone down this path (as have others). The technology that’s come closest to that desirable status in 2013 might just be the Docker container.
It’s based on open-source technology that emerged in the mid-2000s — Linux containers , which run isolated applications on a single physical server. But a company called Docker has made the technology easier to implement and far more useful. Through Docker, the Linux container has blossomed into a tool that helps developers build one application and easily move it into a testing environment and then a production environment, and then from one cloud to another, all without modifying the code.
In some ways, Docker containers are like virtual machines. But they’re often more lightweight and less demanding on the chips and memory in servers. Plus, the code for building these containers is available for developers to inspect and build on under an Apache open-source license.
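That build-once, run-anywhere workflow can be sketched in a few lines. The example below uses the Docker SDK for Python, a convenience wrapper around the Docker daemon rather than anything described in this article, and assumes a local directory ./myapp containing a Dockerfile.

```python
import docker  # pip install docker; talks to the local Docker daemon

client = docker.from_env()

# Build the image once from ./myapp (assumed to contain a Dockerfile).
image, build_logs = client.images.build(path="./myapp", tag="myapp:1.0")

# The identical image can now run on a laptop, a test host, or a cloud VM.
container = client.containers.run(
    "myapp:1.0",
    detach=True,                # run in the background
    ports={"8080/tcp": 8080},   # map container port 8080 to the host
    name="myapp-test",
)
print(container.status, container.id[:12])
container.stop()
container.remove()
```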
Since it became freely available in March, startups have been assembling products based on it, sometimes under the phrase “Docker-as-a-Service,” including Orchard and Copper.io’s StackDock.
Big companies have leapt to embrace Docker containers, too. In making its Infrastructure-as-a-Service (IaaS) public cloud available to all earlier this month, Google said it was adding support for operating system software , including Docker. Red Hat has been moving closer to Docker, too, with support in the new beta version of the Red Hat Enterprise Linux 7 package.
CenturyLink is thinking up a next-generation platform for cloud computing, in a project called CTL-C, and Docker will play a considerable role in it. Fast-growing IaaS provider DigitalOcean provides an application to launch Docker containers in its droplets, the company’s term for virtual servers.
VMware, a company firmly rooted in the virtual-machine camp, provides support for Docker in vSphere, for running virtual machines on physical servers, and the vCloud Hybrid Service, the VMware public cloud that connects to companies’ on-premise data centers. A spokeswoman confirmed as much in an email to VentureBeat, although the company hasn’t made much noise about Docker.
The latest example came from China, where search company Baidu said its Platform-as-a-Service (PaaS) public cloud, the Baidu App Engine, “is now based on Docker,” according to a press release Docker put out last week.
Baidu likes Docker containers because they handle multiple languages and frameworks and provide for a lower cost of development in comparison with more traditional sandboxes.
What these babies can do
And Docker isn’t just another open-source tool to commercialize. Engineers at major companies have been talking about how Docker fits into key workflows.
EBay Now, the company’s fast delivery service, depends on Docker containers for development and testing purposes, and production use was “coming,” said Ted Dziuba, a senior member of technical staff at eBay, during a talk he gave at a Docker event in July.
“The same container for an application works everywhere,” so long as developers know how to connect containers with one another, he said. And that simplifies life for developers.
Docker containers have also made it easy to set up rich development environments at RelateIQ, a startup with software for keeping on top of sales contacts. John Fiedler, who works on IT operations for the company, wrote about Docker’s use in a couple of recent blog posts and noted that the company will soon start using Docker in production.
Russian search company Yandex relies on Docker containers to isolate applications in its open-source PaaS, Cocaine. Yandex uses Cocaine for internal purposes and as a platform to provide its own internet browser to consumers.
Developers at Rackspace’s email service, Mailgun, and CloudFlare have also publicly discussed Docker, but you get the point. Developers like the container model, specifically Docker’s version, and companies are taking it more seriously.
All of this has happened within just a few months of Docker, the company behind Docker containers, making the code available for developers to check out.
Where containers came from
Docker containers started out as internal technology for PaaS provider dotCloud, Docker chief executive Ben Golub said in an interview with VentureBeat. Engineers at dotCloud combined Linux containers with other open-source technologies, such as the Linux kernel features called cgroups and namespaces, in a way that stripped out much of the complexity of working with containers directly.
“There was a bunch of sort of arcane languages you needed to learn how to use in order to use LXC,” he said. “We provide a standard API (application programming interface) that made it really easy for developers to take any application and package it inside of a container and that made it really easy for any system administrator to run a server that had 10 or 100 or more containers running on it.” After our conversation, Golub sent an email that explains the need for the technology and shows how customers wanted to use it outside dotCloud’s cloud: In running the dotCloud PaaS, we had a large number of customers creating a large number of applications using, in our case, a fairly large number of different “stacks” that ran on our shared, hosted infrastructure. To some extent, this is a small version of the “matrix from hell” that I described, where you have large numbers of applications, languages, and frameworks that need to run efficiently, stably, and securely across large numbers of different servers. We used the container-related technology that ultimately evolved into Docker to ensure that we could manage this environment.
As we ran dotCloud, it became clear that customers wanted, not just a large number of different stacks, but the ability to use almost any stack. And, they wanted to run not only on our infrastructure, but to flexibly move between any infrastructure: public or private, virtualized or non-virtualized, and across their favorite flavor of operating system. And, they wanted to be able to integrate with their choice of adjacent technologies, such as Chef, Puppet, Salt, OpenStack, etc. We knew that no company could deliver such an all-encompassing solution, but that we could enable an ecosystem to deliver it. That was the genesis of Docker.
And now Docker has “succeeded even beyond our best hopes,” Golub said. No wonder the company changed its name from dotCloud to Docker in October.
“I think we’ve hit upon something that is making the lives of developers and system administrators and CIOs and everybody in between just a heck of a lot easier,” Golub said.
The company won’t just keep providing open-source technology. It still provides its PaaS. But next year, it will introduce new ways to make money off of Docker containers.
“Generally speaking, containers are built in one place and then they are run in hundreds of other places, so you need a central hub to take that Docker container, push it and then lots of other places need to find it and pull it,” Golub said. A hosted service that could do that, he said, is the top priority. Management tools could help administrators keep track of where containers are running, who created them, and how they’re performing.
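The hub workflow Golub sketches, push from the build machine and pull everywhere else, maps onto a couple of calls. Another hedged example with the Docker SDK for Python; the registry address and tag below are placeholders.

```python
import docker

client = docker.from_env()

# Publisher side: tag the locally built image for a registry and push it.
image = client.images.get("myapp:1.0")
image.tag("registry.example.com/team/myapp", tag="1.0")
client.images.push("registry.example.com/team/myapp", tag="1.0")

# Consumer side, on any other machine: pull the same artifact and run it.
client.images.pull("registry.example.com/team/myapp", tag="1.0")
client.containers.run("registry.example.com/team/myapp:1.0", detach=True)
```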
The company also would like to bring in revenue through professional services, such as commercial support for those who use Docker containers. Partnerships with companies that sell services using Docker containers could be another revenue source, Golub said. No matter how much money flows to Docker because of the container breakthrough, though, it’s worth pausing for a moment to acknowledge the efforts of a company that contributed to application development in a big way this year.
"
|
2,556 | 2,022 |
"How DevOps platform Zeet accelerates application deployment | VentureBeat"
|
"https://venturebeat.com/business/how-devops-platform-zeet-accelerates-application-deployment"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages How DevOps platform Zeet accelerates application deployment Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Today, technical decision-makers at companies big and small look for flexible ways to speed up application development and ensure long-term scalability for their engineering teams. Configuring solutions such as AWS requires a team of in-house experts, which is expensive and takes time to build. In fact, more than 77% of all tech companies run into DevOps challenges across the board, including cost, risks, security, optimization of the deployment pipeline, and scaling.
San Francisco-based Zeet , a platform that solves DevOps challenges and accelerates application deployment for startups, has raised $2 million in a seed round of funding led by venture capital firm Race Capital. The company plans to use the funding, which also saw participation from GGV Capital, Founders Inc, and multiple engineering leaders, to build out its teams across areas ranging from engineering to marketing and meet growing demand from customers.
Founded in 2020 by Johnny Dallas and Zihao Zhang, Zeet strives to tackle these bottlenecks through automation. The solution connects to a cloud account and automates traditionally manual DevOps tasks, allowing a team to quickly go from code to a scalable application.
Automated DevOps deployment
In order to use the platform, all a developer has to do is bring their code to the Zeet platform from a GitHub or Docker Hub repository. Once that is done, the solution analyzes the underlying source code and defines the elements required for processing, including which codebase to use for development. Then, it automatically builds the application without requiring the user to cobble together CI tools such as AWS CodeBuild, Jenkins, and Travis CI.
After this, the application can be transparently deployed across multiple clouds and configured for active monitoring using enterprise tools such as PagerDuty, Datadog, or Dynatrace.
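Zeet has not published how its analysis step works, but the "look at the source and decide how to build it" idea can be pictured with a toy heuristic like the one below; it is entirely illustrative and not Zeet's implementation.

```python
from pathlib import Path

# Well-known manifest files mapped to an assumed build strategy; a real
# platform would inspect the code far more deeply than this.
MARKERS = [
    ("Dockerfile", "docker build"),
    ("package.json", "Node.js build (npm install && npm run build)"),
    ("requirements.txt", "Python build (pip install -r requirements.txt)"),
    ("go.mod", "Go build (go build ./...)"),
    ("pom.xml", "Java build (mvn package)"),
]

def detect_build_strategy(repo_dir: str) -> str:
    root = Path(repo_dir)
    for marker, strategy in MARKERS:
        if (root / marker).exists():
            return strategy
    return "unknown: fall back to asking the user"

print(detect_build_strategy("."))  # e.g. "Python build (...)" for a Python repo
```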
“I want Zeet to be the ultimate DevOps and deployment platform. Our developers and companies use Zeet to deploy across any major cloud vendors including AWS and GCP. We make it super easy to configure your cloud provider — be it AWS or GCP or otherwise,” Dallas told VentureBeat.
“This is a new take on application deployment as our customers are able to have deep visibility into their infrastructure, trust their setup and deployment process, and maintain cloud costs, all with no technical restrictions imposed on them as to what kinds of applications they can deploy. With Zeet they get all of the benefits of an infrastructure tool, all while using their own existing cloud platform,” he added.
The solution, as Dallas explained, has already been adopted by over 17,000 developers and companies, with their team-focused product witnessing 50% month-over-month growth in revenue across the past six months. The cofounder didn’t disclose the exact revenue figures but confirmed that both startups and enterprise customers have been driving the growth.
Competitors
In the deployment tooling space, Zeet goes against players such as HashiCorp, Netlify, Render.com, and Heroku.
However, Dallas claims its platform is different from typical PaaS providers because it allows customers to deploy into their own cloud provider.
“We don’t charge an exponentially increasing margin, we work to minimize our customers’ costs,” he said. “Plus, we don’t ask our customers to trust us with their reliability, we show them why their setup won’t break and arm them with tools to combat downtime. For example, with the recent AWS outages, all of Zeet’s customers were able to switch to another region (or cloud provider entirely) to avoid downtime.” Globally, tech companies around the world continue to make huge investments in cloud operations and internal tooling. According to Gartner , spending on public cloud services is expected to grow 23.1% in 2021 to $332 billion, up from $270 billion in 2020. Meanwhile, the total spending across cloud application services is likely to touch $122 billion in 2021 and $145 billion by 2022.
"
|
2,557 | 2,021 |
"Creatio: Low-code/no-code can boost digital transformation efforts | VentureBeat"
|
"https://venturebeat.com/business/creatio-low-code-no-code-can-boost-digital-transformation-efforts"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Creatio: Low-code/no-code can boost digital transformation efforts Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Businesses face many challenges with digital transformation, but 43% of digital and IT leaders said the core barrier to digital transformation is a lack of skilled resources, according to the State of Low-Code/No-Code 2021 Report from Creatio, a provider of a low-code platform for customer relationship management and process management.
Creatio said low-code/no-code technology can bridge the talent gap.
Above: Highlights from the report show that only up to 10% of business processes are fully automated.
Coming out of the COVID pandemic, digital transformation/acceleration is something that has risen to the top of the priority list for companies of all sizes and across all industries. Yet, for all the talk about digital transformation, companies are finding it a challenge to execute as so much of it depends on traditional IT and software/application development (IT resources, developers, data scientists — talent in high demand and low supply).
Underpinned by a survey of 1,000+ IT, digital, and business leaders within Creatio’s global network, Creatio’s State of Low-Code/No-Code 2021 Report uncovers the challenges of digital transformation facing businesses and the role low-code/no-code technology can play to accelerate automation and bridge the ever-widening IT talent gap.
Above: Another finding from the report: 95% of digital and IT leaders said low-code/no-code development is faster than traditional development.
The report shows that 43% of respondents say that lack of skilled resources is the core barrier to digital transformation. With 95% of respondents saying they plan to continue their digital transformation initiatives in progress in 2021, the lack of skilled resources is a growing problem. Of these digital and IT leaders, there is support for the adoption of low-code/no-code technology to bridge this gap and empower business users without coding skills to automate processes and build/customize applications.
While momentum for low-code/no-code is growing, the technology faces challenges of its own. Surprisingly, only 6% of low-code development is currently done by business users without any IT involvement, and 60% claim a lack of experience with low-code platforms is the biggest obstacle in low-code adoption. This signals significant room for better training and an opportunity to further flatten the traditional IT hierarchy, where IT plays a role in governance but users of low-code platforms are empowered to make and implement business process decisions on their own, without layers of management approvals and requests getting bottlenecked in IT backlogs for months.
Read Creatio’s full report The State of Low-Code/No-Code.
"
|
2,558 | 2,015 |
"Coding bootcamps: How to jumpstart a coding career (webinar) | VentureBeat"
|
"https://venturebeat.com/business/coding-bootcamps-how-to-jumpstart-a-coding-career-webinar-tomorrow"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages VB Webinar Coding bootcamps: How to jumpstart a coding career (webinar) Share on Facebook Share on X Share on LinkedIn Don’t wait – Click here to listen to this webinar right now.
As the world tops one tech breakthrough with another, the demand for qualified programmers to fuel the industry grows exponentially.
Not a shocking discovery — after all, those time-saving (or time-consuming) apps didn’t manifest on their own. However, what is shocking is that despite the high demand for tech jobs, there are still people who, despite a keen desire, are stumped on how to get a leg up in the business.
Either intimidated by the perceived difficulty in learning to code, or financially stretched to support full-time college enrollment after a certain age, would-be programmers are missing out on following their dreams. And learning to code solo isn’t a viable option — it just doesn’t prepare you for the everyday challenges that arise from programming. So, what’s the solution? The answer is actually pretty simple: Coding bootcamps.
Across the nation, coding bootcamps are popping up and providing young tech lovers with the education necessary to mature their skills at a professional level. With a determined focus on programming and seasoned pros assigned for one-to-one mentoring, coding bootcamps are far more beneficial to students than universities that pass the craft off as a general elective. Furthermore, some coding bootcamps provide lower-cost payment alternatives and offer discounts for minorities and women. Not to mention, 75 percent of those who graduated from coding bootcamps found full-time employment with an average salary increase of 44 percent.
Coding bootcamps aren’t just a leg-up for students, but valuable for entrepreneurs as well. It helps educate them on how to build the products they want in a timely fashion, along with introducing them to promising coders to join their company. What’s a better opportunity to find fresh and diverse minds in the industry than the training programs they’re emerging from? In this webinar hosted by Dave Paola , Cofounder and CTO of Bloc, and Prasid Pathak , Bloc’s Director of Marketing, you’ll learn more about the benefits of enrolling in coding bootcamps — benefits that range from financial gain for you or your start-up company, to bettering your skills as a programmer — and the differences between them. It will also show you how to work on your raw ideas after graduation and successfully promote them to your employers. The coveted life of a programmer is actually within reach.
No need to wait – this webinar is live! Register here to listen.
In this webinar, you’ll:
Gain insight into what skills you’ll need to get ahead in the tech field
Learn how to leverage your great idea into the next billion dollar project
Research the next step for your future in coding, whether you want to build apps, games or mobile-driven on-demand services.
Speakers:
Dave Paola, Cofounder & CTO, Bloc
Prasid Pathak, Director of Marketing, Bloc
Moderator: Wendy Schuchart, Analyst, VentureBeat
This webinar is sponsored by Bloc.
"
|
2,559 | 2,015 |
"Coding Bootcamp: Reaching tech’s front line without 50-mile hikes or having to shave your head (webinar) | VentureBeat"
|
"https://venturebeat.com/business/coding-bootcamp-reaching-techs-front-line-without-50-mile-hikes-or-having-to-shave-your-head"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages VB Webinar Coding Bootcamp: Reaching tech’s front line without 50-mile hikes or having to shave your head (webinar) Share on Facebook Share on X Share on LinkedIn Want to chuck the day job and make the next billion dollar mobile app? You gots to have coding skills! We’ve got tips on how to pick the coding bootcamp that’s going to help you rise to the top of the app store.
Watch for free! If you’re looking to break into the tech industry and land a lucrative job as a coder, learning the necessary skills to achieve your goal can be a lengthy (and costly) marathon of college courses filled with numerous instructors, lectures, labs, and textbooks. Indeed, that’s the path Dave Paola took to get his college degree. However, Paola’s university experience told him that there’s a better way — that you can get yourself a coding gig after more of a sprint — which is why in 2012 he co-founded Bloc, a company he calls the “first online developer bootcamp.” Paola, Bloc’s CTO, says that he went to a top college, took the necessary classes, and put his nose to the grindstone to get to the finish line. However, looking back, he feels he learned more by digging in and actually doing the work.
“I gained a lot more experience that translated into real-world skills, not by paying attention and being disciplined in the classroom,” Paola says, “but actually by working with a group of smart people on real projects, over and over again. It taught me all sorts of skills that the classroom didn’t teach, like working with other people, collaborating, communicating, and all of the soft skills that come along with that.” With that in mind, Bloc operates under the thinking that people learn better through one-on-one mentorship than in a less-personal one-to-many classroom. When you start in one of Bloc’s bootcamps, you’re paired with one of the company’s 150-plus mentors, all of whom have deep experience in the field you’ll be studying. Paola stated that the mentor also has the freedom to tailor the coursework to the student’s abilities and skills, and emphasized that the mentor/student relationship often outlasts the Bloc bootcamp.
“Everything at Bloc is in support of the student/mentor relationship,” Paola says. “We give folks internally here a lot of leeway to make sure that the mentor does what’s right for the student. In many cases, the mentor is the one who ends up placing the student, which is a really great thing.” To that end, when a course ends, each student is readied for the job market with resume-writing assistance, help in putting together a portfolio, and even job-interview prep. They leave the bootcamp with the necessary skills and practical experience to handle a coding job, but also an understanding of what they need to do to land that job.
Perhaps the biggest motivation for choosing a bootcamp is that you don’t have to quit your day job and move to a different city to take the course. And Bloc’s course materials are updated regularly, with new tech developments being integrated into the course immediately, so you know that you’re getting the most up-to-date training on what you’re interested in, whether it’s learning Ruby, user-interface study, or platform-specific mobile-app instruction.
This webinar will cover all of the essentials you need to know about programming bootcamps, how to go about choosing one (Paola acknowledges Bloc is a good fit for some and not others), and answer any questions you might have about the process.
Don’t miss out! Watch now for free.
In this webinar, you’ll:
Gain insight into what skills you’ll need to get ahead in the tech field
Learn how to leverage your great idea into the next billion dollar project
Research the next step for your future in coding, whether you want to build apps, games or mobile-driven on-demand services.
Speakers:
Dave Paola, Cofounder & CTO, Bloc
Prasid Pathak, Director of Marketing, Bloc
Moderator: Wendy Schuchart, Analyst, VentureBeat
This webinar is sponsored by Bloc.
"
|
2,560 | 2,021 |
"CircleCI: Making life easier for software engineers speeds up innovation | VentureBeat"
|
"https://venturebeat.com/business/circleci-making-life-easier-for-software-engineers-speeds-up-innovation"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages CircleCI: Making life easier for software engineers speeds up innovation Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Making life easier for software engineers can improve the organization’s bottom line and speed up innovation, a recent report from continuous integration and delivery platform provider CircleCI found.
Above: Developer advocates are already at work in various roles. In May 2021, CircleCI searched LinkedIn and found 117,151 results for “developer experience” in the United States.
In a world that increasingly relies on digital products, the role of the developer is growing in importance within the business matrix. Engineering teams need a leader — a Developer Experience Engineer (DXE) — who ensures developers have the right tools, processes, and environment to maximize productivity and create the greatest business value possible, CircleCI said in its report. There is growing awareness that developer velocity, productivity, and happiness are cornerstones of successful businesses, and that DXEs play an important role on development teams. Without DXE expertise, engineers spend time on maintenance and workflow optimization instead of building, which is less efficient than having a person with centralized authority handle maintenance.
Recent McKinsey research found that businesses that prioritize developer velocity have four to five times the revenue growth of their counterparts.
CircleCI identified six valuable ways a DXE can enhance developer experience — from ensuring developer flow to bringing leadership closer to engineering teams — which ultimately improves business success.
Gain meaningful value from talent.
The average cost of a developer minute in Silicon Valley is about $1.42. That’s every minute a developer is in a seat and the meter is running, and yet somehow organizations are rife with productivity killers.
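That per-minute figure adds up quickly. A back-of-the-envelope calculation, assuming an eight-hour day and roughly 230 working days a year (those two assumptions are ours, not CircleCI's):

```python
COST_PER_MINUTE = 1.42       # dollars, per the CircleCI report
HOURS_PER_DAY = 8            # assumption
WORKING_DAYS_PER_YEAR = 230  # assumption

per_hour = COST_PER_MINUTE * 60
per_year = per_hour * HOURS_PER_DAY * WORKING_DAYS_PER_YEAR
ten_minutes_daily = COST_PER_MINUTE * 10 * WORKING_DAYS_PER_YEAR

print(f"~${per_hour:,.0f} per hour, ~${per_year:,.0f} per developer per year")
print(f"one 10-minute daily interruption: ~${ten_minutes_daily:,.0f} per year")
```

Under those assumptions a single recurring 10-minute interruption costs on the order of $3,000 per developer per year, which is the kind of waste a DXE is meant to hunt down.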
Developers in flow.
Distractions can make or break a developer’s productivity. Everything from email and Slack to the tools developers use to build and test can take a developer out of the flow state — reducing productivity and increasing costs and toil.
Solving interesting problems.
Developers want to work on interesting problems but often the work doesn’t meet this standard. Some of the less cutting-edge work developers are tasked with — updating plugins or investigating and fixing flaky tests — can be reduced or resolved by leveraging the right automation tools — with the expertise and direction of a DXE.
Ensuring work has meaning.
Getting developers closer to the end customer and the challenges their product helps to solve is what connects them to the company mission. Too often, teams can lose sight of their organization’s mission and the value they deliver to their customers. Lifting developers out of daily toil by solving real and difficult challenges, helping them ship quality products faster, helps bring the team closer to the end customer, and highlights how they are helping improve the experiences and lives of their users. Everyone benefits and team satisfaction is boosted.
Bring buying decisions closer to the engineering team.
At many organizations, tools and engineering solutions are largely decided upon by managers, are far removed from the core needs of the developers, and focused on cost rather than value. Tool decisions are made at levels removed from the engineers who use them, at the same time that an abundance of new tooling options are available. Decision paralysis may happen either way but DXEs with their experience and focus can overcome this risk. A DXE can bridge the gap between the top of the organization and developers that are doing the work, offering holistic benefits.
Bring leadership closer to the engineering team.
Measuring and optimizing engineering velocity is the primary goal, as well as the need to capture and report on engineering success and how that maps onto business value. The leadership level benefits from having a context-switching DXE in the engineering department who will translate engineering success into business value.
CircleCI recommends that developer experience managers have these qualifications:
Experience managing software development teams
A deep understanding of modern development practices and tools
Ability to establish team objectives aligned to business goals
Is a process expert that can organize and disseminate information
Ability to make decisions
The report also suggests that a DXE should focus on these business outcomes:
Revenue growth
Improved end-user experience
Increased quality of releases
Engineering team efficiency
The emergence of the DXE as a standard role will unleash the power of developers across every type of organization and in every industry, promising to increase productivity, efficiency, and product quality. For organizations looking to create resilient teams, tools, and infrastructure to combat the next inevitable disruption to the industry: it starts with the DXE.
Read the full Why Developer Experience Engineers are the key to accelerating your business report from CircleCI.
"
|
2,561 | 2,015 |
"Building fintech apps: How to capitalize on the change sweeping the financial sector (webinar) | VentureBeat"
|
"https://venturebeat.com/business/building-fintech-apps-how-to-capitalize-on-the-change-sweeping-the-financial-sector-webinar"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages VB Webinar Building fintech apps: How to capitalize on the change sweeping the financial sector (webinar) Share on Facebook Share on X Share on LinkedIn A rapid transition toward new and disruptive financial technology is well underway and under heavy influence by two major forces: sweeping technological advances as exemplified by wearables and machine learning, and significant generational shifts driven by millenials and now Gen Z.
Technological innovation and, in particular, consumer-related ones like those advancements in mobile and wearable technology, have driven customer expectations for every sector sky-high including fintech. This could pose a significant threat for the unprepared.
In recent years, digital biases of millennials have also radically shifted expectations in the fintech sector. They want financial tools that eliminate friction and allow them to track their money on mobile devices as easily as they can now track each step they take. But that may look tame compared to meeting the growing needs of Gen Z.
Don’t miss out! Register here for free.
In this not-to-be-missed webinar for financial tech entrepreneurs, we’ll look at exactly what’s needed to be part of this change — and to not be left behind.
Certainly, the competitive edge for both fintech startups and financial institutions will be in the art and science of technology — characterized by strong ‘user adaptability’ and ‘intuitiveness.’ Built-in interfaces need to recognize user preferences and behave accordingly. And they need to go beyond just eye-catching charts and graphs and instead provide predictive user insights based on past user behavior.
And to get there, collaboration cannot be underestimated. While digital giants and fintech startups are often depicted as poised to gobble up bank customers, and financial institutions have been commonly considered to be blind and slow, both are finally seeing the mutual benefit of working together.
Join us for an inside look at what it will take to capitalize on this collaboration opportunity.
What you’ll learn:
What’s driving change in fintech
How to prepare for the coming wave of new technology
How to prepare for Generation Z
Why collaboration is the path forward to create killer fintech applications
Speakers:
Josh Gordon-Blake, SVP of Global Partnerships at Pangea
Dr. Richard M. Smith, Founder and CEO of TradeStops
Elizabeth Gunderson, Senior Manager of Customer Success at Yodlee Interactive
"
|
2,562 | 2,015 |
"Breaking into tech: How coding bootcamps lead to today's most coveted jobs (webinar) | VentureBeat"
|
"https://venturebeat.com/business/breaking-into-tech-how-coding-bootcamps-lead-to-todays-most-coveted-jobs-webinar"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages VB Webinar Breaking into tech: How coding bootcamps lead to today’s most coveted jobs (webinar) Share on Facebook Share on X Share on LinkedIn Join us for this live webinar on Thursday, October 15 at 10 a.m. Pacific, 1 p.m. Eastern.
Register here for free.
With consumers constantly searching to satisfy their technological hunger, the pace of the tech boom is unrelenting.
So it’s no surprise there’s an incredibly high demand for qualified programmers. However, despite the potential for meteoric tech career paths, not everyone knows how to get their foot in the door.
That’s starting to change thanks to the emergence of specialized bootcamps dedicated to teaching aspirants how to become competent programmers in as little as six months.
Jordan Collier, who switched jobs from a Chick-fil-A employee to a programmer for Allstate, owes his annual $90,000 income to enrolling in these coding bootcamps. Collier isn’t the only one experiencing success, as a study conducted by Course Report showed 75 percent of graduates found full-time employment with an average salary increase of 44 percent — encouraging results for anyone afraid their hard work and dedication will be for naught in the end.
Enrolling in a coding bootcamp doesn’t just raise your chances of being hired in the tech industry by a significant margin, it also prepares you for the long road ahead as a developer. Whether you develop for web or mobile, you’ll need the necessary hands-on experience and guidance in order to grow as a computer programmer — which is something that’s not easy to grasp if you head down the self-taught route. For an industry that can turn on a dime, there’s no skill more valuable than knowing how to roll with the latest technological advancement.
Join Dave Paola, Cofounder and CTO of Bloc, and Prasid Pathak, Bloc’s Director of Marketing, in this insightful webinar hosted by VentureBeat’s Wendy Schuchart.
This webinar will dive into the benefits of enrolling in a coding bootcamp and why it may be your best chance for entering the tech business, elaborate on online and one-on-one mentor-led programs — and the career support that’s provided — as well as show you how to present your ideas to tech companies so they’ll be embraced and implemented. Soon, the alluring fantasy of becoming a computer programmer will transform from pipe dream to reality.
Don’t miss out! Register here for free.
In this webinar, you’ll:
Gain insight into what skills you’ll need to get ahead in the tech field
Learn how to leverage your great idea into the next billion dollar project
Research the next step for your future in coding, whether you want to build apps, games or mobile-driven on-demand services.
Speakers:
Dave Paola, Cofounder & CTO, Bloc
Prasid Pathak, Director of Marketing, Bloc
Moderator: Wendy Schuchart, Analyst, VentureBeat
This webinar is sponsored by Bloc.
"
|
2,563 | 2,014 |
"Aptible is a one-stop powerhouse of health data-privacy compliance | VentureBeat"
|
"https://venturebeat.com/business/aptible-is-a-one-stop-powerhouse-of-health-data-privacy-compliance"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Aptible is a one-stop powerhouse of health data-privacy compliance Share on Facebook Share on X Share on LinkedIn With hospital chain Community Health Networks becoming a victim of major data theft yesterday, the problem of securing health data is serious.
Healthcare organizations really aren’t very good at setting up strong security against attacks, as the FBI has pointed out. This might also apply to the thousands of digital health companies that will be handling and storing health data in the future.
Aptible is a small three-person startup (one lawyer and two engineers) that helps digital health companies ready themselves to handle sensitive health data that’s entrusted to them by their health provider or health insurer clients.
The company provides an application deployment platform that helps digital health developers build privacy-compliant features into their apps and services. It provides web servers, app servers, databases, load balancers, network security, backups, encryption, and permissions. The developer then uses their choice of development tools to build the core app or service within a framework that ensures that privacy features are built-in and documented.
Digital health startups deal with a lot of anxiety around privacy. First of all, they can’t get a contract with a large medical group or hospital group if they don’t have an airtight plan for protecting health data. For the client, a legitimate concern is that a digital health vendor could compromise or mishandle the data, which could result in a lawsuit in which both parties would be named.
“Protecting the privacy of health care data is a complex undertaking, and it’s mandatory,” Aptible co-founder and chief executive Chas Ballew told VentureBeat. Ballew explained that within platforms or apps that manage health data, there are many different moving parts that can impact security and privacy. “We’re trying to reduce the number of moving parts for developers,” he said.
Aptible’s approach is to address the problem on multiple fronts. Apart from the technology platform, Aptible provides consulting services to clients to address specific issues. It’s also able to impart best practices that apply to all its clients. For instance, Aptible advises clients to have at least two people look at any piece of code before it goes live.
There’s also an insurance element. Aptible itself is covered by a professional liability policy, and it works with an insurer to indemnify its clients against damages from data breaches.
While digital health companies understand the importance of privacy, it’s not their core interest. They want to spend as much time as possible working on the core functions and their product’s defining features. “Every engineer has a limited amount of time to manage security,” Ballew said. “We help them by limiting the amount of things they have to be aware of.” Aptible’s clients pay $3,500 per month on a one-year contract for consulting services and use of the technology platform. Aptible has been in business only for a few months and has already booked $300,000 of recurring revenue contracts, Ballew said.
The startup is currently helping digital health vendors build compliant products, but Ballew said his company is now talking to healthcare providers and insurers to help them secure health data.
Aptible is part of accelerator Y Combinator’s current class, which is graduating today after the Demo Day event in Mountain View. Another accelerator, Rock Health, just announced that it had added Aptible to its portfolio of digital health startups.
"
|
2,564 | 2,013 |
"More collaboration, more standards on deck for big data, Pivotal exec says | VentureBeat"
|
"https://venturebeat.com/big-data/more-collaboration-more-standards-on-deck-for-big-data-pivotal-exec-says"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages More collaboration, more standards on deck for big data, Pivotal exec says Share on Facebook Share on X Share on LinkedIn Scott Yara, the senior vice president of products at Pivotal, speaks at the 2013 DataBeat/Data Science Summit.
REDWOOD CITY, Calif. — Pivotal executive Scott Yara takes heart from companies laying out billions for big data and analytics technologies.
“I haven’t seen a healthier funding environment for these kinds of technologies ever, which is really neat to see,” Yara, Pivotal’s senior vice president of products, said in his keynote at VentureBeat’s DataBeat/Data Science Summit event today.
The glory days of big data are here, and we’re perhaps a decade into the first wave of the resurgence of data and analytics, said Yara, a founder of Greenplum, a big data analytics company that EMC acquired in 2010.
Yara is looking forward to the future of data science. He has a few predictions of trends that he expects to take hold in the near future: Strong standards should emerge. “The role of standards have kind of come and gone over the last 20 years,” Yara said, mentioning the IEEE and W3C as examples of governing bodies. Standards are needed, particularly as more companies are commercializing open-source technologies. Hadoop distribution vendors, hosted database providers, and even Yara’s own company deal in this sort of strategy. “Our most critical technology investments are all based on open-source technologies,” Yara said. Standards can help make sure that anyone can contribute to open-source projects, Yara said.
Collaboration across firms, countries, and other groups will become more important. That way, there will be less unnecessary duplication of effort from employees at many companies collecting data in silos. “We think collaboration is going to be this huge new opportunity for all of us around data,” he said. “There needs to be a common point to sort of collaborate around data.” At the same time, Yara said he’s interested in seeing how collaboration can become more popular without sacrificing privacy.
Software will become even more critical. Pivotal executives have been vocal on how data and analytics can help developers build better applications, and Yara showed his belief in this during his keynote. Data science and applications are “really hitched together forever,” he said. Reworking applications based on insights that come from data is part of what Yara called “a closed loop of innovation” — just the sort of thing that resounds with enterprises that want to move as fast as startups. The faster companies cycle through application development, collection of data, and analysis of data, the faster innovation, disruption, and problem solving can happen, he said.
Overall, Yara foresees big data playing a larger role in business technology.
“It’s a very, very unsettling time for most enterprise IT organizations,” he said. “They know that they’re under huge change. … How do you innovate while maintaining cost controls actually, acquiring talent, know-how, and methodologies, where you can do all these things?” Those are hard problems, but these are problems Pivotal is trying to solve.
"
|
2,565 | 2,022 |
"Bunnyshell raises $4M to auto-emulate production environments for developers | VentureBeat"
|
"https://venturebeat.com/automation/bunnyshell-raises-4m-to-auto-emulate-production-environments-for-developers"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Bunnyshell raises $4M to auto-emulate production environments for developers Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Romanian startup Bunnyshell , which offers an environment-as-a-service (EaaS) platform, today announced it has raised $4 million in funding to simplify development challenges for enterprises.
For years, developers building cloud-native projects have struggled with testing and staging roadblocks. They have to spend hours creating environments that emulate complex production setups, then spend even more time keeping those environments updated and in sync to avoid defects that create bottlenecks when merging and refactoring code. Even a slight variance between the replicas and production can cause drift and slow the development cycle.
Bunnyshell automatically spins production replicas Founded in 2018, Bunnyshell solves this challenge by automatically spinning up new always-updated, ephemeral environments for development, testing, demo or the deployment of applications in customers’ clouds. This can be anything from the simplest static websites to the most complex applications built using microservices with many cloud-native dependencies.
“With every pull request, Bunnyshell automatically creates an environment. Everything is parameterized so developers have pointers to that environment and can run manual or automated testing. We then automatically delete the environment when it’s not needed. We also auto-update the environment whenever there’s a change so the developer doesn’t have to do anything,” Alin Dobra, cofounder and CEO of Bunnyshell, told VentureBeat.
The company claims that the solution can create production replicas no matter how complex the architecture is or what services are being used. All a developer has to do is connect their repository and cloud provider; the platform automatically discovers the services, both internal and external, and creates the development and staging environments needed. It then tracks source code changes and triggers defined at the project level to automatically update existing environments or build new ephemeral environments for every pull request.
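Conceptually, the flow resembles a webhook handler reacting to pull-request events. The sketch below is a plain-Python illustration of that lifecycle; the event shape and helper logic are invented for illustration and are not Bunnyshell's actual API.

```python
# Conceptual sketch of a pull-request-driven ephemeral environment lifecycle.
# The event payload shape and environment naming are hypothetical.

environments = {}  # pr_number -> environment name

def handle_pull_request_event(event: dict) -> None:
    pr = event["number"]
    action = event["action"]
    if action == "opened":
        # Spin up an ephemeral environment for this pull request.
        environments[pr] = f"pr-{pr}-preview"
        print(f"created environment {environments[pr]}")
    elif action == "synchronize" and pr in environments:
        # New commits pushed: keep the environment in sync with the branch.
        print(f"updated environment {environments[pr]}")
    elif action == "closed" and pr in environments:
        # PR merged or abandoned: tear the environment down to save cost.
        print(f"deleted environment {environments.pop(pr)}")

# Example: a PR is opened, updated, then closed.
for evt in ({"number": 42, "action": "opened"},
            {"number": 42, "action": "synchronize"},
            {"number": 42, "action": "closed"}):
    handle_pull_request_event(evt)
```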
“…we use multiple cloud providers such as AWS , Azure, GCP and Digital Ocean and integrate with multiple cloud services for both our production and staging environments. We leverage the power of multiple CI/CD providers that manage our Terraform modules, Helm charts, and deployment on Kubernetes,” Dobra explained.
This enables developers to build and test their code as close to production as possible and to deliver higher-quality code faster, with more frequent releases and fewer rollbacks.
Growth In the last year alone, Bunnyshell claims to have grown 700% with “dozens” of customers coming in from different countries.
“Environments-as-a-service is a new category so most of our customers were not even looking for a platform to help them release faster but were considering building internal tools that consume a lot of resources and time. The challenge is that it can take a bunch of DevOps engineers and a few months to achieve what they can do with Bunnyshell in days,” the CEO noted.
While not many players directly go against the company, many are racing to accelerate development and update cycles for enterprises in their own way. Just recently, Altogic , which handles backend development, and Zipy , focusing on improving debugging tactics, both raised seed funding. Meanwhile, California-based software reliability platform Last9 raised $11 million in series A funding.
With this round of funding, which was led by Early Game Ventures (EGV) and RocaX Ventures, Bunnyshell will focus on expanding its footprint in the U.S. market, scaling its sales and marketing teams and introducing critical product improvements.
“Security is a priority for us and we are already in the process of getting SOC2 certification. We also plan on allowing users to granularly define role-based access for each member of their teams and also integrate their private Kubernetes clusters with Bunnyshell while remaining compliant with their security certifications,” Dobra said.
"
|
2,566 | 2,022 |
"How Altogic simplifies app development for enterprises | VentureBeat"
|
"https://venturebeat.com/apps/how-altogic-simplifies-app-development-for-enterprises"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages How Altogic simplifies app development for enterprises Share on Facebook Share on X Share on LinkedIn Instanbul headquartered Altogic has raised $1 million in seed funding to help enterprises build and deploy mobile/web applications faster.
Every company today has the ability to build applications for a range of business-to-business (B2B) and business-to-consumer (B2C) use cases. The technologies have evolved continuously, but even with all the novel capabilities in hand, the task of building a production-ready and scalable app remains long, complex, and expensive. Developers have to deal with many complexities on the backend and frontend and often end up delivering their projects late or half-baked — much to the dissatisfaction of their company and its customers.
Altogic automates backend development To tackle this challenge and improve enterprises’ time to market, Altogic provides a backend-as-a-service platform that handles key tasks associated with the backend infrastructure of an application.
“With Altogic, we provide the set of pre-integrated tools and cloud infrastructure that remove a considerable amount of mundane and repetitive tasks from developers, help them start building products in minutes and deploy them in seconds. Plus, with its no-code capabilities, the solution allows people without a background in programming to develop backend aspects,” Umit Cakmak, the founder and CEO of the platform, told VentureBeat.
The backend of an application includes several elements, starting from the app server, database and cache to business logic, job execution and session management. Altogic handles a majority of these through a three-step process.
“You first define the data model of your application. The data model defines what will be your key data entities in your app database, what kind of data fields will each entity hold, how these data entities will be related to each other, and finally, what will be the validation rules to run on input data before committing them to your app database. Then, you create your application endpoints (e.g., RESTful API endpoint) and link each endpoint with a cloud function (aka service),” Cakmak explained.
In Altogic, endpoints are the communication channels to access the cloud functions of applications and are responsible for exposing application services and data to the outside world. Cloud functions, meanwhile, are defined graphically using nodes, which are the basic service execution units that perform actions on input data and create output.
Once these steps are completed, the developer just has to create their execution environment and deploy the app.
“An environment is a space where your application data is stored and managed and your application’s RESTful endpoints are called. In Altogic, the application designs that you have created in steps one and two are all versioned through snapshots. After creating an environment, you deploy a snapshot of your app to the execution environment. You can have several execution environments (e.g., development, test, production) and deploy different snapshots of your app design to these environments. At this stage, you can integrate your Altogic backend to your frontend app using Altogic’s client API or using any HTTP client library (e.g., axios, fetch),” the CEO added.
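As a rough sketch of that last integration step, here is how a client might call one such RESTful endpoint with a generic HTTP library. The base URL, endpoint path and header are hypothetical placeholders, not Altogic's documented API.

```python
# Minimal sketch of calling a deployed backend endpoint from a client app.
# BASE_URL, the /tasks path and the auth header are invented for illustration.
import requests

BASE_URL = "https://example-app.example-backend.dev"  # hypothetical deployment URL
API_KEY = "your-client-key-here"                      # hypothetical client key

def create_task(title: str, done: bool = False) -> dict:
    """POST to a hypothetical /tasks endpoint exposed by the deployed backend."""
    resp = requests.post(
        f"{BASE_URL}/tasks",
        json={"title": title, "done": done},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

# Usage (against a real deployment this would return the stored record):
# print(create_task("Write launch blog post"))
```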
Competition Since launching the platform in beta, Altogic has roped in about 500 developers, both enterprises and freelancers, to build apps using the platform and guide its development. However, they are not the only ones in this space. Google’s Firebase , Amazon’s Amplify and open source alternatives such as Supabase, AppWrite , and Nhost are looking to simplify app development for enterprises.
Cakmak, however, says their product stands out from the crowd as it makes coding optional for developers and gives them a way to develop apps graphically.
“Developers can use built-in or marketplace nodes or even create their own custom nodes and connect these nodes with connectors to define their cloud functions through simple drag & drop operations. This approach brings the best of both worlds, the speed of no-code to quickly develop business logic and integrations and the flexibility of coding to solve complex problems,” he said.
With the fresh round of funding, led by ScaleX Ventures, Altogic plans to grow its engineering team and accelerate product development to make its solution generally available to developers worldwide.
“We will soon release two new products to further enhance the developer experience and add real-time capabilities to the platform so that our users can develop near real-time apps using WebSockets. We will also expand our cloud infrastructure to new regions,” Cakmak said. Globally, the backend-as-a-service space is expected to grow from $1.6 billion in 2020 to nearly $8 billion by 2027.
"
|
2,567 | 2,021 |
"What OpenAI and GitHub’s 'AI pair programmer' means for the software industry | VentureBeat"
|
"https://venturebeat.com/ai/what-openai-and-githubs-ai-pair-programmer-means-for-the-software-industry"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages What OpenAI and GitHub’s ‘AI pair programmer’ means for the software industry Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
OpenAI has once again made the headlines, this time with Copilot , an AI-powered programming tool jointly built with GitHub. Built on top of GPT-3 , OpenAI’s famous language model, Copilot is an autocomplete tool that provides relevant (and sometimes lengthy) suggestions as you write code.
Copilot is currently available to select applicants as an extension in Visual Studio Code, the flagship programming tool of Microsoft, GitHub’s parent company.
While the AI-powered code generator is still a work in progress, it provides some interesting hints about the business of large language models and the future directions of the software industry.
Not the intended use for GPT-3 The official website of Copilot describes it as an “AI pair programmer” that suggests “whole lines or entire functions right inside your editor.” Sometimes, just providing a function signature or description is enough to generate an entire block of code.
Meet GitHub Copilot – your AI pair programmer.
https://t.co/eWPueAXTFt pic.twitter.com/NPua5K2vFS — GitHub (@github) June 29, 2021 Working behind Copilot is a deep learning model called Codex, which is basically a special version of GPT-3 finetuned for programming tasks. The tool works much like GPT-3: it takes a prompt as input and generates a sequence of tokens as output. Here, the prompt (or context) is the source code file you’re working on, and the output is the code suggestion you receive.
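The prompt-in, continuation-out mechanic is easy to demonstrate with any autoregressive language model. The sketch below uses the small open gpt2 model via Hugging Face's transformers library purely as a stand-in; it is not Codex or Copilot, but the basic loop is the same idea.

```python
# Illustration of prompt -> continuation with a small open model ("gpt2")
# standing in for a code-tuned model like Codex.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = 'def fibonacci(n):\n    """Return the n-th Fibonacci number."""\n'
completion = generator(prompt, max_new_tokens=40, do_sample=False)

# The model simply continues the text of the prompt; whatever follows is the
# "suggestion" an editor plugin would surface to the developer.
print(completion[0]["generated_text"])
```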
What’s interesting in all of this are the unexpected turns AI product management can take. According to CNBC: “…back when OpenAI was first training [GPT-3], the start-up had no intention of teaching it how to help code, [OpenAI CTO Greg] Brockman said. It was meant more as a general purpose language model [emphasis mine] that could, for instance, generate articles, fix incorrect grammar and translate from one language into another.” General-purpose language applications have proven to be very hard to nail.
There are many intricacies involved when applying natural language processing to broad environments. Humans tend to use a lot of abstractions and shortcuts in day-to-day language. The meaning of words, phrases, and sentences can vary based on shared sensory experience, work environment, prior knowledge, etc. These nuances are hard to grasp with deep learning models that have been trained to grasp the statistical regularities of a very large dataset of anything and everything.
In contrast, language models perform well when they’re provided with the right context and their application is narrowed down to a single or a few related tasks. For example, deep learning–powered chatbots trained or finetuned on a large corpus of customer chats can be a decent complement to customer service agents, taking on the bulk of simple interactions with customers and leaving complicated requests to human operators. There are already plenty of special-purpose deep learning models for different language tasks.
Therefore, it’s not very surprising that the first applications for GPT-3 have been something other than general-purpose language tasks.
Using language models for coding Shortly after GPT-3 was made available through a beta web application programming interface, many users posted examples of using the language model to generate source code. These experiments displayed an unexplored side of GPT-3 and a potential use case for the large language model.
AI INCEPTION! I just used GPT-3 to generate code for a machine learning model, just by describing the dataset and required output.
This is the start of no-code AI.
pic.twitter.com/AWX5mZB6SK — Matt Shumer (@mattshumer_) July 25, 2020 And interestingly, the first two applications that Microsoft, the exclusive license holder of OpenAI’s language models , created on top of GPT-3 are related to computer programming. In May, Microsoft announced a GPT-3-powered tool that generates queries for its Power Apps.
And now, it is testing the waters with Copilot.
Neural networks are very good at finding and suggesting patterns from large training datasets. In this light, it makes sense to use GPT-3 or a finetuned version of it to help programmers find solutions in the very large corpus of publicly available source code in GitHub.
According to Copilot’s homepage, Codex has been trained on “a selection of English language and source code from publicly available sources, including code in public repositories on GitHub.” If you provide it with the right context, it will be able to come up with a block of code that resembles what other programmers have written to solve a similar problem. And giving it more detailed comments and descriptions will improve your chances of getting a reasonable output from Copilot.
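The prompt is just the surrounding file text, so richer comments and signatures give the model more context to condition on. Both prompts below are invented illustrations of that point, not real Copilot inputs or outputs.

```python
# A vague prompt gives the model almost nothing to work with.
vague_prompt = "def parse(s):\n"

# A detailed comment plus a typed signature constrains the completion far more,
# which is why detailed descriptions tend to produce more useful suggestions.
detailed_prompt = (
    "# Parse a date string like '2021-06-29' and return (year, month, day)\n"
    "# as integers. Raise ValueError on malformed input.\n"
    "def parse_iso_date(s: str) -> tuple[int, int, int]:\n"
)
```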
? I just got access to @github Copilot and it's super amazing!!! This is going to save me so much time!! Check out the short video below! #GitHubCopilot I think I'll spend more time writing function descriptions now than the code itself :D pic.twitter.com/HKXJVtGffm — abhishek (@abhi1thakur) June 30, 2021 Generating code vs understanding software According to the website, “GitHub Copilot tries to understand [emphasis mine] your intent and to generate the best code it can, but the code it suggests may not always work, or even make sense.” “Understand” might be the wrong word here. Language models such as GPT-3 do not understand the purpose and structure of source code. They don’t understand the purpose of programs. They can’t come up with new ideas, break down a problem into smaller components, and design and build an application in the way that human software engineers do.
By human standards, programming is a relatively difficult task (well, it used to be when I was learning in the 90s). It requires careful thinking, logic, and architecture design to solve a specific problem. Each language has its own paradigms and programming patterns. Developers must learn to use different application programming interfaces and plug them together in an efficient way. In short, it’s a skill that is largely dependent on symbol manipulation, an area that is not the forte of deep learning algorithms.
Copilot’s creators acknowledge that their AI system is in no way a perfect programming companion (I don’t even think “pair programming” is the right term for it). “GitHub Copilot doesn’t actually test the code it suggests, so the code may not even compile or run,” they warn.
GitHub also warns that Copilot may suggest “old or deprecated uses of libraries and languages,” which can cause security issues. This makes it extremely important for developers to review the AI-generated code thoroughly.
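As a hypothetical illustration of the kind of issue such a review should catch, compare a snippet that hard-codes a secret with a corrected version that reads it from configuration. Neither snippet is actual Copilot output.

```python
# Pattern to reject in review: the credential is committed with the source code.
import os

DB_PASSWORD_RISKY = "hunter2"  # hard-coded secret: anyone with repo access can read it

# Reviewed version: the secret is injected at deploy time and never committed.
DB_PASSWORD = os.environ.get("DB_PASSWORD", "")

def connection_string(host: str, user: str) -> str:
    """Build a database DSN using the environment-provided password."""
    return f"postgresql://{user}:{DB_PASSWORD}@{host}:5432/app"
```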
How important is @github and @OpenAI 's new Copilot AI-powered code generator, and how does it fit in with modern NLP? ? A thread… pic.twitter.com/IyvuUb5hMU — Dale Markowitz ? (@dalequark) June 30, 2021 So, we’re not at a stage to expect AI systems to automate programming. But pairing them with humans who know what they’re doing can surely improve productivity, as Copilot’s creators suggest.
And since Copilot was released to the public, developers have posted all kinds of examples ranging from amusing to really useful.
“If you know a bit about what you’re asking Copilot to code for you, and you have enough experience to clean up the code and fix the errors that it introduces, it can be very useful and save you time,” Matt Shumer, co-founder and CEO of OthersideAI, told TechTalks.
#GitHubCopilot can write SQL for you! Watch it write a function to get all admin users in a database.
This is going to save me SO. MUCH. TIME.
@OpenAI @github pic.twitter.com/Lt2KDVCWnk — Matt Shumer (@mattshumer_) June 30, 2021 But Shumer also warns about the threats of blindly trusting the code generated by Copilot.
“For example, it saved me time writing SQL code, but it put the database password directly in the code,” Shumer said. “If I wasn’t experienced, I might accept that and leave it in the code, which would create security issues. But because I knew how to modify the code, I was able to use what Copilot gave me as a starting point to work off of.” The business model of Copilot In my opinion, there’s another reason for which Microsoft started out with programming as the first application for GPT-3. There’s a huge opportunity to cut costs and make profits.
According to GitHub, “If the technical preview is successful, our plan is to build a commercial version of GitHub Copilot in the future.” There’s still no information on how much the official Copilot will cost. But hourly wages for programming talent start at around $30 and can reach as high as $150. Even saving a few hours of programming time or giving a small boost to development speed would probably be enough to cover the costs of Copilot. Therefore, it would not be surprising if many developers and software development companies would sign up for Copilot once it is released as a commercial product.
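As a back-of-the-envelope check on that argument, using a purely hypothetical subscription price (no official pricing had been announced), the break-even point is a fraction of an hour per month:

```python
# Break-even arithmetic for the cost argument above.
hourly_rate = 30.0            # low end of the wages cited above, in USD
assumed_monthly_price = 20.0  # hypothetical subscription cost, not an announced price

hours_to_break_even = assumed_monthly_price / hourly_rate
print(f"Hours saved per month to break even: {hours_to_break_even:.1f}")
```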
“If it gives me back even 10 percent of my time, I’d say it’s worth the cost. Within reason, of course,” Shumer said.
Language models like GPT-3 require extensive resources to train and run. And they also need to be regularly updated and finetuned, which imposes more expenses on the company hosting the machine learning model. Therefore, high-cost domains such as software development would be a good place to start to reduce the time to recoup the investment made on the technology.
“The ability for [Copilot] to help me use libraries and frameworks I’ve never used before is extremely valuable,” Shumer said. “In one of my demos, for example, I asked it to generate a dashboard with Streamlit, and it did it perfectly in one try. I could then go and modify that dashboard, without needing to read through any documentation. That alone is valuable enough for me to pay for it.” With #GitHubCopilot , you can generate a functional dashboard just by telling the AI what it should include! One comment -> dashboard! cc @OpenAI @github pic.twitter.com/1JECh6F1nb — Matt Shumer (@mattshumer_) June 30, 2021 Automated coding can turn out to be a multi-billion-dollar industry. And Microsoft is positioning itself to take a leading role in this nascent sector, thanks to its market reach (through Visual Studio, Azure, and GitHub), deep pockets, and exclusive access to OpenAI’s technology and talent.
The future of automated coding Developers must be careful not to mistake Copilot and other AI-powered code generators for a programming companion whose every suggestion you accept. As a programmer who has worked under tight deadlines on several occasions, I know that developers tend to cut corners when they’re running out of time (I’ve done it more than a few times). And if you have a tool that gives you a big chunk of working code in one fell swoop, you’re prone to just skim over it if you’re short on time.
On the other hand, adversaries might find ways to track vulnerable coding patterns in deep learning code generators and find new attack vectors against AI-generated software.
New coding tools create new habits (many of them negative and insecure). We must carefully explore this new space and beware the possible tradeoffs of having AI agents as our new coding partners.
Ben Dickson is a software engineer and the founder of TechTalks, a blog that explores the ways technology is solving and creating problems.
"
|
2,568 | 2,023 |
"TruEra launches free tool for testing LLM apps for hallucinations | VentureBeat"
|
"https://venturebeat.com/ai/truera-launches-free-tool-for-testing-llm-apps-for-hallucinations"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages TruEra launches free tool for testing LLM apps for hallucinations Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
TruEra, a vendor providing tools to test, debug and monitor machine learning (ML) models, today expanded its product portfolio with the launch of TruLens, open-source software dedicated to testing applications built on large language models (LLMs) like the GPT series.
Available starting today for free, TruLens provides enterprises with a quick and easy way to evaluate and iterate on their LLM applications and reduce the risk of hallucination and bias in production.
Currently, only a limited number of vendors offer tools to tackle this aspect of LLM app development, even as enterprises across sectors continue to explore the potential of generative AI for different use cases.
Why TruLens for LLM applications? LLMs are all the rage, but when it comes to building applications based on these models, companies have to go through a tiring experimentation process that involves human-driven response scoring. Essentially, once the first version of an app is developed, teams have to manually test and review its answers, adjust prompts, hyperparameters and models, and then re-test over and over until a satisfactory result is achieved.
This not only takes a lot of time but is difficult to scale up.
With TruLens, TruEra is addressing this gap by introducing a programmatic method of evaluation called “feedback functions.” As the company explains, a feedback function scores the output of an LLM application for quality and efficacy by analyzing both the text generated from the LLM and the response’s metadata.
“Think of it as a way to log and assess direct and indirect feedback about the performance and quality of your LLM app. This helps developers to create credible and powerful LLM apps faster. You can use it for a wide variety of LLM use cases, like chatbot question answering, information retrieval and so on,” Anupam Datta, cofounder, president and chief scientist at TruEra, told VentureBeat.
TruLens can be added to the development process with a few lines of code. Once it’s up and running, users can create their own feedback functions — customized to specific use cases — or use the out-of-the-box options.
Currently, the software provides feedback functions that test for truthfulness, question-answering relevance, harmful or toxic language, user sentiment, language mismatch, response verbosity, and fairness and bias. Moreover, it also logs how much an LLM is being pinged within the app, giving an easy way to track usage costs.
“This helps you to also determine how to build the best version of the app at the lowest ongoing cost. All of those pings add up,” Datta noted.
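For intuition, here is a stripped-down sketch of what a feedback function amounts to conceptually: a callable that scores a logged prompt/response pair, run over recorded interactions. The function names and scoring heuristics below are invented for illustration and are not the TruLens API.

```python
# Conceptual sketch of "feedback functions" over logged LLM interactions.
from typing import Callable, Dict, List

FeedbackFn = Callable[[str, str], float]  # (prompt, response) -> score in [0, 1]

def response_relevance(prompt: str, response: str) -> float:
    """Toy relevance check: fraction of prompt keywords echoed in the response."""
    keywords = {w.lower() for w in prompt.split() if len(w) > 3}
    if not keywords:
        return 1.0
    hits = sum(1 for w in keywords if w in response.lower())
    return hits / len(keywords)

def verbosity_penalty(prompt: str, response: str) -> float:
    """Toy verbosity check: shorter answers score closer to 1.0."""
    return max(0.0, 1.0 - len(response.split()) / 200)

def evaluate(records: List[Dict[str, str]], feedbacks: Dict[str, FeedbackFn]) -> List[Dict]:
    """Attach a score from every feedback function to each logged interaction."""
    return [
        {**rec, "scores": {name: fn(rec["prompt"], rec["response"]) for name, fn in feedbacks.items()}}
        for rec in records
    ]

log = [{"prompt": "Summarize our refund policy", "response": "Refunds are issued within 14 days."}]
print(evaluate(log, {"relevance": response_relevance, "verbosity": verbosity_penalty}))
```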
Other offerings for LLM applications While testing LLM-driven applications for performance and response accuracy is the need of the hour, only a handful of players have launched solutions to deal with it. These include Datadog’s OpenAI model monitoring integration, Arize’s Phoenix solution, and Israel-based Mona Labs’ just-launched generative AI monitoring solution.
TruEra, for its part, claims that TruLens is best used in the development phase of LLM app development.
“This is actually the phase that most companies are in today — they are experimenting with development and really have an acute need for tools to help them iterate faster and home in on application versions that are both effective at their tasks and risk-minimizing. You can, of course, use it on both development and production models,” Datta said.
According to an Accenture survey , 98% of global executives agree that AI foundation models will play an important role in their organizations’ strategies in the next three to five years. This signals that tools like TruLens will soon see increased demand from enterprises.
"
|
2,569 | 2,021 |
"Formstack: 82% of people don't know what 'no-code' means | VentureBeat"
|
"https://venturebeat.com/ai/formstack-82-of-people-dont-know-what-no-code-means"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Formstack: 82% of people don’t know what ‘no-code’ means Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Despite massive growth in no-code, only 18% of people are familiar with “no-code,” according to the Rise of the No-Code Economy report produced by Formstack, a no-code workplace productivity platform.
Above: People use no-code tools because they make it easier and faster to develop applications.
If you haven’t yet heard of no-code, you’re not alone. Despite the massive growth in no-code over the last year, 82% of people are unfamiliar with the term “no-code.” That’s according to the recent report produced by Formstack, a no-code workplace productivity platform. Yet, the survey also found that 66% of respondents have adopted no-code tools in the last year, and 41% in the last six months—suggesting developer shortages and the pandemic have driven growth across the industry.
No-code is a type of software development that allows anyone to create digital applications—from building mobile, voice, or ecommerce apps and websites to automating any number of tasks or processes—without writing a single line of code. It involves using tools with an intuitive, drag-and-drop interface to create a unique solution to a problem. Alternatively, low-code minimizes the amount of hand-coding needed, yet it still requires technical skills and some knowledge of code.
The report showed that no-code adoption and awareness is still relatively limited to the developer and tech community, as 81% of those who work in IT are familiar with the idea of no-code tools. But despite current adoption rates among those outside of IT, the report shows huge potential for non-technical employees to adopt no-code. Formstack’s survey found that 33% of respondents across industries expect to increase no-code usage over the next year, with 45% in computer & electronics manufacturing, 46% in banking and finance, and 71% in software.
In addition to growing adoption rates, many respondents believe their organizations and industries will begin hiring for no-code roles within the next year, signaling that the growing no-code economy will create demand in an entirely new industry segment.
The survey polled 1,000 workers across various industries, job levels, and company sizes to gather comprehensive data on the understanding, usage, and growth of no-code. External sources and data points are included throughout the report to further define and explain the no-code economy.
Read Formstack’s full Rise of the No-Code Economy.
"
|
2,570 | 2,023 |
"Deepset raises $30M to help enterprises unlock the value of LLMs | VentureBeat"
|
"https://venturebeat.com/ai/deepset-raises-30m-to-help-enterprises-unlock-the-value-of-llms"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Deepset raises $30M to help enterprises unlock the value of LLMs Share on Facebook Share on X Share on LinkedIn Credit: VentureBeat made with Midjourney Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Germany-based Deepset , a startup that helps enterprises unlock the value of large language models (LLMs) in their workflows, today announced $30 million in a fresh round of funding. The company said it will use the capital to further develop its commercial offering, deepset Cloud, with new capabilities, including optimizations for virtual private cloud (VPC) setups and a focus on the observability side of things.
The investment has been led by Balderton Capital with participation from existing company investors GV (Google Ventures), Harpoon, System.One and Lunar. It takes the total capital raised by Deepset to $46 million.
Previous backers of the company include Snorkel AI’s Alex Ratner, Deepmind’s Mustafa Suleyman, Cockroach Labs’ Spencer Kimball, Cloudera’s Jeff Hammerbacher and Neo4j’s Emil Eifrem.
The development comes at a time when companies across sectors and around the world are looking to leverage the power of large language models in their internal systems to better grapple with the challenge of growing data volumes and make their teams function more efficiently.
How does Deepset provide LLM support? By 2025, the global datasphere is expected to grow to 163 zettabytes.
Of this, the data from enterprise systems will be nearly 60%, meaning teams will have a mountain of data to deal with. This will make searching, retrieving, summarizing and analyzing relevant, work-related information quite a task.
Now, while large language models can be trained to develop linguistic intuition and semantics, using them to address this challenge in business applications can be daunting, especially for non-AI companies. This is where Deepset comes in.
The company offers an open-source framework — dubbed Haystack — that allows developers to choose components required for a modern NLP project, from proprietary and open-source LLMs and vector databases to file converters and text embedding models. Once the components are picked, the framework plugs them into pipelines or agents to build LLM-driven applications.
Such an application could be anything from a Google-like search engine for company documents to conversational AI to a powerful internal helpdesk.
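To make the component-and-pipeline idea concrete, here is a rough sketch in the style of Haystack's 1.x Python API; exact class names, parameters and defaults vary by version, so treat this as illustrative rather than canonical.

```python
# Rough sketch of an extractive question-answering pipeline in the style of
# Haystack 1.x. Class names and parameters vary by version.
from haystack.document_stores import InMemoryDocumentStore
from haystack.nodes import BM25Retriever, FARMReader
from haystack.pipelines import ExtractiveQAPipeline

document_store = InMemoryDocumentStore(use_bm25=True)
document_store.write_documents([
    {"content": "The engine start checklist is described in section 3.2 of the manual."},
    {"content": "Operation guidelines are updated quarterly by the flight operations team."},
])

retriever = BM25Retriever(document_store=document_store)               # fetch candidate documents
reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2")  # extract the answer span

pipeline = ExtractiveQAPipeline(reader, retriever)
result = pipeline.run(
    query="Where is the engine start checklist?",
    params={"Retriever": {"top_k": 3}, "Reader": {"top_k": 1}},
)
print(result["answers"][0].answer)
```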
Deepset started off five years ago with bespoke NLP solutions and followed those efforts with the launch of Haystack in 2019. Last year, it expanded its portfolio with deepset Cloud, a model-agnostic, SOC 2-certified cloud platform that lets AI teams build customized and flexible LLM systems while retaining full ownership of their data. This is the company’s commercial offering.
As the company explains, deepset Cloud covers the entire life cycle of a modern NLP application — experimentation, production and observability — while making it easy to compare and exchange different language models. It gives stakeholders a unified environment to work with fast prototyping, frequent feedback cycles and easy customization.
“Enterprises can see huge benefits from leveraging LLM technology. At Deepset, we’re providing a platform that helps to bridge the decades of research in machine learning and computer science into production-ready applications. In the same way you don’t need to know much about microchip architecture to write software, you don’t need to be an NLP or LLM scientific researcher to use our Haystack framework and deepset Cloud,” Milos Rusic, cofounder of Deepset, said in a statement.
The 50-strong company works with enterprises across the U.K., Europe and the U.S. Legal publishing house Manz, for example, was able to use deepset Cloud for developing LLM-enabled products aimed at helping find precedents, relevant regulations, templates and more from millions of documents.
Meanwhile, aircraft maker Airbus’s R&D team is using Haystack to build an application that helps pilots discover and use the most relevant aircraft operation guidelines right from the cockpit. The open-source framework has seen a 250% increase in active users, according to the company.
Plan to do more With this round of funding, the company plans to continue international expansion and develop its products with new capabilities.
“We’ll do this by refining deepset Cloud for RAG (retrieval-augmented generation) applications: in particular, improving the evaluation of every component in a RAG pipeline. We’ll also focus on making the platform viable for customers with heavy privacy constraints by optimizing for virtual private cloud (VPC) setups,” the company said in a blog post.
In addition, it will focus on diversifying and improving LLM observability in deepset Cloud, giving customers confidence in the performance of their LLM applications in product environments, the blog post said.
"
|
2,571 | 2,022 |
"How CrowdStrike consolidates tech stacks as a growth strategy | VentureBeat"
|
"https://venturebeat.com/security/how-crowdstrike-consolidates-tech-stacks-as-a-growth-strategy"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages How CrowdStrike consolidates tech stacks as a growth strategy Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Driving tech stack consolidation by broadening the CrowdStrike Falcon platform is a proven strategy for driving growth, with Fal.Con 2022 proving to be an inflection point. Four new product announcements stand out as core to CrowdStrike’s strategy. They include expanding cloud-native application protection platform (CNAPP) capabilities for CrowdStrike Cloud Security , including cloud infrastructure entitlement management (CIEM) and integration of the CrowdStrike Asset Graph; Falcon Insight XDR ; Falcon Complete LogScale ; and Falcon Discover for IoT.
96% of CISOs plan to consolidate their security platforms, with 63% saying extended detection and response (XDR) is their top solution choice.
Cynet’s 2022 survey of CISOs found that nearly all CISOs have consolidation on their roadmaps, up from 61% in 2021. CISOs believe consolidating their tech stacks will help them avoid missing threats (57%) and reduce the need to find qualified security specialists (56%), while streamlining the process of correlating and visualizing findings across their threat landscape (46%).
Gartner predicts that by 2025 [subscription required], 50% of midmarket security buyers will rely on XDR to accelerate the consolidation of workspace security technologies, including endpoint, cloud application and identity security.
XDR is a consolidation engine During his keynote, George Kurtz, CrowdStrike’s cofounder and CEO, provided insights into why XDR is such a high priority for its platform. He said, “80% of the security data you get the most value from [are] the endpoints and the workloads. That’s really where the attacks are. Yes, they happen across the network and other infrastructure. But the reality is [that] people are exploiting endpoints and workload.” Ingesting and managing security data needs to start with a focused, intentional purpose, a point Kurtz made several times during his keynote. XDR’s core value is providing an integrated platform of threat detection, incident response and remediation with real-time monitoring and visibility of cloud platforms, apps, endpoints and networks, including remote sensors.
During his keynote, Kurtz defined XDR as being “built on the foundation of endpoint detection and response (EDR). XDR extends enterprise-wide visibility across all key security domains (native and third-party) to speed and simplify real-time detection, investigation and response for the most sophisticated attacks.” XDR is so core to the future of CrowdStrike that every keynote provided a glimpse of how and where it will be designed to deliver value. “We’re excited that we can democratize XDR for all of our customers,” Kurtz said during his keynote.
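As a concrete illustration of what extending visibility across security domains means at the data level, the sketch below normalizes events from several sources into one shared shape and escalates entities that fire in more than one domain. The sources, field names and threshold are invented for the example; this is not CrowdStrike's schema or detection logic.

```python
# Illustrative sketch of the core XDR idea: normalize telemetry from several
# security domains into one shape, then correlate by entity. The sources,
# field names and threshold are invented; this is not CrowdStrike's schema.
from collections import defaultdict

raw_events = [
    {"source": "endpoint", "host": "srv-01", "user": "jdoe", "signal": "suspicious_process"},
    {"source": "identity", "host": "srv-01", "user": "jdoe", "signal": "impossible_travel_login"},
    {"source": "cloud", "host": "srv-01", "user": "jdoe", "signal": "new_admin_role_granted"},
]

def normalize(event):
    """Map a source-specific event onto a shared (entity, signal, source) shape."""
    return {
        "entity": (event["host"], event["user"]),
        "signal": event["signal"],
        "source": event["source"],
    }

# Correlate: group normalized signals by entity. The same entity tripping
# detections in two or more domains is a stronger lead than any single alert.
by_entity = defaultdict(list)
for event in raw_events:
    norm = normalize(event)
    by_entity[norm["entity"]].append((norm["source"], norm["signal"]))

for entity, signals in by_entity.items():
    if len({source for source, _ in signals}) >= 2:
        print("Escalate:", entity, signals)
```

The normalization step is what lets endpoint, identity and cloud telemetry be queried and correlated together rather than reviewed in separate consoles.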
Acquiring Reposify accelerates consolidation Protecting internal attack surfaces is a challenge that even the most advanced ITops and secops teams constantly deal with. That’s because internal threats can strike at the heart of an identity and access management (IAM) or privileged access management (PAM) system using stolen credentials and take control of servers in as little as an hour and 24 minutes, according to CrowdStrike’s 2022 Global Threat Report.
Internal attacks are among the most difficult to identify and stop.
CrowdStrike’s acquisition of Reposify brings an integrated external attack surface management platform onto Falcon. Reposify scans the web daily for exposed assets, giving enterprises visibility over their exposed assets and defining which actions they need to take to remediate them. Additionally, CrowdStrike announced plans to use Reposify’s technology to help its customers stop internal attacks as well.
“Reposify is a powerful external attack surface management platform. It scans the internet for vulnerabilities and exposes assets to identify and eliminate risk across your organization,” Kurtz said during his keynote. But, he added, “there’s no reason we can’t use it internally to continue to help you understand your risks inside, to continue to help you find those exposed assets.” Reposify’s platform has proven successful in helping secops and ITops teams find unknown exposed assets, identifying shadow IT and internal threat risks in real time before attackers breach infrastructure. It solves an issue many CISOs are facing today: getting more in control of external threats while strengthening the argument for consolidating on a single platform.
Why the CrowdStrike consolidation strategy works The ongoing shortage of security engineers combined with tighter IT and security budgets make selecting best-of-breed security apps a tough sell for many CISOs. Meanwhile, cyberattackers are out-automating many organizations, devising malware-free techniques to avoid detection.
Gartner [subscription required] found that 85% of organizations currently pursuing a vendor consolidation strategy show a flat or increased number of vendors in the past year.
Cybersecurity platforms provide economies of scale, drive a strong network effect across any company’s ecosystem, and force security providers to make customer success a core strength. Getting customer success right combined with the labor shortage and skyrocketing inflationary prices of running a business all work in CrowdStrike’s favor from a consolidation-strategy standpoint. It’s common knowledge that even if a best-of-breed vendor is integrated into a tech stack, CISOs are adamant that the contract is just for one year in case the system doesn’t deliver the expected value.
No CISO wants to hear that they have to hire a new engineer just for a new app. Secops teams are short-staffed already, with team members often having multiple assignments. Having one person own a new best-of-breed app means they have to spend time learning it while doing their current job.
Conversely, most secops teams have dedicated platform engineers who specialize in core platforms and infrastructure their organization needs to operate. CrowdStrike’s approach to making each of its 22 modules adhere to UX and workflow standards is very similar to Salesforce’s approach of defining a common user experience and having all partners and internal devops teams build to it.
Kurtz mentioned during his keynote that he often hears the company is known as the Salesforce of security due to its reliance on cloud architecture. Cloud architectures bring greater UX and UI flexibility, making API integration possible with legacy on-premises systems.
Additionally, CrowdStrike’s devops discipline is clear from the announcements at Fal.Con 2022, and the company’s product leaders take pride in how fast they can iterate on the platform. CrowdStrike’s reliance on the cloud helps speed up land-and-expand selling strategies in enterprises. Selling lower total cost of ownership and providing bundling options and pricing is how CrowdStrike turns consolidation into recurring revenue growth.
IAM and PAM are due for consolidation With secops teams overwhelmed and cyberattackers looking to breach IAM and PAM systems to take control of servers full of identities and privileged access credentials, there’s room for consolidation in this market. Added to the urgency is how fast machine identities are growing, including the need to secure ephemeral containers.
Organizations whose PAM and IAM systems are siloed today risk experiencing a breach and not knowing it. Many must improve their IAM infrastructure, updating systems to current standards while improving security best practices, including credential management and hardening security for Active Directory (AD).
Most importantly, consolidation of this market area would improve real-time monitoring of identity attack techniques while improving security access controls. In short, IAM and PAM would achieve the real-time visibility those systems need to stay secure while capitalizing on threat intelligence enterprise-wide, delivering a substantial benefit of choosing to consolidate on a single platform.
"
|
2,572 | 2,022 |
"CrowdStrike's platform plan at Fal.Con melds security and observability | VentureBeat"
|
"https://venturebeat.com/security/crowdstrikes-platform-plan-at-fal-con-melds-security-and-observability"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages CrowdStrike’s platform plan at Fal.Con melds security and observability Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Cybersecurity platforms need to do a better job closing the data gaps between IT and security to deliver on their potential to drive growth.
CrowdStrike is up for that challenge, as their many announcements at Fal.Con 2022 prove.
“Adding security should be a business enabler. It should be something that adds to your business resiliency, and it should be something that helps protect the productivity gains of digital transformation,” said George Kurtz, CrowdStrike’s cofounder and CEO, during his keynote address at the conference.
Kurtz continued, saying that “leveraging security to turn it into the center of your digital transformation. And protecting your productivity and your future” is a core focus of the company going forward.
Workload protection, identity-threat protection and the company’s continued emphasis on data dominated the keynote.
“Eighty percent of the attacks, or the compromises that we see, use some sort of some form of identity, credential theft,” Kurtz said.
He also announced that CrowdStrike is acquiring Reposify and making strategic investments in Salt Security and Vanta through CrowdStrike’s strategic investment vehicle, Falcon Fund.
“Reposify scans the internet daily for exposed assets, enables enterprises to have visibility over these exposed assets and take action to remediate,” Kurtz said.
Additionally, he explained that Reposify’s best-in-class scanning engine would enhance CrowdStrike’s capabilities across the Falcon platform and strengthen the core areas of EASM, Falcon Discover, Falcon Spotlight, Falcon Horizon and Threat Intelligence.
CrowdStrike CEO: Security and observability need to converge CrowdStrike intends to lead the industry in merging security and data, threat intelligence and telemetry. During the keynote, Kurtz explained how Falcon LogScale and Falcon Complete LogScale, two new products announced at Fal.Con, are designed to provide real-time observability , actionable insights, search data with sub-second latency and telemetry data for the CrowdStrike Threat Graph and Asset Graph tools.
“When we think about driving this convergence, and of security and observability, it really is about secops and ITops coming together,” Kurtz said. “And … if we can ingest at scale, we’re going to provide rich information for not only the security team, but also the IT team,” he said.
Kurtz’s keynote defined the company’s vision predicated on its core strengths of endpoint security, cloud security, threat intelligence and identity protection, integrating ITops and secops with observability. He said the company is focused on democratizing extended detection and response (XDR) for all Falcon platform customers by building on those strengths.
“We’re really excited that we can democratize XDR for all of our customers. So if you’re a Falcon platform user, and you have Insight, obviously there’s some licensing add-ons that will be part of that to move to XDR to pull in and ingest data. But we will make that available to you through the sales organization. But we’re really excited about what we’re doing in XDR,” he said.
XDR delivers data normalization and is now a layer in the Falcon platform tech stack.
CrowdStrike devops is in overdrive Other noteworthy announcements at Fal.Con 2022 show how well the CrowdStrike devops and threat hunter teams collaborate and work toward common design goals to extend their platform.
In an interview with VentureBeat, Amol Kulkarni, chief product and engineering officer at CrowdStrike, said, “If you have the core infrastructure in the right place, then you can iterate rapidly and build out products much faster because the baseline is there. The second part there is that we have this notion of collect once and use multiple times. So, what that is based on is collecting all the telemetry in the security cloud and then put additional analytics on top for different scenarios. So, that gives us that velocity.” Expanded cloud-native application protection platform (CNAPP) capabilities One of CrowdStrike’s most ambitious projects has been adding new CNAPP capabilities for CrowdStrike Cloud Security , while also including new cloud infrastructure entitlement management (CIEM) features and the integration of CrowdStrike Asset Graph.
Scott Fanning, senior director of product management, cloud security at CrowdStrike, told VentureBeat that their approach to CIEM enables organizations to detect and prevent identity-based threats from improperly configured cloud entitlements across public cloud service providers. They do this by enforcing least-privileged access to clouds and providing continuous detection and remediation of identity threats.
Kulkarni’s keynote briefly demonstrated how CrowdStrike Asset Graph provides cloud-asset visualization and how CIEM and CNAPP can help see and secure cloud identities and entitlements. Kulkarni said the goal is to optimize cloud implementations and perform real-time point queries for rapid response. He also said combining the Asset Graph with CIEM enables broader analytical queries for asset management and security posture optimization. Finally, he demonstrated how the CrowdStrike Threat Graph provides full visibility of attacks and automatically prevents threats in real time across CrowdStrike’s global customer base.
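To make the idea of an asset graph concrete, here is a toy example of the kind of posture query such a graph enables: assets and entitlements are nodes, relationships are edges, and a question becomes a walk over the edges. The node names, edge types and query are invented for illustration and are not CrowdStrike's Asset Graph data model or query language.

```python
# Toy asset-graph query: assets and entitlements are nodes, relationships are
# edges, and a posture question becomes a walk over the edges. The node names,
# edge types and query are invented, not CrowdStrike's Asset Graph model.
edges = [
    ("vm-web-01", "runs_in", "subnet-public"),
    ("vm-web-01", "has_role", "role-admin"),
    ("vm-db-02", "runs_in", "subnet-private"),
    ("role-admin", "grants", "storage-full-access"),
]

def neighbors(node, relation):
    """Follow outgoing edges of one relation type from a node."""
    return [dst for src, rel, dst in edges if src == node and rel == relation]

# Posture query: which assets sit in a public subnet AND hold an admin role?
exposed_admins = [
    src
    for src, rel, dst in edges
    if rel == "runs_in" and dst == "subnet-public" and "role-admin" in neighbors(src, "has_role")
]
print(exposed_admins)  # ['vm-web-01']
```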
Falcon Insight is now Falcon Insight XDR, enabling native and hybrid XDR for all customers Kurtz defined XDR during his keynote, saying it is “built on the foundation of endpoint detection and response (EDR), XDR extends enterprise-wide visibility across all key security domains (native and third-party) to speed and simplify near real-time detection, investigation and response for the most sophisticated attacks.” He also mentioned that the goal is for Falcon Insight XDR to provide all customers the opportunity to leverage the power of native and hybrid XDR as a fundamental platform capability, with no disruption to existing EDR capabilities or workflows.
CrowdStrike supports third-party telemetry from CrowdXDR Alliance partners, including Cisco, ForgeRock and Fortinet. Also supported are third-party vendors, including Microsoft (for Microsoft 365 and Azure Active Directory) and Palo Alto Networks. Falcon Insight XDR also integrates with Zscaler Zero Trust Exchange to drive response actions from XDR detections or via automated Falcon Fusion (SOAR) workflows.
Falcon platform customers who have Falcon Insight XDR and Falcon Cloud Workload Protection, Falcon Identity Threat Protection and/or Falcon for Mobile (EDR) can add the native XDR connector pack, which will be available to ensure all CrowdStrike customers can leverage the platform’s native XDR capabilities.
Falcon Discover for IoT targets security gaps in and between industrial control systems (ICS) The world’s critical infrastructure for water, power, oil and gas production and process manufacturing run on ICS systems that weren’t designed for security. As a result, ICS systems and the infrastructure facilities they support are among the most porous and poorly protected today.
Kulkarni told VentureBeat that Falcon Discover for IoT is designed to provide comprehensive visibility and continuous risk assessment across IoT and operations technology (OT) inventory.
“While visibility in an organization’s environment is important, just defining what’s present doesn’t solve the problem,” said Kulkarni. “Organizations need a security platform that can provide deep visibility into cross-domain data and an understanding of their attack surface in order to make the most informed, risk-based decisions – resulting in a more predictive and proactive security posture. With CrowdStrike driving the convergence of security and observability with the Falcon platform, organizations can do more with their data and bridge the gap between OT and IT environments, as well as IT and security operations.” Kulkarni also provided a demonstration of Falcon Discover for IoT during his keynote. Consistent with Kurtz’s keynote emphasizing greater convergence of IT and security, the Falcon Discover for IoT demo showed how intuitively customers could improve IT/OT convergence with a centralized and up-to-date inventory of all IT, OT and IoT assets. In addition, support for advanced behavioral analytics helps identify and mitigate potential risks associated with connected devices. There’s also real-time asset monitoring and 360-degree visibility of IT and OT environments that identify legacy systems and can pinpoint blind spots across networks.
A call for more cyberdefenders “I always like to leave people with that sense of obligation that we are on the front lines; if there is a modern war that impacts the nation where you’re from, you’re going to find yourself in a room during that conflict, figuring out how to best protect your nation,” Kevin Mandia, CEO of Mandiant, said during a fireside chat with Kurtz. “I’ve been amazed at the ingenuity when someone has six months to plan their attack on your company. So, always be vigilant,” Mandia continued.
CrowdStrike’s rapid pace of development, spanning multicloud security with CNAPP to the new Asset Graph, shows how their devops team has turned iterative development into a competitive advantage. In addition, the Falcon platform has proved to be an innovation catalyst that can quickly span the fast-changing customer requirements of devops and threat hunting.
"
|
2,573 | 2,022 |
"At Ignite '22 cybersecurity conference, Palo Alto Networks looks to capitalize on consolidation | VentureBeat"
|
"https://venturebeat.com/security/at-ignite-22-cybersecurity-conference-palo-alto-networks-looks-to-capitalize-on-consolidation"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages At Ignite ’22 cybersecurity conference, Palo Alto Networks looks to capitalize on consolidation Share on Facebook Share on X Share on LinkedIn Ignite22 Day 2 | Palo Alto Networks | MGM Grand | Photo © Show Ready, @showreadyphoto Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Aiming to alleviate the costs and time-drains CISOs face keeping cloud, network and security operations centers (SOCs) secure, Palo Alto Networks made a compelling case at its Ignite ’22 cybersecurity conference to consolidate security tech stacks. Like CrowdStrike , which is consolidating tech stacks as a growth strategy , Palo Alto Networks’ latest financial results, earnings call and announcements at Ignite ’22 all reflect an intensifying focus on capitalizing on consolidation.
Palo Alto’s 2022 What’s Next in Cyber survey finds that 77% of C-suite leaders say they are highly likely to reduce the number of security solutions and services they rely on. Their responses show that a typical global enterprise has an average of 31 cybersecurity apps, services and tools, and contracts with 13 different vendors. Forty-one percent of organizations are working with 10 or more cybersecurity vendors. With security budgets under greater scrutiny for the business value they deliver, CISOs need to drive revenue to advance their careers. Consolidating duplicate systems helps improve the accuracy and intelligence an integrated tech stack can provide while reducing costs and improving cybersecurity’s revenue contribution.
Selling the consolidation vision at Ignite ’22 Palo Alto Networks has created a compelling vision that puts consolidation at the core of its go-to-market strategy. “And customers are actually onto it. They want the consolidation because right now, customers are going through the three biggest transformations ever: They’re going to network security transformation, they’re going through a cloud transformation, and [though] many of them don’t know … they’re about to go to an SOC transformation,” said Nikesh Arora, Palo Alto Networks chairman and CEO, during his keynote.
Selling the benefits of consolidating cybersecurity applications and tools on a single platform is working. The company’s fiscal first-quarter revenue grew 25% yearly to $1.6 billion , and fiscal first-quarter billings grew 27% yearly to $1.7 billion. “At the center of our strategy is the need to drive more consolidation to get customers to a better security posture. Towards that end, we continue to see large cross-platform buys and grow our millionaire customers at a steady clip,” Arora said on Palo Alto’s recent earnings call.
The company relies on upselling and cross-selling from its Strata, Prisma Cloud and Cortex platforms, capitalizing on opportunities with prospects and customers to replace redundant, often legacy applications, tools and systems.
Keynotes and senior management Q&A sessions throughout Ignite ’22 reinforced this consolidation vision by emphasizing the need to secure hybrid, multi-cloud configurations, help customers deal with accelerating digital transformation, and recognize how “in five years SOCs will be run using AI ,” according to company founder and CTO Nir Zuk. The intensifying security landscape is feeding into the consolidation vision, given the proliferation of attacks and the need for better threat intelligence and more trusted partners on the platform.
Prisma and Cortex have cybersecurity momentum Ignite ’22 provided proof points of Palo Alto Networks’ intensive R&D spending on cloud security and security operations, including two of the three platforms the company relies on for its product and services revenue today. VentureBeat spoke with several CISOs, CIOs and IT leaders at Ignite ’22 to see if the build-out of Prisma Cloud and Cortex is scalable enough to handle customers’ needs beyond network security.
The security leaders told VentureBeat that Prisma’s “shift left” strategy, strengthened by the acquisition of Cider Security, along with Software Composition Analysis (SCA), is needed to provide the tools an organization needs to produce Software Bills of Materials (SBOMs).
The purpose is to comply with the White House’s Executive Order 14028, which requires software vendors to provide an SBOM, and the requirements in the September 14, 2022 memorandum from the director of the Office of Management and Budget (OMB) to the heads of executive branch departments and agencies.
“Out-Innovating the Attackers,” the keynote by Lee Klarich, Palo Alto Networks’ chief product officer, was the best presentation at Ignite ’22 because it showed how the company’s 4,000 devops engineers and product managers are translating urgent challenges customers face into products. An example of how effective the product organization is at innovating can be seen in the new Prisma Cloud announcements Klarich walked through in his presentation. SCA and the Cider Security acquisition are table stakes for securing software supply chains.
Active attack surface management (ASM) is now on the Cortex platform Xpanse Active ASM aims to help security teams not just actively find but also proactively fix their known and unknown internet-connected risks. Xpanse Active ASM equips organizations with automation to give them an edge over attackers. “While the fundamental need for attack surface management hasn’t changed, today’s threat landscape is much different. Organizations need an active defense system that operates faster than attackers can,” said Matt Kraning, chief technology officer of Cortex for Palo Alto Networks.
“As the leader and pioneer in the ASM market, we realize that customers need complete, accurate and timely discovery and remediation of risky exposures in their internet-connected systems. With Xpanse Active ASM, we give defenders the ability to see their exposures instantly and shut them down automatically, with no human labor required.” Xpanse Active ASM provides the following: Active Discovery : Attackers use frequent, automated probes to find vulnerable and exposed assets. Organizations need tools that give them the same visibility. The Active Discovery module refreshes its internet-scale database several times daily and uses supervised machine learning (ML) to map these vulnerabilities accurately. This helps an organization get an outside-in view of its network — the same view attackers have.
Active Learning : Xpanse continuously processes discovery data, mapping new data to the people responsible for each system. The Active Learning module continuously analyzes and maps the streamed discovery data to understand and prioritize top risks in real time. As a result, customers can stay ahead of attackers by closing down the riskiest exposures quickly.
Active Response : While instant discovery of vulnerabilities and exposures can give security teams a realistic risk picture, identifying issues isn’t enough. Automated remediation is key to staying ahead of attackers. It saves response time in the SOC by eliminating the manual step of creating a ticket for analysts, who must then spend hours of manual effort tracking down the owner of the affected system and resolving the vulnerability. True automation is solving the end-to-end remediation process without human intervention. Active Response includes native embedded automatic remediation capabilities that use Active Discovery data and Active Learning analysis to automatically shut down exposures before they allow threats into a network. It executes ASM-specific playbooks to triage, deactivate and repair vulnerabilities automatically.
The Xpanse Active Response module includes built-in end-to-end remediation playbooks. These playbooks automatically eliminate critical risks, such as exposed Remote Desktop Protocol (RDP) servers and insecure OpenSSH instances, without any manual labor.
Following remediation, Active Response automatically verifies that remediation was successful by scanning assets, compiling audited actions and placing investigation details into clear dashboards and reports.
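The three modules above amount to a discover, prioritize, remediate loop. The sketch below shows that loop in miniature; the asset inventory, risk scores and playbook actions are invented placeholders rather than Xpanse Active ASM's real data, scoring or automation.

```python
# Simplified discover -> prioritize -> remediate loop. The asset inventory,
# risk scores and playbook actions are invented placeholders, not Xpanse
# Active ASM's real data, scoring or automation.
discovered_assets = [
    {"host": "203.0.113.10", "service": "rdp", "exposed": True},
    {"host": "203.0.113.11", "service": "openssh", "exposed": True},
    {"host": "10.0.0.5", "service": "https", "exposed": False},
]

RISK = {"rdp": 9, "openssh": 7, "https": 2}  # toy severity scores

def remediation_playbook(asset):
    """Stand-in for an automated playbook: pick an action for a risky exposure."""
    if asset["service"] == "rdp":
        return "close inbound 3389 and require VPN access"
    if asset["service"] == "openssh":
        return "enforce key-only auth and patch OpenSSH"
    return "no action"

# Work through exposures from highest to lowest risk, remediating automatically.
for asset in sorted(discovered_assets, key=lambda a: RISK[a["service"]], reverse=True):
    if asset["exposed"] and RISK[asset["service"]] >= 5:
        print(asset["host"], "->", remediation_playbook(asset))
```

The point of the automation is the last step: risky exposures are acted on without waiting for a ticket to be routed to a system owner.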
The $100 billion market cap remains elusive For Palo Alto Networks to be the first cybersecurity company to reach a $100 billion market capitalization , as CEO Nikesh Arora has predicted in an interview, there are several challenges the company must first overcome to achieve competitive parity.
Most noticeable at Ignite ’22 was the need for more partners to be exhibiting Palo Alto’s solutions and greater enthusiasm for partner solutions on the part of Palo Alto Networks’ customers. To reach a $100 billion market cap, channel and technology partners must deliver more revenue globally, not just in the U.S.
Second, despite the new products that capitalize on the company’s evolving machine learning expertise as Xpanse Active ASM does, Palo Alto Networks still isn’t showing that it has AI and ML embedded in its DNA. Consider CrowdStrike’s rapid innovations in ML, with Threat Graph, Asset Graph, Falcon Discover for IoT and many products and services released just this year.
Devops is one of Palo Alto Networks’ strongest areas today, based on what was presented at Ignite ’22. To reach that $100 billion market cap, it needs to fulfill its vision of running an SOC on AI in five years or less while focusing on using ML as a devops force multiplier across all product strategies.
Partners needed Palo Alto Networks also announced a zero-trust network access (ZTNA) partnership with Google. Palo Alto’s Prisma Access will team with BeyondCorp Enterprise from Google Cloud to enable users to work together securely and seamlessly on different devices from different locations. However, this seemed to be more a validation of work the two companies have already done together than something fundamentally new.
With zero trust dominating nearly every conversation today, the lack of partner announcements was a missed opportunity to generate more interest in Palo Alto Networks’ partner base. Paradoxically, if Palo Alto opened up ZTNA sales opportunities to partners more, it could make significant gains toward its $100 billion market cap goal.
As Arora said in the keynote, “the only way you can get zero trust security is through Palo Alto.” Providing partners with an opportunity to profit from that strategy would energize the area of the company that needs to scale the most to reach that market cap goal.
"
|
2,574 | 2,022 |
"XDR-driven security industry consolidation continues, with SentinelOne to acquire Attivo | VentureBeat"
|
"https://venturebeat.com/2022/03/15/xdr-driven-security-industry-consolidation-continues-with-sentinelone-to-acquire-attivo"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages XDR-driven security industry consolidation continues, with SentinelOne to acquire Attivo Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
With SentinelOne announcing plans to acquire Attivo Networks — coming one week after Google said it has an agreement to buy Mandiant — a recent prediction from research firm Gartner about a new wave of security industry consolidation seems to be proving itself.
On March 7, Gartner identified vendor consolidation among the top seven security and risk management trends for 2022. “Security technology convergence is accelerating, driven by the need to reduce complexity, reduce administration overhead and increase effectiveness,” Gartner said in a news release.
The very next day, one of the largest security industry acquisitions in recent memory — Google’s $5.4 billion deal to acquire security powerhouse Mandiant — was announced.
And today, another sizable acquisition is coming to light: AI-driven cybersecurity firm SentinelOne announced a $616.5 million deal to acquire identity security firm Attivo Networks, in part to bolster SentinelOne’s Singularity XDR (extended detection and response) platform.
What the two acquisitions have in common is that both appear aimed at delivering an XDR, or XDR-like, architecture to customers.
Focus on XDR While capabilities can vary across vendors in XDR, the overall concept is to integrate and correlate data from numerous security tools — and from across varying environments — to help customers prioritize the biggest threats.
While less than 5% of organizations are using XDR today, that’s expected to climb to 40% by 2027, according to a recent report from Gartner.
In an interview last week, Gartner’s Peter Firstbrook told VentureBeat that right now, “one of the driving factors of vendor consolidation is XDR.” XDR brings an answer to the key question of “how do I integrate all the threat intel from all these security components I bought — so that I can do a proper incident response, and the humans can make sense of those alerts very quickly?” said Firstbrook, a research vice president and analyst at Gartner. In other words, XDR allows security teams to “resolve alerts quickly and move on,” he said. “Because right now, most organizations are really struggling to deal with all their alerts.” And when it comes to XDR-driven consolidation in the security industry, “this is just the beginning of this trend,” Firstbrook said in the interview last week.
Microsoft had reportedly wanted to acquire Mandiant, before Google stepped in, “so maybe they’ll buy SecureWorks or Reliaquest or eSentire to jumpstart their program,” he said, referring to several vendors in the XDR space.
Google’s moves The shift to embracing an XDR-like architecture appears to have been among the factors behind Google’s interest in Mandiant, as well as a factor in Google’s acquisition of Siemplify in January.
“I feel this merger between Mandiant and Google Cloud allows us to be the brains behind so much of those controls that people are depending on,” Mandiant CEO Kevin Mandia said during a news conference last week. The move will bring together Mandiant’s threat intelligence and services with the Google Chronicle security analytics service and Siemplify, Mandia noted.
Chronicle and Siemplify are all about “interoperability between a ton of other technologies — [they] work with every firewall company, work with all the endpoint companies, work with logs generated from different applications,” he said.
Meanwhile, with SentinelOne’s announcement today, the focus on XDR is even more overt. The acquisition of Attivo, set to close in the quarter ended July 31, will extend the capabilities of the Singularity XDR platform “to identity-based threats across endpoint, cloud workloads, IoT devices, mobile and data wherever it resides,” SentinelOne said in a news release.
Identity threat detection Notably, another trend highlighted on Gartner’s recent list — identity threat detection and response — factors heavily in SentinelOne’s planned acquisition of Attivo as well. The term, coined by Gartner, refers to the approach of going beyond identity authentication to actually detect when identity systems have been compromised.
Identity is “the new perimeter,” said SentinelOne COO Nicholas Warner in a news release. And “identity threat detection and response is the missing link in holistic XDR and zero trust strategies,” Warner said.
As for Google Cloud, the acquisitions are unlikely to stop with Mandiant, Forrester analysts Jeff Pollard and Allie Mellen wrote in a blog post last week. Next up on the acquisition priority list might be a solution for endpoint detection and response (EDR), the analysts said.
“Given that GCP (Google Cloud Platform) needs EDR to gain full ownership of the technologies that comprise its XDR offering, its next shopping list likely includes an EDR tool,” the analysts wrote in the blog. “GCP wants to become a top–tier cybersecurity player, and its acquisitive actions match its goals.” More broadly, the Mandiant acquisition “will have a major ripple impact across the cybersecurity space as cloud stalwarts Amazon and Microsoft will now be pressured into M&A and further bulk up its cloud platforms,” wrote Daniel Ives, managing director for equity research at Wedbush Securities, in a note to investors last week.
Wedbush believes that cybersecurity vendors including Varonis, Qualys, Tenable, Rapid7, CyberArk, SailPoint and Ping Identity stand out as candidates for a possible acquisition, given the “laser focus” these vendors bring on securing cloud workloads against attacks, Ives wrote.
"
|
2,575 | 2,023 |
"Bringing order to data lakehouses, Onehouse is expanding its Apache Hudi technology with $25M raise | VentureBeat"
|
"https://venturebeat.com/data-infrastructure/bringing-order-to-data-lakehouses-onehouse-is-expanding-its-apache-hudi-technology-with-25m-raise"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Bringing order to data lakehouses, Onehouse is expanding its Apache Hudi technology with $25M raise Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Managed data lakehouse vendor Onehouse today announced that it has raised $25 million in a series A round of funding to help further advance its go-to-market and technology efforts based on the open-source Apache Hudi project.
Onehouse emerged from stealth a year ago, in Feb. 2022, as the first commercial vendor providing support and service for Apache Hudi.
Hudi, which is an acronym for Hadoop Upserts Deletes and Incrementals, traces its roots back to Uber in 2016 where it was first developed as a technology to help bring order to the massive volumes of data that were being stored in data lakes.
The Hudi technology provides a data lake table format as well as services to help with clustering, archiving and data replication. Hudi competes against multiple other open-source data lake table technologies including Apache Iceberg and Databricks Delta Lake.
The goal at Onehouse is to create a cloud-managed service that can help organizations benefit from a managed data lakehouse. Alongside the new funding, Onehouse announced its Onetable initiative that aims to enable users of Iceberg and Delta Lake to interoperate with Hudi. With Onetable, organizations can use Hudi for data ingestion into a data lake while still being able to benefit from query engine technologies that run on Iceberg — including Snowflake — as well as Databricks’ Delta Lake.
“We are really trying to build a new way of thinking about data architecture,” Onehouse founder and CEO Vinoth Chandar told VentureBeat. “We are very convinced that people should start with an interoperable lakehouse.” Understanding the data lakehouse trend The data lakehouse is a term first coined by Databricks.
The goal of a data lakehouse is to take the best aspects of a data lake, which provides large volumes of data storage, with a data warehouse that provides structured data services for queries and data analytics. A 2022 report from Databricks identified a number of key benefits of the data lakehouse approach including improved data quality, increased productivity and better data collaboration.
A key component of the data lakehouse model is the ability to apply structure to data lakes, which is where the open-source data lake table formats, including Hudi, Delta Lake and Iceberg fit in. Multiple vendors are now building full platforms with those table formats as a foundation.
Among the many supporters of Apache Iceberg is Cloudera, which launched its data lakehouse service in August 2022.
Dremio is another strong Iceberg supporter, using it as part of its data lakehouse platform. Even Snowflake, one of the pioneers of the cloud data warehouse concept, is now supporting Iceberg.
Onetable isn’t another data lake table format At the core of the major data lake formats today, including Hudi, Delta Lake and Iceberg, are files that organizations want to be able to use for analytics, business intelligence or operations.
A challenge that has emerged, though, is that vendor technologies have been increasingly vertically integrated — combining the data storage and query engines. Kyle Weller, head of product at Onehouse, explained he’s seen organizations confused about which vendor to choose based on which data lake table format approach is supported. The Onetable approach is intended to abstract away the differences across the data lake table formats, to create an interoperability layer.
“The goal and the mission of Onehouse is about decoupling data processing data query engines from how your core data infrastructure operates,” Weller told VentureBeat.
Weller added that at the foundation of many data lakes today are files stored in the Apache Parquet data storage format. What Onetable is essentially doing is providing a metadata layer on top of Parquet that enables easy translation from one table format to another.
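Conceptually, a translation layer of this kind leaves the Parquet data files untouched and only rewrites the thin layer of table metadata that tells each engine which files and schema make up the table. The sketch below illustrates that idea with deliberately simplified, made-up metadata structures; it is not the real Hudi, Iceberg or Delta Lake metadata layout, nor Onetable's implementation.

```python
# Conceptual sketch of a metadata-translation layer: the Parquet data files
# stay where they are, and only lightweight table metadata is re-expressed so
# another engine's format can "see" the same files. These structures are
# deliberately simplified inventions, not real Hudi, Iceberg or Delta metadata.
hudi_style_commit = {
    "table": "orders",
    "files": [
        "s3://lake/orders/part-0001.parquet",
        "s3://lake/orders/part-0002.parquet",
    ],
    "schema": {"order_id": "long", "amount": "double"},
}

def to_iceberg_style_manifest(commit):
    """Re-express the same files and schema in another format's metadata shape."""
    return {
        "table-name": commit["table"],
        "manifest-entries": [{"data-file": path} for path in commit["files"]],
        "schema-fields": [{"name": name, "type": typ} for name, typ in commit["schema"].items()],
    }

print(to_iceberg_style_manifest(hudi_style_commit))
```

Because only metadata is rewritten, the underlying data is never copied, which is what makes this kind of interoperability cheap.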
Where Onetable fits into the data lakehouse use case Chandar noted that Hudi provides advantages over other formats, such as transactional replication and fast data ingestion.
One potential use case where he sees the Onetable feature fitting in, is for organizations using Hudi to do massive volumes of data ingestion, but want to be able to use the data with another query engine or technology such as a Snowflake Data Cloud deployment, for some type of analytics.
Chandar said a lot of companies have data sitting in data warehouses and they are increasingly deciding to build a data lake either because of costs or because they want to start a new data science team. The first thing those organizations will do is data ingestion, bringing all their transactional data to the lake, which is where Chandar said Hudi and the Onehouse service excels.
Now with the benefit of the Onetable technology, the same organization that has ingested data into Onehouse, can also use other technologies such as Snowflake and Databricks for data queries on the data, for analytics.
Looking forward for both Hudi and the Onehouse platform, Chandar emphasized that further optimizing the ability for organizations to utilize data quickly will remain a key theme.
“We have announced in the Hudi project that we want to add a caching layer at some point,” he said. “We are thinking about anything and everything around data and how we can optimize it really well.”
"
|
2,576 | 2,023 |
"Google advances AlloyDB, BigQuery at Data Cloud and AI Summit | VentureBeat"
|
"https://venturebeat.com/ai/google-advances-alloydb-bigquery-at-data-cloud-and-ai-summit"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Google advances AlloyDB, BigQuery at Data Cloud and AI Summit Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Google today unleashed a series of updates to its data and AI platforms to help companies more efficiently harness the power of data and drive innovation.
The announcements, made at the virtual Google Cloud Data and AI Summit, included a new approach to running BigQuery, Google’s serverless data warehouse. The company said that BigQuery Editions would give customers more flexibility to operate and scale their data workloads. Google also unveiled data clean rooms, a service to keep data separate and anonymous.
In addition, Google launched AlloyDB Omni, a database service that handles transactions and analytics. First announced in May 2022, AlloyDB is a managed cloud database that is based on the open source PostgreSQL relational database.
Rather than just being focused on transactional workloads — which is what PostgreSQL supports by default — AlloyDB also has capabilities to support analytics workloads. To date, AlloyDB has only been available as a service running in the Google Cloud. That will change with AlloyDB Omni, which will provide organizations the ability to run the database wherever they want.
New tools driven by customer demand Rounding out Google’s new product announcements is the Looker Modeler service. Looker is a business intelligence (BI) technology that Google acquired for $2.6 billion in 2020. The Modeler service provides a new way for organizations to define and access business metrics.
In a press briefing, Gerrit Kazmaier, Google Cloud GM and VP of data analytics, noted that the new updates are driven by customer requests.
“One was an increased need for flexibility specifically now in the current year with all of its challenges,” said Kazmaier. “They’re asking for help to optimize for both their predictable and unpredictable data needs.” BigQuery gets smart about scaling Flexibility where users pay for what they use is an original promise of the cloud. It’s a promise that Google is helping to deliver on with the BigQuery Editions update.
Kazmaier said that BigQuery Editions offers multiple tiers of service with different feature set capabilities per tier, that customers can choose and select from. Organizations can also choose to mix and match tiers for individual workloads.
The new flexibility that BigQuery Editions provides is enabled by a few underlying infrastructure capabilities enhancements from Google for storage and auto-scaling. Kazmaier explained that BigQuery compressed storage provides access to data in a highly compressed format using a proprietary multistage compression process. The end result is that organizations will be able to store more data for less cost.
New auto-scaling capability The flexibility provided by BigQuery Editions is also enabled by way of a new auto-scaling capability for workloads. Kazmaier noted that Google built out a new resource scheduler as part of the BigQuery Editions infrastructure for doing query planning and execution. He explained that a query basically can get compute resources on the fly, as it processes operations.
Kazmaier also provided an update on the BigQuery ML service, which first became available in 2019. BigQuery ML integrates the data warehouse with machine learning (ML), such that organizations can use their warehouse data for AI model development.
Over the last year, Kazmaier said that Google has increased its focus on making ML accessible at scale and helping organizations connect it with their own data. A day ahead of the summit on March 28, Google announced an incremental update to BigQuery ML, allowing inference to be done using remotely hosted models, not just models that are directly integrated with the BigQuery service.
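For context on what training a model next to the data looks like in practice, the sketch below trains and scores a simple model with standard BigQuery ML SQL submitted through the Python client. The project, dataset, table and column names are placeholders, and the CREATE MODEL / ML.PREDICT pattern shown is long-standing BigQuery ML usage rather than part of the new remote-model feature described here.

```python
# Train and score a simple model next to the data using BigQuery ML SQL
# submitted through the standard Python client. Project, dataset, table and
# column names are placeholders; the CREATE MODEL / ML.PREDICT pattern is
# standard BigQuery ML usage, not the new remote-model feature itself.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project id

train_sql = """
CREATE OR REPLACE MODEL `my_dataset.churn_model`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
SELECT tenure_months, monthly_spend, churned
FROM `my_dataset.customers`
"""
client.query(train_sql).result()  # blocks until the training job finishes

predict_sql = """
SELECT customer_id, predicted_churned
FROM ML.PREDICT(MODEL `my_dataset.churn_model`,
                (SELECT customer_id, tenure_months, monthly_spend
                 FROM `my_dataset.new_customers`))
"""
for row in client.query(predict_sql).result():
    print(row.customer_id, row.predicted_churned)
```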
Google breaks AlloyDB out of its cloud A cloud database like AlloyDB, by definition, will typically only reside in the cloud, but that’s not always what organizations want or need.
During the press briefing, Andi Gutmans, VP and GM of Databases at Google, commented that many organizations want to run databases in different clouds and some still have a need to run on-premises. There can also be a fear among some users that having a technology only available to run in a single cloud provider can lead to a lock-in risk. The AlloyDB Omni database is an effort to answer that challenge by enabling users to run the database wherever they want.
This isn’t the first time that Google has unshackled one of its data technologies from its own cloud platform. In 2021, Google launched BigQuery Omni, which enables data queries to be run across multiple cloud providers. While BigQuery Omni enables multi-cloud support, AlloyDB Omni goes a little further by allowing users to download a full container image of the database. The container can be run in any environment that will support containers, whether that’s on-premises or another cloud provider.
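Because AlloyDB, including the Omni container, remains PostgreSQL-compatible, an application can connect to it with any standard PostgreSQL driver. The snippet below is a generic sketch of that idea using psycopg2; the host, credentials and table are placeholders, and nothing here is Google-specific client code.

```python
# Because AlloyDB (including the Omni container) is PostgreSQL-compatible, a
# standard PostgreSQL driver is enough to connect. Host, credentials and the
# orders table are placeholders; this is a generic sketch, not Google client code.
import psycopg2

conn = psycopg2.connect(
    host="localhost",  # e.g. wherever the Omni container is running
    port=5432,
    dbname="appdb",
    user="app_user",
    password="change-me",
)

with conn, conn.cursor() as cur:
    # The same connection can serve transactional writes...
    cur.execute(
        "INSERT INTO orders (customer_id, amount) VALUES (%s, %s)",
        (42, 19.99),
    )
    # ...and analytical reads over the same tables.
    cur.execute("SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id")
    for customer_id, total in cur.fetchall():
        print(customer_id, total)

conn.close()
```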
The idea of removing the fear of lock-in also extends to Google’s views on the open source foundation of AlloyDB Omni, which is the PostgreSQL database.
“We want customers to be able to run on any PostgreSQL, whether that is AlloyDB or without us,” said Gutmans. “With any work that we do, including differentiated work, our goal is to really make sure that there is compatibility out there.”
"
|
2,577 | 2,019 |
"Databricks launches Delta Lake, an open source data lake reliability project | VentureBeat"
|
"https://venturebeat.com/ai/databricks-launches-delta-lake-an-open-source-data-lake-reliability-project"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Databricks launches Delta Lake, an open source data lake reliability project Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Databricks is launching open source project Delta Lake, which Databricks CEO and cofounder Ali Ghodsi calls the company’s biggest innovation to date, bigger even than its creation of the Apache Spark analytics engine. Delta Lake is a storage layer that sits on top of data lakes to ensure reliable data sources for machine learning and other data science-driven pursuits.
The announcement was made today at the Spark+ AI Summit in San Francisco and follows Databricks’ $250 million funding round in February, bringing the company’s valuation to $2.75 billion.
Data lakes, which offer a way to pool data and break down data silos, have grown in popularity with the rise of big data and machine learning.
Databricks’ cofounders created the Apache Spark analytics engine while they were students at the University of California, Berkeley. The Apache Software Foundation took over control of the project in 2013. Delta Lake is compatible with Apache Spark and MLflow, Databricks’ other open source project, which debuted last year.
“Delta Lake looks at all the data that’s coming in and makes sure that this data adheres to the schema that you’ve specified. That way, any data that makes it into the Delta Lake will be correct and reliable,” Ghodsi said. “It adds full-blown ACID transaction to any operation you do on your Delta Lake, so operations on Delta Lake are always correct [and] you can never run into … partial errors or leftover data.” Delta Lake can operate in the cloud, in on-premise servers, or on devices like laptops. It can handle both batch and streaming sources of data.
“It lets you now mix batch and streaming data in ways that have been impossible in the past. In particular, you can have one table that you have streaming updates coming into, and you can have multiple concurrent readers that are reading it in streaming or batch. And all of this will just work because of the transaction, without any concurrency issues or corruption,” Ghodsi said.
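Continuing the same hedged sketch, the table written above can serve as a streaming source while batch readers and writers keep using it; the checkpoint and output paths are again placeholders.

# Stream rows out of the Delta table and append them to a second Delta table.
stream = (
    spark.readStream.format("delta")
    .load(table_path)
    .writeStream.format("delta")
    .option("checkpointLocation", "/tmp/delta/_checkpoints/events_copy")
    .outputMode("append")
    .start("/tmp/delta/events_copy")
)

# While that stream runs, ordinary batch appends and reads against the same
# source table still work, and readers see only fully committed transactions.
spark.createDataFrame([(4, "refund")], ["id", "action"]) \
    .write.format("delta").mode("append").save(table_path)
spark.read.format("delta").load(table_path).groupBy("action").count().show()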
A time travel feature will also allow users to access earlier versions of their data for audits or to reproduce MLflow machine learning experiments. Delta Lake stores table data as Parquet files, the open columnar format commonly used for large data sets.
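A similarly hedged illustration of time travel against the same placeholder table follows; versionAsOf and timestampAsOf are the option names Delta Lake documents for this feature, though they may postdate the very first open source release.

# Read the table as it looked at an earlier version, e.g. to audit a change
# or rerun an MLflow experiment against the exact data it saw.
first_version = (
    spark.read.format("delta")
    .option("versionAsOf", 0)  # version 0 is the initial write in the sketch above
    .load(table_path)
)
first_version.show()
# A timestamp can be used instead, e.g. .option("timestampAsOf", "2019-04-24 12:00:00"),
# provided it falls within the table's retained history.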
A proprietary version of Delta Lake was made available to some Databricks customers a year ago and is now used by more than 1,000 organizations. Early adopters of Delta Lake include Viacom, McGraw-Hill, and Riot Games.
In other open-source news for Databricks today, Microsoft’s Azure Machine Learning joined the MLflow project.
"
|
2,578 | 2,023 |
"Zoom Docs arrives to take on Google Docs, Notion | VentureBeat"
|
"https://venturebeat.com/virtual/zoom-docs-arrives-to-take-on-google-docs-notion"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Zoom Docs arrives to take on Google Docs, Notion Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
The “end meeting” button may only be the start for Zoom users next year. At its annual Zoomtopia conference this week, Zoom showcased several significant updates and new products that aim to further streamline hybrid work and collaboration. Headlining the announcements was Zoom Docs, a fully integrated, AI-powered, multi-user cloud documentation solution built directly into the Zoom platform.
In a release, Zoom CEO Eric Yuan emphasized Zoom’s continued promise to evolve its platform through powerful AI capabilities. “Our new innovations demonstrate Zoom’s commitment to evolving our platform in ways that empower limitless human connection and solve real business problems,” said Yuan.
Zoom Docs offers a new hub for teamwork At its core, Zoom Docs provides document creation, editing, and collaboration features like other cloud document solutions. However, its tight integration across Zoom differentiates the product.
Documents, wikis, tables and other content can all be created, edited and searched for within Zoom Meetings, Team Chat or the Zoom web and desktop apps, making the product a direct competitor to up-and-comer Notion.
Modular content blocks let users customize layouts to specific workflows through options like drag-and-drop table blocks, offering competition to the likes of project management software such as Monday or even content creation tools such as WordPress (which VentureBeat uses for this website) and Mailchimp (used for our newsletters).
Perhaps most innovative is Zoom Docs’ use of AI. Zoom’s AI Companion can auto-populate docs with insights from meetings to expedite creation. It also generates summaries of long-form content and helps users quickly find information across documents.
Building on videoconferencing Initially announced in September, Zoom’s AI Companion received further enhancements. The assistant debuted in the new Zoom Whiteboard, where it can help generate and organize ideas visually.
AI Companion also expanded its reach, with Zoom launching meeting and chat summarization for higher education and healthcare customers. It helps streamline tasks like catching up late to meetings, summarizing long threads and automating email responses.
Zoom further announced platform updates aiming to enhance the evolving hybrid workforce experience, including Workvivo, a recent acquisition, integrated directly into Zoom’s popular meeting client.
Zoom offered new meeting scheduling and “Huddles” tools, customizable scheduling links, integrated location maps and coworking presence indicators.
The path ahead With breakthrough products like Zoom Docs and expanded uses of AI Companion still in development stages, the company has an opportunity to gain ground.
However, Zoom will face fierce competition standing in its path to success as it launches the document software in 2024, with Google Docs, Microsoft Word 365, and Notion all offering versions of shared document creation, note-taking, and annotation/editing features.
Zoom also endured a self-inflicted wound earlier this year when it attempted to change its Terms of Service to enable more permissive uses of customer data for AI research and product development, prompting a fierce backlash and, ultimately, a rollback of the new terms it had tried to introduce.
As Yuan said, “One thing will remain the same: effective collaboration and communication tools are crucial for businesses to succeed.”
"
|
2,579 | 2,023 |
"The gamification of work and reality: How gaming has moved beyond leisure | VentureBeat"
|
"https://venturebeat.com/virtual/the-gamification-of-work-and-reality-how-gaming-has-moved-beyond-leisure"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Guest The gamification of work and reality: How gaming has moved beyond leisure Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
As a society, we have had a long, strange relationship with video games. At times they are how we learn about new technologies like the computer or television; at others, they’ve been seen as the source of corruption for our youths or an addiction on par with banned substances.
Somewhere between these two polarities, there is a view that we can improve any number of aspects of our day-to-day life through the medium of video games, with the nature of work perhaps at the forefront of this discussion under the label of “gamification.” In reality, the influence of gamification on work has been mixed, and as increasing parts of our work and everyday lives are shifting to virtual worlds largely inspired by gaming, whether via a theorized metaverse or otherwise, the consequences of gamification on our work (if not on reality more generally) have become more relevant than ever.
Fulfilling needs that the real world can’t satisfy? Gamification as a solution for the ills of work seems to be a strange fit, given a societal obsession with productivity. In this light, frivolities such as gaming are perhaps the antithesis of this concept of work — time spent doing the opposite of something productive.
However, it is perhaps for this very reason that games and gamification are seen as an ideal way to smooth over the more dull, repetitious or downright unpleasant parts of our work. Early technological optimists, such as Jane McGonigal, in her bestselling book Reality is Broken, claimed that reality doesn’t effectively motivate or inspire us, and that sensibilities from gaming could change the very nature of work (or the world). In McGonigal’s view, games are productive because they are “fulfilling genuine human needs that the real world is unable to satisfy.” Taken to the extremes of this view, gaming has been seen as a refuge from the reality of the world of work rather than a means to improve it. One recent study claimed that in the early 2000s, work hours for young men dropped more sharply than for older men or for women, with the freed-up leisure hours gravitating towards video games.
While it has been argued that this phenomenon is more of a shift in media consumption habits for young men than an absolute trade-off in gaming hours for work hours, what was consistent between this study and a more recent one from Oxford was a generalized increase in happiness or well-being from time spent playing games.
Desire for autonomy Games can make us happy by fulfilling needs, yet they have not conclusively managed to improve the circumstances of work, given a focus on the nature of work or the tasks therein rather than on the influence of managers or others who set the environment or structure of work.
Anthropologist David Graeber made the claim that an increasing number of employees were working in so-called “bullshit jobs” which often contributed only to the bureaucracy of an organization rather than any meaningful impact on the world.
This view too has been criticized on the basis that the underlying issue is, in actuality, the extent to which workers feel alienated from the decision-making process of their work rather than the type of job per se.
Essentially, we feel that work is bullshit when bad managers don’t respect or allow for autonomy.
Clashing worker/manager expectations All the while, symptoms of the ongoing erosion of trust between workers and managers have begun to manifest in new ways, most recently via ongoing dialogue around “quiet quitting.” An increasing number of employees have set themselves towards working only against the requirements of their job, with the reasonable expectation that more work or responsibilities should come with more pay.
Conversely, adversarial management believes that going above and beyond should be the norm for employees to advance, and those not willing to do so are self-selecting for attrition. These disparate positions reflect any number of rifts between employees and management, inclusive of generational shifts in attitudes towards work, although notably the focus on how work is structured rather than what the work entails.
Whether employees are finding themselves in so-called “bullshit jobs” or “quiet quitting,” any means to improve work through the application of gamification would be well served by addressing this problem, and yet many have had the opposite aim.
Reinforcing desirable behaviors with rewards Gamification expert Adrian Hon’s new book, You’ve Been Played , criticizes much of generic gamification as falling under behaviorist psychology. In this view, by reinforcing desirable behaviors with rewards, the desirable behavior will occur more due to incentivization.
Despite relying on a largely discredited intellectual basis, these mechanisms continue to be employed because they are cheap to implement and the novelty effect may produce some short-term increases in desirable behaviors. While setting up scoreboards and the like doesn’t fundamentally change the crushing repetitiveness of some work tasks, a more troubling potential outcome is that these measures can effectively shift the blame from management to workers when ever-increasing targets are missed.
In this respect, generic gamification is, in actuality, a perfect fit for our efficiency-obsessed orientation towards work because it allows for strict monitoring of performance akin to the antiquated notions of “scientific management” synonymous with “Taylorism” (after mechanical engineer Frederick W. Taylor), so much so that Hon describes the twenty-first-century workplace as increasingly governed by “Taylorism 2.0” or “Digital Taylorism.” View gamification with extreme caution Because gamification relies on largely discredited social science, it can only alleviate the more onerous parts of our work in a cursory way, while in some respects exacerbating the dynamics that tend to make for a negative work experience.
The deployment of these techniques should thus be viewed with extreme caution. And yet, as increasing amounts of work are shifted to virtual space, the potential for gamification to be a negative force in the workplace has expanded dramatically.
What many see as the ultimate setting for virtual work — the metaverse — has already raised alarms on the extent to which otherwise human behaviors can be modified or algorithmically controlled through the manipulation of persistent, interconnected, and embodied virtual worlds.
While this potential is troubling, it’s more likely the case that sophisticated algorithms may not be necessary: Some of those most aggressively pushing towards a future metaverse are defaulting towards the same basic philosophy of human control espoused by bad gamification.
Generic gamification concerns The blockchain-based Web3 view of the metaverse has become the epitome of behaviorist incentivization, where every action (from a “play to earn” game to participation in a community) can be incentivized with some kind of extrinsic reward, typically in the form of a non-fungible token.
The intrinsic value we get from satisfying behaviors is overridden by an ethos that any given action aligned with the interests of those controlling an experience can and should be incentivized with an inherently financialized reward.
We should be concerned with the applications and consequences of generic gamification mechanisms because in many cases, the potential future of the consumer internet is being built as a perfect fit for the most onerous types of gamification, and direct examples are becoming more common within Web3. These even go so far as to propose that the economically disadvantaged could simply find jobs as human background noise or “non-player characters” populating these worlds.
Gamification: Satisfying intrinsic needs The solution for the successful implementation of gamification in the workplace, improving employee and managerial tensions and crafting the potential metaverse (whether Web3-based or otherwise) all overlap: We as humans are at our best when we can satisfy our deeper intrinsic motivations (happiness, satisfaction), not just our extrinsic ones (money).
Satisfying intrinsic needs has always been at the core of the best gameplay experiences (many of which lack the signs associated with bad gamification, such as scoreboards, points, badges or otherwise), meaning that positive implementation of gamification is not impossible.
In Hon’s view, harmful gamification thrives when it denies us “the dignity of possessing intrinsic motivation.” It causes us to compete with ourselves in a way that amounts to little more than self-surveillance, allowing work (or otherwise) to better control behaviors because those being “played” are made to believe they are controlling them. Conversely, good gamification treats us as individuals and allows for deeper needs to be fulfilled.
The complex societal views of gaming and the metaverse The solution for bad gamification is as simple as orienting these mechanisms to be more like good (rewarding) games rather than tracking mechanisms; like successful employee and managerial relationships, good games are heavily biased towards empathy and understanding.
As virtual work becomes more common and top talent demands geographic flexibility , successful organizations can leverage the distinction between good and bad gamification as a first step towards being attractive to this labor pool. Experiences such as the metaverse that originate from gaming are uniquely primed to capitalize on gaming’s superpower to fulfill intrinsic needs, although this direction has not yet been enough of a focus among those most active in constructing the metaverse or future of work.
Gamification and the metaverse have become top of mind because the relevance and power of video games have been on the rise.
Our understanding of gaming and its applications must go beyond its potential weaponization to how humans find satisfaction with them. Whether we are talking about gameplay, work or the future of the internet, focusing on true, intrinsic human motivation will always yield a more positive experience.
Jonathan Stringfield is VP of global business research and marketing at Activision Blizzard.
"
|
2,580 | 2,023 |
"Slack-lash? Slack defends controversial redesign amid sharp criticism | VentureBeat"
|
"https://venturebeat.com/virtual/slack-lash-slack-defends-controversial-redesign-amid-sharp-criticism"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Slack-lash? Slack defends controversial redesign amid sharp criticism Share on Facebook Share on X Share on LinkedIn Credit: VentureBeat made with Midjourney If you are one of the estimated 18 million daily users of the instant messaging application Slack, you may be shocked to open the app one day soon and see that things look, well, quite a bit different. It’s as though your favorite coffee shop suddenly moved the furniture around — a disorienting experience, even if you ultimately approve of the changes. But of course, not everyone will.
Earlier this week, the Salesforce-owned company published a blog post announcing “a redesigned Slack, built for focus,” that is rolling out now for new teams and to existing users in the coming months, which includes lots more white space and more nested view panes.
The changes didn’t go over smoothly, however. Several users — including other tech founders, executives and influencers — took to social media (predominantly X, formerly known as Twitter) to decry the new direction. Now Slack has responded with a defense of the redesign in a statement emailed to VentureBeat. Read on to see what the company says.
What Slack changed in the redesign The changes include a new “Home” button that consolidates all your Slack channels and direct messages into a single view pane, including across multiple workspaces; a new Direct Messages (DMs) tab for one-on-one messages located just below the Home button; a new “Activity” section with all the updates to your various channels, workspaces and messaging conversations; and conversation-specific notification bubbles nested deeper within each view.
“With its better organization and more intuitive layout, you’ll be able to get work done faster,” the Slack blog post reads, saying the changes are designed to reduce distractions and improve focus.
Fast and pointed criticism Within hours of the Slack redesign being announced, the complaints began on X.
“The Slack redesign is perfect because what you want in a professional tool is to be able to see half of the information you need and for everything to be three clicks away with enormous amount of white space,” posted a sarcastic Austen Allred, founder and CEO of Bloom Tech, a coding school that offers an income-share program for those who don’t wish to pay for tuition.
Similar sentiments followed from Shopify CEO Tobi Lutke and human interface designer Ilya Miskov, who called the redesign “so 2010.” Lutke wrote: “Every design team seems to succumb to low information density and high whitespace aesthetics even if the tool is mostly used by pros. Glad we went the other way with the recent work on Polaris. Bring back tight and tactile UX!” Even within VentureBeat, a Slack customer, staffers were not convinced the new design was helpful. “I am not updating my app until they fix this,” wrote one team member.
Slack’s defense of the changes Still, Slack and its parent company Salesforce seem committed to the new design. When contacted by VentureBeat about the vocal backlash (Slack-lash?) on X, Ethan Eismann, SVP of product design at Slack, provided the following statement defending the changes, including a diagram illustrating how the redesign includes larger views (in pixels) of both the conversation panes and channel list.
“When we kicked off the redesign process, we started by looking at user feedback to identify the biggest pain points. One of the most common pieces of feedback was that Slack can feel overwhelming, both in notification volume and the different places to access all of the information needed to get work done.
“What we’ve aimed to do is put focus front and center, and turn down the volume on cognitive overload by dedicating more space to your tasks at hand. The redesign gives users space where they can focus on the most important things to their work, without worrying about losing notifications or critical updates.
“As an example, see the before/after image below of our new Home view, which is focused on channel communication. Since this is the primary place where users spend their time, we’ve increased the available space to view your channel list and DMs, as well as the conversation. We know some users prefer more information density, and some prefer less. We aimed to strike the right balance.” Those of us who have been following tech and consumer-facing apps for a while know that nearly any redesign is met with initial pushback, before users ultimately quiet down and accept the changes as the “new normal.” This was the case certainly with numerous Facebook, Microsoft and even Apple software changes over the years. Slack’s redesign is likely to follow a similar acceptance curve, but until then, the company is sticking by it through the initial rejection phase. We’ll see how long it takes users to come around to it — if they ever do.
"
|
2,581 | 2,023 |
"ShapesXR raises funding to expand its XR prototyping platform | VentureBeat"
|
"https://venturebeat.com/virtual/shapesxr-raises-funding-to-expand-its-xr-prototyping-platform"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages ShapesXR raises funding to expand its XR prototyping platform Share on Facebook Share on X Share on LinkedIn ShapesXR , a virtual reality (VR) design and collaboration platform, announced it has secured an $8.6 million seed round led by Supernode Global.
Other investors including Triptyq VC, Boost VC, Hartmann Capital and Geek Ventures also participated in this funding round.
The platform’s mission is to simplify 3D content creation and spatial design. The company aims to achieve this by offering an intuitive interface that empowers non-technical users to craft intricate 3D designs and prototype immersive applications, including virtual reality (VR) and augmented reality (AR) games, training programs and real-world designs.
ShapesXR says that industry leaders such as Logitech, ByteDance, Qualcomm Technologies, Meta, Accenture, Stanford University and MIT have embraced the platform because of its robust ideation and collaboration capabilities. The company aspires to cement a position as the de facto industry standard for user interface/user experience (UI/UX) design in the realm of spatial computing.
“With the new funding, we are aiming to expand ShapesXR’s compatibility with the latest major VR devices, such as Apple Vision Pro, Pico and Magic Leap. We will also be integrating our platform with the most common development pipelines, to enable a smooth transition to and from ShapesXR,” Inga Petryaevskaya, CEO and founder of ShapesXR, told VentureBeat. “By supporting all types of possible inputs — controllers, hand pinch and eye gaze — we will allow people to build the best experience for the platform.” Petryaevskaya stated that one of the platform’s key differentiators is that it enables designers to create at a human scale, directly within the medium where they’re constructing their apps. This eliminates the typical time-consuming transition from flat screens to headsets and glasses, which often leads to compromises on ergonomics and design integrity.
“The DNA of ShapesXR is the ability to show the design in motion without any line of code — one can build a storyboard or an interactive prototype that can be tested and interacted with,” said Petryaevskaya. “We give UI/UX XR designers the power to own the design process and iterate on the design, [and] run user tests without the need to involve the developer early on. It saves a ton of time and ends up in better, magical designs.” In a recent endorsement, Mark Zuckerberg showcased ShapesXR as an exemplar of VR’s ability to nurture creativity and collaborative endeavors.
Streamlining AR/VR creation for developers of all stripes Petryaevskaya explains that the platform’s user-friendly interface obviates the need for 3D skills, facilitating early engagement by teams and stakeholders in the design process.
Logitech, the company says, recently harnessed ShapesXR for incubating collaborative 3D ideas. Marketers and designers collaborated to conceive an entire Berlin conference, sculpting its essence in 3D.
“It was designed in 3D with all their different stakeholders working together to finalize the look, feel and layout of the venue. They were able to see how participants would interact with booths and features, and showcase the whole design to prospective sponsors,” said Petryaevskaya.
Likewise, the company said that the design team at Pico, which is part of ByteDance, is exploring the future of user interactions through ShapesXR as it builds various spatial apps and a spatial OS.
“Accessibility is crucial for 3D content creation. The lower the bar to entry, the more people can leverage the power of VR and AR — whether it is creating new games or videos, [or] designing products and customer experiences. Our platform allows users [to] virtualize themselves in relation to the space or world around them,” explained Petryaevskaya. “You can make changes to [a] house design at real scale or the size of a doll’s house. Also, you can manipulate objects very close to how you do in real life, and if you don’t need something, just throw it away, just like you would do with anything in the real world.” ShapesXR is currently accessible on Meta Quest 2, Quest Pro and the forthcoming Quest 3. Petryaevskaya highlighted that a substantial funding infusion will extend the platform’s adaptability to other devices, including Apple Vision Pro, Pico and Magic Leap. This caters to visionOS developers’ surging interest, enabling them to prototype eye gaze and hand pinch interactions using ShapesXR.
She emphasized that designing for the specific device and prototyping interactions prior to coding allows for early user testing and iterative experimentation.
“ShapesXR on Vision Pro will accelerate how quickly app developers can create versions of their applications that feel authentic to visionOS. Designers experience dimension, immersion and ergonomics while designing inside Vision Pro, shortening the iteration cycle,” said Petryaevskaya. “Our Figma and Unity plugins integrate Shapes into existing workflows. When teams design in the medium they are able to find the right design faster and communicate it to the rest of the team and stakeholders.” A future of opportunities in 3D content creation Petryaevskaya said that facilitating seamless 3D content creation involves striking a delicate equilibrium: melding an intuitive user interface with the requisite functionality and flexibility to address diverse tasks. She said that community feedback propelled the company towards this iterative refinement, culminating in the optimal fusion of form and function.
“3D content creation is one of the most important trends in technology. It has the capacity to revolutionize so many different aspects of life and business. However, to get to that point, there is a need to create the fundamental tools and platforms that will support mass adoption,” Petryaevskaya told VentureBeat. “Also, all of our investors are true believers that spatial computing will change the way people work, learn and entertain and they like our vision as the company that enables apps and experiences for the newer platforms.” Petryaevskaya says that the infusion of new investors will augment the company’s wealth of expertise, fostering the evolution of ShapesXR by preemptively catering to industry segments’ distinct requirements.
“Our team accumulates solid experience building VR content. We’ve been experiencing huge limitations of flat screen tools,” she said. “We know a lot about what works and what does not work in VR, and we will work hard to enable XR creators to build truly spatial, truly immersive apps with delightful user experience; we do not want them to take all the legacy of the flat screen into the spatial internet.”
"
|
2,582 | 2,023 |
"Salesforce debuts Sales Elevate, bringing its cloud into Slack | VentureBeat"
|
"https://venturebeat.com/virtual/salesforce-debuts-slack-sales-elevate-a-new-experience-to-streamline-selling"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Salesforce debuts Sales Elevate, bringing its cloud into Slack Share on Facebook Share on X Share on LinkedIn Slack Sales Elevate Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Salesforce is deeply integrating Sales Cloud with Slack. The Marc Benioff-led CRM giant today said it is launching Slack Sales Elevate, a new experience in the work collaboration platform that will enable sales executives and managers to keep tabs on active opportunities and close deals more seamlessly.
The offering, which goes into general availability today, will come at an additional cost of $60 per user per month — over Slack’s regular paid plans — and will save executives from going back to Sales Cloud for basic day-to-day tasks such as updating their pipelines and adding deal values.
“Bringing Sales Cloud into Slack and providing new sales productivity tools and automation in Slack helps sellers save time and access the right people and information to make better decisions,” Rob Seaman, SVP of product at Slack, said in a statement. “A Slack-based approach to selling will make it easier than ever for Sales Cloud customers to focus on the work that matters: Working with customers and closing deals.” How will Slack Sales Elevate help? Companies across sectors want sales reps to double down on their selling activities and drive revenues. However, the reality is far from this expectation, as most executives remain bogged down in administrative work and cumbersome tools, leading to less efficient selling. According to Salesforce’s State of Sales report, reps currently spend more than 70% of their time on non-selling tasks.
“Throughout the course of a week, sales reps are constantly communicating with their colleagues and customers,” Seaman told VentureBeat. “They’re figuring out things like how much a deal is worth or when it could be closed, and taking notes of all this along with what the next steps are. Then, when the manager asks for a forecast, they go get all these notes and update them in batch in Salesforce.” Slack Sales Elevate brings a personalized selling home on Slack, centralizing information from Sales Cloud and giving executives data and admin tools right where they are having their conversations about selling.
Offering a bird’s eye view As the company explains, the offering brings a new Sales tab to Slack, where reps can go to get a bird’s-eye view of their open opportunities, opportunities closing soon, and the revenue generated. It pulls in data from the Sales Cloud and even allows the executives to take action on their pipeline without moving out of Slack.
For instance, executives could bump up the amount for an account, change the stage it is in, or add the next steps planned to close the deal. As the changes are made, they are updated directly back in Salesforce.
More importantly, the new experience also ties in with notifications and reminders. For example, executives can be notified about updating their opportunity pipeline before a forecast call. Meanwhile, sales managers who get a view of all team opportunities with more comprehensive pipeline tools can set up automatic notifications for all kinds of deal updates — be it stage change for a specific opportunity, amount updates or new opportunities.
This way, when an executive closes a deal and updates that information on their opportunity pipeline within the Sales tab, the manager can be notified right away — with all the changes syncing back to Salesforce at the same time. No more need to run a report or go on a one-on-one.
Notably, Slack also allows users to leverage AI-powered, no-code workflows with Salesforce-triggered notifications, like support requests and deal approvals, without any technical expertise. So, when a new opportunity from Sales Cloud updates in Slack, a workflow could use that data to generate and send a note in an account channel to alert the sales rep to follow up. However, this capability is not yet available within Slack Sales Elevate itself.
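For illustration only, the hand-rolled Python sketch below approximates that pattern using the public slack_sdk and simple-salesforce libraries. It is not Slack’s no-code Workflow Builder or any part of Sales Elevate, and the channel name, SOQL query and environment variables are placeholder assumptions.

import os

from simple_salesforce import Salesforce
from slack_sdk import WebClient

# Placeholder credentials; a production integration would typically react to
# Salesforce platform events rather than polling like this.
sf = Salesforce(
    username=os.environ["SF_USERNAME"],
    password=os.environ["SF_PASSWORD"],
    security_token=os.environ["SF_TOKEN"],
)
slack = WebClient(token=os.environ["SLACK_BOT_TOKEN"])

# Pull opportunities that changed today...
changed = sf.query(
    "SELECT Name, StageName, Amount FROM Opportunity WHERE LastModifiedDate = TODAY"
)

# ...and post a note into an account channel so the rep can follow up.
for opp in changed["records"]:
    slack.chat_postMessage(
        channel="#account-updates",  # placeholder channel
        text=f"Opportunity '{opp['Name']}' moved to {opp['StageName']} (amount: {opp['Amount']}).",
    )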
Expanding the use of Salesforce While Slack Sales Elevate will save sales reps’ time, Seaman emphasized it is not aimed at stopping users from accessing Salesforce. In fact, the offering is just an extension of Sales Cloud, giving a real-time view and extending its usage.
“We have a concept that we call thick work and thin work,” he explained. “Thin work is something like updating your opportunities or being told something happened, while thick work is more comprehensive stuff, like realigning sales territories or configuring a complex product with a lot of options. That’s a natural point where you would bounce from Slack to Salesforce. We think it’ll result in more usage of Salesforce as a result.” The Slack SVP noted that the offering is being tested by “tens of thousands” of enterprise users, including those from.
Box and Roku.
Sales reps have seen a 76% time reduction in managing their pipelines, he added.
Moving ahead, the company also plans to improve the experience of Slack Sales Elevate and focus on improving access to other Salesforce segments, including marketing and service. However, Seaman noted that there’s no timeline for it yet.
“We’re charting a path for these kinds of domain-specific experiences within Slack,” he said. “But what we’ll probably do first is enhance the experience of Sales — like you want to know when one of your customers logs a case, or you want to know when the customer has been included in a marketing campaign. We’ll probably start ingesting that information for notification purposes with Sales and then nail this experience before we start moving into service or marketing.”
"
|
2,583 | 2,023 |
"Reskilling and upskilling to future-proof your company strategy and stay competitive | VentureBeat"
|
"https://venturebeat.com/virtual/reskilling-and-upskilling-to-future-proof-your-company-strategy-and-stay-competitive"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages VB Spotlight Reskilling and upskilling to future-proof your company strategy and stay competitive Share on Facebook Share on X Share on LinkedIn Presented by Skillsoft Under constant pressure to attract and retain talent and remain competitive, companies are turning to upskilling and reskilling, the key to future-proofing your organization and closing the skills gap. Learn how to develop a robust upskilling strategy that works in this VB Spotlight.
Watch free, on-demand! Skills and training have long been seen as a cost center, rather than a core competency. But that’s no longer the case in a rapidly developing digital landscape where great talent is often thin on the ground, says Marianne Groth, director of talent development at Lumen.
“Organizations need to upskill if they want to be viable. Change is essential,” Groth says. “You have to do it to remain in business today. And then what your talent development or HR team can do is foster that continuous learning culture.” The benefits of an upskilling partner One of the biggest pain points for organizations ready to implement an upskilling program is the fact that there are so many open positions, plus onboarding and training new employees means interrupting the day-to-day of critical subject matter experts.
The ideal training partner would offer tools that make that process significantly more efficient. Organizations also need to find a partner that offers the timely, cutting-edge content necessary for both day-to-day business needs and keeps pace with emerging technologies today. It should be able to provide updated learning solutions on the fly, that also align with the organization’s skills framework.
Plus, organizations need to do a deep-dive assessment of their in-house expertise. They need to understand their skills inventory, to assess where their skill gaps and needs lie and identify risks and weaknesses. Taking that to the next level, they also need to determine if the employees they tap for upskilling or reskilling have the chops to handle the job in the real world. And as the training program progresses, they should be able to track the measurable skill gain.
“That’s quite a project to undertake as an employer, as a company,” Groth says. “You want to make sure that your employees have access to the skills that they need, but then we also have a mentor or a manager that works with that person to look and see, can they do this work? It’s imperative to learn the skills, train and test, and practice the skills in a safe environment. Then work with a mentor to do that in a production system, and then being able to go off and do it on your own.” An upskilling partner brings that very specific assessment experience and proficiency, plus the resources necessary to scale learning programs easily, from small groups to entire departments or even the whole organization, she adds.
A partner also brings along the ability to offer an array of learning modalities for the broad variety of learning styles you are bound to find in any organization, whatever the size, and across different types of skill sets.
“More and more, as we work with customers closely, they give us feedback that the more varieties of learning modes we can make available to them, the better the skill acquisition,” says Greg Fuller, senior director, tech & dev – content development at Skillsoft. “It’s a one plus one equals three scenario.” Having dedicated experts and mentors available means that instead of scrambling to find an internal expert, employees have knowledge and guidance right there while they are learning. Curated learning also gives employees direct access to the pertinent information and data they need as they train, without having to hunt through a wilderness of drives and directories.
Measuring learning success Benchmarking and measuring the success of an upskilling and reskilling program is crucial, and that’s where organizations can look at their training partner as an extension of their organizational transformation goals. The measurement cannot simply be whether learners complete a course; it’s about analyzing the data throughout the process.
Companies should be able to measure the pace at which employees are acquiring skills, whether they’re actually being applied to the job, and if they’re getting value out of their content partners – all things enterprises might not have the capability to do on their own.
“The key criterion is that partnership,” Groth says. “Do we have folks that understand what our business-critical needs are and help us get there? Do we have the support? And then can we show the value for that investment? Those are the critical components.” Platforms like Skillsoft help customers aggregate that data to make it easily reportable and understandable as their learners are going through their journeys, Fuller explains.
For instance, Lumen recently launched a digital savvy program in March to address the need to bring employees up to speed on the rapidly changing landscape, from digital transformation technologies to digital strategy user design and experience, and more.
In the first two weeks, the intranet site had 1,323 page views and more than 4,000 engagements, notes and questions, Groth says, with more than 19,511 messages to the talent development team as the program rolled out. And from lesson completions and micro-learning videos, they’ve seen more than 749 skill benchmarks completed.
Gaining employee buy-in When Lumen launched the program, they were also clear on why bringing their employees along for the journey, and their talent, was so important to keep them moving toward the future.
“We’re communicating why they’re important to our organization, what we’ve been doing for our customers in these areas, what we’re doing internally for our employees, and then giving our employees the opportunities to dig in and learn more,” Groth says.
Aligning skills training to business objectives helps employees understand how what they do affects the business, which helps encourage buy-in. But it’s not just about today’s skills, she says – it’s about the future. It’s key to offer professional development opportunities that are tied to an employee’s career goals and make it fun by adding recognition or competition to the learning process. Part of that should come from employee testimonials: encouraging them to share their training experience and spotlighting their growth.
“How did this certification help you land a new promotion? Those are awesome stories to share, especially with the leaders,” she says. “You get a full circle. The leaders are seeing that we’re supporting this. Our employees are doing this. You get those testimonials back to the leaders and they’re all in.” Plus, employee engagement such as sharing scores and winning points gets employees excited about learning, skilling programs take off and metrics continue to improve.
To learn more about the benefits of upskilling programs, why keeping employee skills up-to-date as technology evolves is crucial to remain competitive and more, don’t miss this VB Spotlight event! Watch on-demand!
Agenda:
Why it’s critical to address the skills gap now
How reskilling and upskilling can future-proof company strategy
Why benchmarking is critical for a robust learning program
Optimizing and implementing reskilling curriculums
And more
Presenters:
Marianne Groth, Director of Talent Development, Lumen
Greg Fuller, Senior Director, Tech & Dev – Content Development, Skillsoft
Art Cole, Moderator, VentureBeat
"
|
2,584 | 2,023 |
"Meeting the challenge of skill gaps in the age of digital transformation | VentureBeat"
|
"https://venturebeat.com/virtual/meeting-the-challenge-of-skill-gaps-in-the-age-of-digital-transformation"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages VB Spotlight Meeting the challenge of skill gaps in the age of digital transformation Share on Facebook Share on X Share on LinkedIn Presented by Skillsoft Upskilling and reskilling are key to future proofing your organization and closing the growing skills gap. In this VB Spotlight, learn how to develop a robust upskilling strategy optimized for how employees learn, address the full spectrum of technical training needs and more.
Register to watch free, on-demand The US is experiencing record-low unemployment, and despite the big-name layoffs in the headlines, the actual number of new jobs being created is outpacing those cuts. This comes in large part from the number of companies, in every industry, turning toward digital transformation, which has in turn led to increasingly large skill gaps, particularly as the oldest of the working generations retires from the job market, taking their job knowledge with them.
“Upskilling right now is probably in higher demand than anything we’ve ever seen,” says Greg Fuller, senior director, tech & dev – content development, at Skillsoft. “Companies are focused on building a skill inventory to backfill a lot of these positions, and they’re taking the time to understand what skills exist within their own ecosystems, so they can see what opportunities there are to upskill and reskill internally before they look externally.” Upskilling trends and challenges The biggest trend is upskilling technology literacy for previously non-technical workers, as the number of digital solutions proliferate in workers’ daily lives — cloud-based tools or automation-focused tools, for instance. The other growing gap is in the cloud space. A handful of years ago, companies were putting in a lot of effort to migrate into the cloud. Now the focus is on optimizing, whether that’s bringing in multiple vendors to work with, or managing their own cloud infrastructure.
Mission-critical skills, such as those needed when an organization is changing from one technology stack to another, going from traditional on-premises infrastructure to the cloud, or shifting to a multi-cloud infrastructure, require upskilling in the short term, while larger organizational transformations, such as moving from traditional project management practices to agile methodologies, can be a four-to-six-month trajectory. Longer-term strategies are also emerging as organizations look at the market and technology transformations of the past handful of years and seek ways to remediate the risks they carry, especially from leadership and senior management standpoints.
Regardless of the role or skills involved, or the length of the program, companies are finding it challenging to scale their internally built upskill programs.
“That’s probably the biggest change that we’ve seen in the last four or five years,” Fuller adds. “It’s not sending 20 or 40 people to upskill. They’re looking to upskill in the hundreds and thousands of learners at a time.” One of biggest barriers, whether a program is launched at an organizational level or a departmental level, is that record unemployment rates and the need to backfill positions means that internal experts are diverted from their critical day-to-day activities to either build or deliver these programs. It’s certainly possible for one-off activities, but that’s far from scalable or maintainable.
Best practices for upskilling programs Today reskilling and upskilling is part of an organization’s larger strategy, and first and foremost, it’s crucial to plan and implement it like any business strategy. That means identifying the most critical problems to address – what are your mission critical skills, and where do your gaps lie? From there, it’s considering the key outcomes you’re trying to achieve. It’s also knowing what risks exist if you don’t take up these programs, just like any other project or initiative that the organization takes on. And a lot of thought must be put into implementing upskilling at scale as well.
“It’s easy to solve a problem for one point in time, but you really have to ask, how can this solution be scaled?” Fuller says. “It’s an especially important question when you consider the very mixed composition of organizations today, in which very different personalities and different generations must work side by side.” And that’s where another critical issue to consider comes in – ensuring that an upskilling program is interesting and achievable for all types of learners. To make that learning journey personal for the individual, as well as effective in helping them move toward their career development goals, takes leveraging multiple styles, multiple modalities, formats and so on.
But what organizations really need to consider is making sure that these upskilling programs are measurable and tangible. A company should be able to measure progress along the way. At every milestone the program should be reassessed to make sure it’s headed in the right direction – and if it’s not, the strategy should be retooled to make sure it’s tracking to meet those business objectives. And benchmarks are also key for ensuring that employees are taking training in line with organizational priorities.
Creating a culture of learning, in which employees are all-in on their own reskilling, needs to come from the top, Fuller adds.
“Today, an organization’s reskilling success starts with the leadership,” he explains. “The executive level must make sure the right environment is built at the top, because it feeds down through each level of management, and helps ensure buy-in throughout the company.” Register to watch on-demand!
Agenda:
Why it’s critical to address the skills gap now
How reskilling and upskilling can future-proof company strategy
Why benchmarking is critical for a robust learning program
Optimizing and implementing reskilling curriculums
And more
Presenters:
Marianne Groth, Director of Talent Development, Lumen
Greg Fuller, Senior Director, Tech & Dev – Content Development, Skillsoft
Art Cole, Moderator, VentureBeat
"
|
2,585 | 2,023 |
"How the neuroscience of VR can help tech teams break critical barriers | VentureBeat"
|
"https://venturebeat.com/virtual/how-the-neuroscience-of-vr-can-help-tech-teams-break-critical-barriers"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Guest How the neuroscience of VR can help tech teams break critical barriers Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Few emerging product categories garner as much attention, enthusiasm and speculation as Virtual Reality (VR). Frequently associated with large goggle rigs featured in internet fail videos or speculation about the future metaverse, this technology is both emerging and already incredibly powerful.
What’s more, it might be the tool that helps tech companies respond and adapt to today’s uniquely disruptive moment for a sector accustomed to prolific, seemingly-unbridled growth and innovation.
Whether they are responding to new regulatory standards, adopting enhanced cybersecurity protocols or integrating novel technologies like artificial intelligence (AI), it’s clear that the status quo has been upended.
Digital tools central to training At the same time, the tech sector is reeling from the Great Resignation, which widened the already chasm-like gap between supply and demand for tech workers. Even for fully staffed companies, 76% of IT decision-makers are dealing with “critical skills gaps on their teams.” To help support increasingly distributed teams, augment employees’ skills and capabilities or implement new standards and technologies, tech companies are doubling down on employee training, and digital tools have become central to those efforts. Unfortunately, these efforts often lack the lived experience and exposure that produce effective and reliable real-world results.
VR can bridge this gap by engaging the brain in the same way as lived experiences and accelerating outcomes accordingly. The neuroscience of VR provides compelling reasons to look to this technology to support teams while helping identify the best ways to harness VR to accelerate critical outcomes.
The neuroscience of VR
VR is a technological marvel that relies on strategically positioned lenses to distort images and make them appear three-dimensional. The technology is nearly half a century old, but its capabilities are more realized, recognizable and accessible in today’s cutting-edge environment.
The technology exposes users to life-like experiences that engage the brain in tangible ways. For starters, VR blends bodily control and functionality with a compelling ecological reality. Our brains constantly make predictions about our actions, concepts and emotions. VR follows the same principle, using advanced computing power to predict the sensory consequences of particular movements.
The impact on our brain is profound. According to research published in the National Library of Medicine , “VR can be considered an advanced imaginal system: An advanced form of imagery that is as effective as reality in inducing experiences and emotions.” This singularity, known as immersion, is compelling and convincing. As the NEO Academy helpfully explains, “When we put on a VR headset, we’re effectively transported into a digital world. Our brains receive visual, auditory and other sensory input that trick us into thinking we’re somewhere else.” While this unique experience is frequently associated with the still-futuristic metaverse or videogame use cases, the ramifications for businesses are formidable. Specifically, VR can help tech teams adapt to a changing environment and new challenges.
How VR can help break barriers Learning new skills, navigating unfamiliar environments or tackling formidable challenges are incredibly difficult tasks. For tech companies looking to equip employees to meet onerous demands, the neuroscience of VR presents a compelling opportunity to leverage this increasingly capable technology to help teams break critical barriers.
Here are several ways that tech companies can leverage VR to help facilitate positive outcomes.
Provide sales teams with “real-world” training According to an analysis published by Nature , “VR continues to accrue confirming evidence for the treatment of phobias owing to its ability to provide powerful sensory illusions within a highly controlled environment.” Given this functionality, it’s not hard to imagine sales teams leveraging VR to help people overcome their fear of public speaking, failure or other experiential facets, allowing them to enhance or refine their efforts before engaging in customer-facing obligations.
Notably, VR can be used to teach soft skills , helping teams operate with greater empathy and more effective communication skills. This flexible software-based training solution can be adapted to accommodate a variety of use cases for companies, leveling up their training initiatives at a critical time.
Enable better connections across physical distances A May 2022 Morning Consult survey found that nearly half of tech workers are fully remote and 85% at least embrace a hybrid model. This new work arrangement has many benefits, allowing people to avoid commutes in crowded cities and better restore work/life balance. It’s also leading to higher levels of disconnection among teams.
VR can help create these connections. Whether leaders leverage the technology to conduct virtual team-building activities or enable face-to-face encounters across distances, VR can be a conduit for fostering better connections amongst increasingly distributed teams.
Teach new skills and content Real-world settings like classrooms, onboarding meetings, seminars or online trainings have physical limitations that limit learning potential. In contrast, VR is limitless, giving people access to once-impossible learning experiences, hands-on scenarios and other growth-oriented initiatives.
This isn’t just a hypothetical future. A 2022 study by the University of North Carolina discovered that medical students learning in VR settings outperformed their peers in information retention, performance tasks and test scores.
Additionally, a 2018 study using VR to teach adaptive flight training to Air Force pilots found that participants had a 230% performance task improvement compared to traditionally trained pilots.
As tech companies look to upskill existing workers or train new hires, VR can help enhance these efforts. Since a company’s people are its most valuable resource, outfitting them with the skills they need to succeed can be a differentiating factor that allows the business to thrive in the months and years ahead.
Using VR to enhance real-world outcomes According to an expansive analysis of 25 different published articles on VR’s impact on health results, “VR compares favorably to existing treatments in anxiety disorders, eating and weight disorders and pain management that generalized in the real world.” In other words, it’s possible to use VR to enhance tangible outcomes.
For tech companies, this means VR can play a practical role in their evolution and advancement. Whether helping customer service agents develop empathy or cultivating workplace culture in a hybrid work environment, the neuroscience of VR proves that it can be a valuable tool that helps tech companies break critical barriers.
Marshall Mosher is founder and CEO of Vestigo.
"
|
2,586 | 2,023 |
"From battlefield to homefront: AR is bigger than the metaverse | VentureBeat"
|
"https://venturebeat.com/virtual/from-battlefield-to-homefront-ar-bigger-than-metaverse"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Guest From battlefield to homefront: AR is bigger than the metaverse Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Throughout modern history, the military has supplemented civilian technological innovation. The first prototype of the internet was famously funded by the U.S. Department of Defense in 1969 to enable inter-computer communication through a unified network. The DOD decommissioned the project in 1989, paving the way for the civilian internet’s launch in 1990.
Private-sector tech innovation has thrived since then — think of Bitcoin, Google search, deepfakes and ChatGPT. So much so that it’s easy to forget about the cyber tech and AR/VR technology that the military constantly produces and tests. The latter tech is particularly interesting, considering 2021’s metaverse bubble and speculation over how AR/VR will transform modern life.
Just as the military helped spark the digital revolution, it’s instructive to take note of how its use of AR/VR today could come to define the famously elusive metaverse tomorrow for civilians.
AR/VR gaming that’s actually convincing Armies across the globe are undergoing intense combat training with AR, thanks to advances in data and graphic processing. Microsoft’s Hololens IVAS (integrated visual augmentation system) will be delivered to U.S. soldiers in the field this year, with the intention of giving them visual superpowers similar to those found in first-person shooting games such as Call of Duty or Battlefield. Microsoft hasn’t yet hinted at when the commercial version of the Hololens will be made available.
Imagine the potential here. We’re talking about an AR system so advanced that it helps soldiers fight in real, physical battlefields more effectively. It’s only a matter of time before such technology will be available for consumers, and that will change the gaming paradigm and, by extension, make the metaverse much more viable.
One of the most potent critiques of the metaverse was that the vast majority of people would rather do things in the physical world than the digital world, because most metaverse applications weren’t nearly convincing enough at simulating, let alone enhancing, real life. Tech that can enhance a soldier’s performance in battle can likely enhance other real-world experiences too, making the oft-leveled metaverse critique irrelevant.
And consumer-level AR headsets already on the market allow gamers to get an even more realistic simulated battlefield experience. Arcade-style VR game guns are also already available and will become more prevalent as growing numbers of gamers migrate from traditional consoles to AR/VR-compatible gaming systems. With time and improved headsets and equipment, AR will be able to revolutionize the gaming experience.
Of course, military or first-person shooter-style games aren’t the only types of games that stand to benefit from future developments in AR. The same holds true for sports games, open-world, online battle royale games, role-playing games (RPG) and other genres.
Better training for firefighters and cops These days, we can take predictions from foreign militaries in addition to the U.S. military. The Korean Military Academy , for example, began working with local technology companies to develop AR and VR combat training programs for its army cadets. The use of AR and VR to simulate combat training allowed South Korea to save considerable amounts of money on large-scale exercises while simultaneously reducing the chances of injuries and serious accidents.
In civilian life, firefighters can use VR to effectively simulate the physicality of complex, high-pressure rescue scenarios without risking injury. Using advanced AR equipment, a firefighter can train and improve on every aspect of the job, ranging from rapidly sliding down the pole and putting on their uniform to carrying a person from a burning building, all while in the safety of the firehouse.
Police officers, too, could benefit from life-like virtual training. Debates surrounding police brutality in the U.S. over the past years have brought to light the fact that cops simply don’t get enough training — either before they start doing field work or after years of service. VR can dramatically change that. The same principle applies to disaster-relief teams and other professions that require physical action and judgment calls in high-intensity environments.
A more immersive retail experience AR and VR can also be effective tools for retail businesses to engage their customers in more meaningful and creative ways. With an AR headset, a basketball and a small hardwood court, stores like Footlocker could give their customers a more immersive shoe-buying experience by enabling them to visualize themselves with their shoes in an augmented basketball setting. Fashion stores can do the same for their customers by allowing them to try on items and see how they will match with various shoes and other accessories.
No more video chat for repair instructions Beyond simulating combat training, militaries are also using AR/VR to service and maintain vehicles, aircraft and other equipment. Through AR software and hardware, maintaining and repairing these complex machines becomes easier and more convenient, especially when the personnel with manual or deep technical know-how aren’t on site.
Since AR is being used for maintaining heavy machinery in the military sector, it can easily be used in the same capacity in mechanic shops, factories and construction sites. Imagine a scenario in which a conveyor belt breaks down in a Tesla plant, making it impossible to move parts around the factory, and the technician capable of handling the repairs is hundreds of miles away. Through AR software, the plant’s shift manager can contact the technician who can then instruct the on-site staff how to get the conveyor belt up and running.
With AR increasingly sought-after and used by militaries, it is only logical to predict that these advanced use cases will be adopted and tailored for civilian life. Despite both AR, and especially VR, having a way to go before becoming a daily fixture in our lives, Big Tech players are already diligently investing and building the hardware infrastructure they require. Now it’s up to software engineers and Web3 developers to keep pace with them.
Adrian Krion is CEO and founder of Spielworks.
"
|
2,587 | 2,023 |
"Emperia outfits Tommy Hilfiger with cross-metaverse virtual hub | VentureBeat"
|
"https://venturebeat.com/virtual/emperia-outfits-tommy-hilfiger-with-cross-metaverse-virtual-hub"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Emperia outfits Tommy Hilfiger with cross-metaverse virtual hub Share on Facebook Share on X Share on LinkedIn Premium lifestyle brand Tommy Hilfiger today launched a cross-metaverse virtual hub in partnership with 3D technology and virtual reality (VR) platform provider Emperia.
As part of the launch, the retailer is simultaneously unveiling several virtual experiences across various platforms, including Decentraland, Roblox, Spatial, DressX and Ready Player Me.
To simplify the process of navigating among these virtual worlds, the Emperia platform will provide a central hub to easily move in and out of each experience.
The Tommy Hilfiger metaverse hub will be available online starting today.
Because today’s metaverse distribution requires placement on multiple non-integrating platforms, companies face special challenges in assuring consistent branding. Emperia’s interoperable approach, as in its collaboration with Tommy Hilfiger, bridges the gap and enables multi-world experiences to be integrated into brands’ Web3 and ecommerce strategies.
Emperia’s gateway connects fragmented environments, providing users with a seamless experience that enables them to enjoy each platform’s unique capabilities without having to choose among them. As a result, the hub creates a new, unique brand experience said to exceed physical retail alternatives.
Streaming a cross-metaverse with style The Emperia-created hub features DressX-powered digital fashion, Web3 artist collaborations with Vinnie Hagar, AR features, a photo booth, gamification and a community-focused competition to create AI fashion. Tommy Hilfiger’s well-known “TH” monogram will appear across all platforms, creating a unified digital brand story, while providing movement between the retailer’s website and the various metaverses, delivering an end-to-end shopping journey with a unique impact.
Emperia’s rendering capabilities standardize graphic quality across platforms, creating an easily accessible and high-performance experience without requiring users to download any special software. The experience is available on almost any device.
Some metaverse platforms limit payment options to cryptocurrency.
By integrating with the retailer’s ecommerce platform, Emperia provides users with a wider range of payment options, reducing friction and increasing user confidence, resulting in higher online sales.
Overall, with the cross-metaverses hub, Emperia aims to introduce a new layer of interoperability, blurring the frontiers of Web3, and enabling connections between the metaverse, ecommerce, entertainment and direct performance, all backed by data. Meanwhile, Emperia’s dataset capabilities allow granular insights into the user journey and engagement across different metaverse experiences, enabling a cross-data approach that was never offered before.
Emperia and Tommy Hilfiger: Dressed for iconic success The Tommy Hilfiger digital hub also aims to improve the product experience by offering four exclusive items, with the iconic Varsity Jacket taking the lead, presented in various aesthetic representations across all platforms.
Customers can purchase the jacket in two forms: physical, connected to Tommy’s ecommerce platform, and digital, connected to the DressX digital fashion platform. The Emperia hub provides access to the physical jacket for sale, while the Ready Player Me platform offers the digital version, which can be used across various games and environments, increasing the interoperability options.
“Emperia is continuing to change the face of virtual retail, pushing the envelope and supporting retailers along their ecommerce transition journey,” said Olga Dogadkina, co-founder and CEO of Emperia. She highlighted the industry’s movement towards collaboration, with each technology vendor leveraging its unique capabilities and traits under Emperia’s virtual environments.
The collaboration with Tommy Hilfiger and the PVH group is an example of this, creating a brand-new digital retail environment that enhances user experience and encourages brand engagement and shopper loyalty by consolidating the fragmented industry into a streamlined experience.
The ultimate goal is to increase ecommerce performance by centralizing the payment process and allowing users to freely navigate across the retailer’s online properties.
"
|
2,588 | 2,023 |
"Approaching the issue of diversity in the tech industry | VentureBeat"
|
"https://venturebeat.com/virtual/approaching-the-issue-of-diversity-in-the-tech-industry"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Guest Approaching the issue of diversity in the tech industry Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
While the number of women in STEM has steadily increased since 1970 — when they only made up 4% of the industry’s workforce — that number is just 27% today. Deloitte Insights reported that one in four leadership positions at large global technology firms were held by women in 2022.
This all sounds promising, but compared to the overall proportion of women in the workforce, it would be remiss to say this is sufficient. Moreover, only one in 20 of those women in leadership are women of color. So what needs to be done to create more inclusivity and increase opportunities for women in STEM? Pursue and promote an inclusive culture Inclusivity touches every aspect of culture. It can be difficult to know where to start when building an inclusive culture , but it’s important to understand what the overarching goal is: Making all employees feel that they can bring their authentic selves to work and are set up to be successful in their roles. This is an ongoing process that can be supported through a number of strategies, but here are a few that I have found particularly impactful as a mentor, leader and woman in tech.
Articulate a vision for diversity and inclusion Define clear success criteria for what a cross-functional inclusive culture looks like at your organization. Similarly, ensure that everyone — from leadership and hiring managers to interviewers and individual contributors — is aware of how inclusivity and diversity positively affect the bottom line. Making this clear is important to gain buy-in and is often not something easily comprehended. Particularly across global teams, make sure everyone can answer the question, “Why do we at this company care about diversity and inclusion?” Focus on and emphasize the importance of solid onboarding Set new joiners up for success with a solid onboarding process at every level. Make sure they are introduced to folks cross-functionally, as well as their coworkers. Not only does this drive cross-functional exposure and dissemination of ideas and goals, but it opens up the possibility for people to find more similarities among their peers.
Re-examine your employee training programs Provide training that aligns well with your inclusive culture and articulates well what it means to be inclusive and accepting of others, regardless of background. This is particularly important in global organizations where unique cultures have different traditions and practices. Hold everyone at all levels responsible and accountable for creating and maintaining that inclusive culture by training, re-training, and evaluating practices at a regular cadence.
While establishing and maintaining inclusivity is incredibly important in the drive for representation, it’s only half the battle. Backing up an inclusive culture with a diverse workforce is paramount, and vice versa. Without an inclusive culture, team members from diverse backgrounds won’t be able to do their best work — hence, diversity and inclusion go hand-in-hand.
Organizations need to recognize that upholding inclusivity and increasing opportunities for underrepresented groups such as women in tech requires an ongoing, concerted effort that goes against the grain of conventional practices. Leaders must step outside of their comfort zone and make themselves vulnerable and open to change.
Increase opportunities for women in tech both internally and externally Within any organization, senior leadership must be aware of current demographics and representation, and make certain that diverse voices are present — and even more importantly, heard. This includes all aspects of the employee journey, from hiring to daily interactions to promotions. Strategies for doing this include: Provide a venue for employees from different backgrounds to connect Whether it’s a Slack channel for LGBTQ+ employees, an employee resource group (ERG) for women in tech or a monthly lunch with a guest speaker focused on diversity, ensure that there are venues for employees to discuss and raise issues.
By encouraging these group events, companies can provide opportunities for underrepresented employees to network and build each other up. Creating networks and relationships is particularly critical for employees who might be entering their first job or a new role where they’re seeking guidance on career development opportunities.
Tackling diversity: Create clear career development programs In establishing clear career paths, employees from all backgrounds should understand how to advance in their careers. In a similar vein, organizations can work to remove personal biases from promotion decisions. No matter how a company chooses to approach the issue of diversity , it’s essential that underrepresented groups and voices are heard and amplified during the career processes a new employee faces.
There’s no one-size-fits-all approach to improving the lack of women and broader diversity in the tech industry, but it’s essential that we acknowledge and accept that this is an important issue and take steps to end these inequities. It’s on all of us, particularly those in leadership, to work towards making a company culture one that not only possesses diversity but advocates for it and promotes inclusivity.
Colleen Tartow is director of engineering at Starburst.
"
|
2,589 | 2,023 |
"SandboxAQ unveils Sandwich, an open-source meta-library of cryptographic algorithms | VentureBeat"
|
"https://venturebeat.com/security/sandboxaq-unveils-sandwich-an-open-source-meta-library-of-cryptographic-algorithms"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages SandboxAQ unveils Sandwich, an open-source meta-library of cryptographic algorithms Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
SandboxAQ , an AI-driven quantum technology platform, has unveiled “ Sandwich ,” an open-source framework that aims to reshape contemporary cryptography management. As per the company, the platform intends to propel organizations toward cryptographic agility.
It furnishes developers with a unified API, enabling the integration of chosen cryptographic algorithms into applications. According to SandboxAQ, this agility permits adaptation to evolving technologies and threats and mitigates the necessity for code rewrites.
Moreover, Sandwich empowers developers with heightened observability and control over cryptographic operations, fortifying overall cybersecurity protocols.
“The traditional way of managing cryptography has not kept pace with the demands of new technology stacks and agile development practices,” Graham Steel, head of product at SandboxAQ’s quantum security group, told VentureBeat. “Compounding this is the need for greater cryptographic agility to help protect organizations against current and future threats posed by quantum computers. Our API helps make it easy for developers to avoid the mistakes typically made when manipulating cryptography at a low level, and allows audit teams to rapidly verify that cryptography is used according to policy.” Crypto-agile architecture Steel underscored the fact that Sandwich’s abstraction of cryptography from application code engenders a crypto-agile architecture, enabling developers to fluidly update and replace algorithms as needed. The API facilitates cryptography layer updates, ensuring application integrity without the apprehension of disruptions or supplemental coding demands.
The framework incorporates libOQS, streamlining access to novel post-quantum cryptography (PQC) algorithms devised by The National Institute of Standards and Technology (NIST).
Additionally, it supports multiple languages (C/C++, Rust, Python, and Go) and operating systems (MacOS, Linux), providing developers with the flexibility to work in their preferred environment and easily access several popular cryptographic libraries (OpenSSL, BoringSSL), including new post-quantum cryptography (PQC) algorithms from NIST.
“By supporting multiple languages, operating systems and cryptographic libraries, we aim to make it easier for developers to securely implement cryptography into their applications while giving them the flexibility to work in their preferred coding environment,” Steel told VentureBeat. “Cryptographic libraries only offer predefined functions and typically lack flexibility or customization options. Sandwich creates an abstract layer between these libraries and the developer’s preferred programming environment, managed by the Sandwich API.” Streamlining cryptographic security and management Steel asserts that Sandwich expedites the implementation of application-based cryptography by embracing modern DevOps practices.
The framework offers industry-standard protocols, simplifying the adoption and integration of proven cryptographic methods into applications. These methods are available at runtime as cohesive cryptographic objects referred to as “sandwiches.” As per the company, the framework facilitates a three-step process, streamlining “sandwich” creation and reducing implementation complexity. Developers select the desired protocol (TLS 1.3) and the preferred implementation (OpenSSL+libOQS). Sandwich then constructs these components into a Sandwich object, establishing a secure tunnel that interfaces with the application via the Sandwich API.
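To make that three-step flow concrete, here is a minimal, hypothetical Python sketch. The names used (TunnelConfig, build_tunnel) and the backend label are illustrative assumptions rather than Sandwich's actual API; the point is simply that the application holds a configuration and a tunnel handle, so changing the protocol or the underlying library is a configuration change rather than a code rewrite.

```python
# Illustrative sketch only: TunnelConfig and build_tunnel are hypothetical
# stand-ins for the three-step flow described above, not the real Sandwich API.
from dataclasses import dataclass


@dataclass
class TunnelConfig:
    protocol: str        # step 1: desired protocol, e.g. "TLS_1_3"
    implementation: str  # step 2: chosen backend, e.g. "openssl+liboqs"


def build_tunnel(config: TunnelConfig) -> dict:
    """Step 3: assemble the chosen protocol and backend into one 'sandwich'
    object that exposes a secure tunnel to the application."""
    # A real framework would validate the configuration here and return a
    # tunnel handle; a plain dict stands in for that handle in this sketch.
    return {"protocol": config.protocol, "backend": config.implementation}


# The application only ever touches the config and the tunnel handle, so
# moving to a different (for example, post-quantum) backend is a one-line
# configuration change rather than an application rewrite.
cfg = TunnelConfig(protocol="TLS_1_3", implementation="openssl+liboqs")
tunnel = build_tunnel(cfg)
print(tunnel)
```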
“Our API helps ensure that the application’s cryptography is implemented correctly and securely, checking newly updated cryptography for configuration errors, performance issues, and vulnerabilities,” Steel told VentureBeat. “It also facilitates crypto-agility by enabling developers to quickly swap out cryptographic libraries as technologies and threats evolve, without having to re-write any code.” Programming flexibility Steel explained that the framework’s abstraction provides programming flexibility and safeguards developers from the intricacies of cryptographic library utilization. Once integrated, the Sandwich framework empowers developers to swiftly and effortlessly update their cryptography through the API, eliminating the need for code rewrites.
He asserts that this approach expedites the transition of applications to production, eliminating bottlenecks in cryptography management.
“Crypto-agility will become a necessity with the emergence of fault-tolerant quantum computers, which will require the adoption of PQC algorithms,” he added. “With Sandwich, developers can take a self-serve approach to implementing cryptography without direct input from cryptographers or other security experts. We aim to enable developers to quickly swap out cryptographic libraries as technologies and threats evolve — without having to re-write any code and help ensure that the application’s cryptography is implemented correctly and securely, checking newly updated cryptography for configuration errors, performance issues, and vulnerabilities.” Steel claims that quantum computers’ ability to break public-key encryption will necessitate a global shift to NIST’s new post-quantum cryptography (PQC) algorithms to protect sensitive personal, business and government data.
Extended access to PQC algorithms Steel emphasized that incorporating the libOQS library into Sandwich extends developers’ effortless access to NIST’s PQC algorithms. This facilitates experimentation with the integration of cutting-edge cryptographic techniques at the application level, enabling the identification of the optimal balance between security and performance.
“Fully transitioning an organization to PQC and implementing crypto-agility could take years, depending on the size and complexity of the organization’s IT infrastructure,” said Steel. “However, by building crypto-agility directly into their applications, organizations can get a head-start on their PQC transition and strengthen this key element of their overall cybersecurity posture.” SandboxAQ also announced that it has launched its Security Suite, which handles the discovery and remediation of cryptographic vulnerabilities through crypto-agile encryption management.
Faster, easier transition to PQC The company claims that a broad range of U.S. government agencies and enterprises are already using Security Suite — including the U.S. Air Force, the Defense Information Systems Agency (DISA), the U.S. Department of Health and Human Services, SoftBank, Vodafone, Cloudera, Informatica and several other global banks and telecommunication providers.
SandboxAQ also highlighted its internal use of the Sandwich library across multiple dimensions, catalyzing research and development efforts while infusing crypto-agility into its products.
“Our framework makes it easy for organizations to swap cryptographic elements, and the API ensures that they’re not overlooking any crucial steps that would make their applications — and their organization — more vulnerable to cyber-attacks,” Steel told VentureBeat. “By embedding a crypto-agile architecture into their applications, developers can help make their organization’s overall transition to PQC easier and faster.”
"
|
2,590 | 2,023 |
"Cosmic Wire raises $30M to expand cross-chain Web3 platform | VentureBeat"
|
"https://venturebeat.com/security/cosmic-wire-raises-30m-to-expand-cross-chain-web3-platform"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Cosmic Wire raises $30M to expand cross-chain Web3 platform Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Cosmic Wire , a Web3 and blockchain solutions technology company, announced that it has successfully completed its series seed round, raising $30 million in funding. The investment round was led by prominent investors Solana Foundation and Polygon, positioning Cosmic Wire as the first-ever cross-chain funded Web3 company.
The company intends to use the funding to accelerate the development of a decentralized, transparent and secure digital Web3 ecosystem. It plans to expand its Web3 ecosystem through cross-chain technology, facilitating data transfer and interoperability across diverse blockchains.
The company said its primary objective is to give users complete control over their Web3 data and online interactions, fostering trust and individual sovereignty.
“As a startup, we focused entirely on our vision and building working prototype systems that were sorely missing in Web3 and what we considered to be natively necessary for mass adoption,” Jerad Finck, CEO and founder of Cosmic Wire, told VentureBeat. “This investment will allow us to scale and deliver our products and technology ready for mass adoption.” Finck stated that his company’s philosophy contrasts sharply with other platforms, as it does not involve trading access to users for ownership. Instead, the company embraces a decentralized approach through cross-chain technology, ensuring interoperability and security “in a true zero-trust environment with no backdoor access, and transparent as all data usage is containerized and time-stamped on the chain. It is, simply put, algorithmic systems built to stabilize and scale with transparency, security and accuracy,” said Finck. “The chain is the internet; the people are the nodes.” The company is constructing its Web3 infrastructure on the Solana network. It asserts that its metaverse SDK solutions substantially decrease development time for high-fidelity, 3D, browser-based metaverse experiences. These solutions also integrate ecommerce, digital products, content CDNs, payment methods and avatar user-generated content (UGC).
Streamlining blockchain access through cross-chain technology Finck emphasized that at its core, blockchain functions as an immutable ledger. “We have placed all processes on the chain, including commodity trades, tokenized access, API connections and basic system functions. These processes are driven by an access layer of soul-bound identity, which creates localized ecosystems capable of algorithmically scaling existing and proven processes,” Finck explained to VentureBeat.
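As a rough illustration of that access-layer idea, the hypothetical Python sketch below models a registry of non-transferable ("soul-bound") identities gating access to processes, with every invocation recorded and time-stamped. The class and method names are assumptions made for illustration; this is a conceptual toy, not Cosmic Wire's actual implementation.

```python
# Conceptual toy only: soul-bound (non-transferable) identities gate access to
# processes, and every use is time-stamped. Names are illustrative assumptions.
from typing import Optional
import time


class SoulBoundRegistry:
    """Identities bound to a single holder; once minted they cannot be reassigned."""

    def __init__(self) -> None:
        self._owners: dict[str, str] = {}

    def mint(self, identity_id: str, holder: str) -> None:
        if identity_id in self._owners:
            raise ValueError("identity already exists and cannot be transferred")
        self._owners[identity_id] = holder

    def holder_of(self, identity_id: str) -> Optional[str]:
        return self._owners.get(identity_id)


class ProcessLedger:
    """Append-only record of process invocations, each one time-stamped."""

    def __init__(self, registry: SoulBoundRegistry) -> None:
        self._registry = registry
        self.entries: list[dict] = []

    def invoke(self, identity_id: str, process: str) -> dict:
        holder = self._registry.holder_of(identity_id)
        if holder is None:
            raise PermissionError("unknown identity: access denied")
        entry = {"identity": identity_id, "holder": holder,
                 "process": process, "timestamp": time.time()}
        self.entries.append(entry)  # stand-in for writing to an immutable ledger
        return entry


registry = SoulBoundRegistry()
registry.mint("id-001", "alice")
ledger = ProcessLedger(registry)
print(ledger.invoke("id-001", "tokenized_access"))
```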
He clarified that his company does not radically alter the foundation of operations. Instead, it focuses on leveraging existing systems more effectively and efficiently. He asserts that the company’s cross-chain technology can benefit every Web2 or preexisting system, as it centers on the core values of accuracy, security and transparency in these functions.
The company secured a significant seed round, even amid a challenging market, with leading entities as key investors. Its toolkit has already been licensed into global infrastructure for widespread adoption.
“We aim to revolutionize business practices, communication methods and asset protection, providing users with true ownership rather than mere user status. The entire cross-chain software ecosystem we’ve developed revolves around these core principles,” Finck said.
He highlighted that users will experience a fundamental shift in the cross-chain system, evolving from mere users to empowered owners. Individual ownership is empowered through a global mesh network, enabling trustless interactions within a fully algorithmic and transparent approach, facilitating streamlined and scaled connections.
In addition to the funding milestone, Cosmic Wire has secured participation in Google Cloud’s eagerly awaited Web3 startup program, even before its official launch. Through this program, the company will gain exclusive access to customized resources, including a substantial allocation of Google Cloud credits for two years, unprecedented entry into Google’s Web3 ecosystem and a range of complimentary benefits.
“We did not build into verticals; we built an ecosystem that can be applied virtually anywhere,” said Finck. “We are already in fields from military development to finance, medical to education, entertainment to fantasy; it is not about the genre but about how the tool can be used and implemented to new degrees of extrapolation and efficacy that have not been achieved until now.”
"
|
2,591 | 2,023 |
"Why gender parity in hiring and promotions is crucial to the bottom line | VentureBeat"
|
"https://venturebeat.com/programming-development/why-gender-parity-in-hiring-and-promotions-is-crucial-to-the-bottom-line"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages VB Spotlight Why gender parity in hiring and promotions is crucial to the bottom line Share on Facebook Share on X Share on LinkedIn Presented by Skillsoft Today, staying competitive while the economy is in constant flux requires a diverse workforce — and addressing the lack of women in tech roles is key. In this VB Spotlight, learn why gender parity directly impacts the bottom line, and how to find and support talented women, and more.
Register now to watch this free, on-demand webinar! First there was the Great Resignation, alongside high turnover rates and major demographic shifts in the working population; it follows that the shortage of skilled talent continues, especially in the technology industry. The struggle to fill crucial job roles dovetails neatly with the issue of diversity in the labor force. Women are historically underrepresented in tech fields — but why are qualified women still being overlooked? In this VB Spotlight, Kelly Deich, executive director, learning and development & chief learning officer at security company ManTech, and Codecademy’s Koma Gandy, VP head of tech and dev content, dove into the need to get more women into the technology field and help them grow and succeed, why it makes business sense, and more.
Key challenges and barriers One of the key challenges for women, not just in the technology world but across roles and industries, is representation, says Deich. It starts with affinity bias, and then compounds from there. The tendency as humans is to socialize with, hire and promote people who are like us — similar appearance; similar experiences. So, in an industry that’s already overwhelmingly male, diversity in hiring becomes a real challenge.
“We need to start challenging that affinity bias within our organizations to focus on finding employees that are going to bring diverse ideas to the table, allowing us to expand the aperture of the types of people that are involved in this field to include women and people of color,” she says.
There’s a pay gap, but also a promotion gap, she adds, with only 86 women promoted into manager roles for every 100 men. Lack of recognition is one of the major factors behind high turnover rates, and companies lose the diverse perspectives that are crucial to innovation and staying competitive.
On top of that, the COVID pandemic saw almost 2 million women leave the workforce.
Companies need to create opportunities for non-conventional returns to work and provide women with tools so they can re-enter the workforce and be successful.
A lack of insight and hard numbers around the extent of the issue is another challenge. Companies need to feel comfortable in collecting metrics, Gandy says.
“We need to understand the places where we’re seeing women either getting stuck, disappearing or not being represented in the opportunities that lead to more senior roles in an organization,” she explains. “You have to be able to find and root out pockets of unconscious bias in the organization, whether it’s with hiring managers who may be accidental gatekeepers, or talking and working with managers who are looking for people and trying to cast their net wide to find people who are suitable for promotion.” Addressing systemic and organizational biases Putting processes and policies in place mitigates the impact of bias on finding great candidates for more senior opportunities, and identifying qualified internal employees to advance, take on more responsibility, or even move into brand-new roles. And companies with an inclusive business culture and policies dramatically increase profitability, productivity and innovation and gain an enhanced reputation, which in turn enhances their ability to attract and retain talent.
These intentional processes, along with robust professional development and advancement programs, need to address both systemic issues and organizational culture. At ManTech, the annual review process has been replaced by a career enablement program, Deich says, to promote regular self-reflection and engagement with managers, so that employees are able to personally tailor their career paths.
“I think that’s a way that we can start to address some of those gender inequalities, because it’s not a one-size-fits-all recipe for what advancement looks like,” she says. “Every individual is going to have a different way to get to where they want to be.” Increasing the external pipeline is also crucial, such as establishing internship programs and actively seeking out diverse candidates instead of waiting for them to apply, and so on.
“Getting talent early, developing them, helping them stay and see a place for themselves in the future — these are all areas where we can make an impact in bringing a more diverse worker population,” says Deich.
Looking for qualified candidates internally — ones with the potential to grow into new roles and step into larger roles — is increasingly essential as well. That means developing quality upskilling and reskilling programs that embrace how the world of work has changed.
Upskilling and reskilling internal candidates Internal candidates are a goldmine of company knowledge and savvy. Upskilling and reskilling both prepares employees to be more effective in their current roles, but also advance or even take on new responsibilities and fill in a company’s skill gaps. They also address the changes in the way employees engage with their work today, with the explosion in remote work and geographically dispersed teams, and an increasingly digital world. As a result, career trajectories across the board, tech and non-tech alike, have transformed.
“We need to be able to create educational programs that are reflective of where that individual is in their own career, what that career looks like and where it’s going,” Gandy says. “Once upon a time people thought coding was just coding. I’m going to sit down, learn a programming language, tap a keyboard, and that’s what tech is. That has evolved, especially over the last several years and post-COVID. Everyone is touching some aspect of tech and being able to figure out how to empower people with tech – perhaps taking some of those notions of what tech is and refreshing them with what tech can be.” In other words, empowering people with knowledge and introducing them to concepts that can help them in their career arc. But technology skills don’t operate in a vacuum, Deich says. You need to know your way around technology, but also develop what are often called soft skills. Ultimately, there is a learning opportunity for everyone.
“We really need to see both of those being available, and having opportunities to bring them together and show how they connect,” she says. “Being able to see each other’s viewpoints and seeing the value in having that well-rounded background will help advance whatever it is we’re trying to do from a business perspective.” To hear more about the pivotal role gender equity plays in a company’s success, how to change an organization’s culture from the inside out, why upskilling, mentoring and sponsorship programs are important – and how to effectively tailor them to your employees, don’t miss this VB Spotlight event.
Register to watch free on-demand! Agenda The critical need to uncover and address gender inequities now Enabling women with onboarding, reskilling and upskilling to future-proof company strategy Benchmarking to uncover inequities and create a robust learning program Optimizing and implementing equal opportunity curriculums Presenters Kelly Deich, Executive Director, Learning and Development & Chief Learning Officer, ManTech Koma Gandy, VP Head of Tech and Dev Content, Codecademy Carrie Goetz, Amazon Best-Selling Author (Moderator)
"
|
2,592 | 2,023 |
"The critical need to uncover and address gender inequities in tech | VentureBeat"
|
"https://venturebeat.com/programming-development/the-critical-need-to-uncover-and-address-gender-inequities-in-tech"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages VB Spotlight The critical need to uncover and address gender inequities in tech Share on Facebook Share on X Share on LinkedIn Presented by Skillsoft Gender inequity in the tech industry has a direct impact everywhere from daily business to the bottom line. In this VB Spotlight, learn why it’s crucial to encourage women to pursue and advance in tech roles to stay competitive.
Register to watch free on-demand! In 2020, the Global Gender Gap Report found that based on efforts at the time, it would take a century to achieve gender parity. But now the Covid-19 pandemic threatens to set that figure back yet another decade.
It’s not only that pandemic job cuts disproportionately affected women across industries, but women — and their careers — also bore the brunt of urgent caregiving needs. Plus, economic slowdowns, such as the one triggered by the pandemic, also push gender equality efforts far down organizations’ priority lists — even as companies scramble to find talent for the increasing number of empty or understaffed tech roles.
“Women need to find meaningful careers, as well as opportunities to advance in their current roles, especially in the face of changing workforce demands,” says Koma Gandy, VP head of tech and dev content, Codecademy. “But it’s much harder for women to find mentors and champions in the organization who will put them in the right conversations and make them available for significant career-changing opportunities.” It’s a two-pronged issue, she adds. Women who want to advance may not know how or where to go to get that guidance or support, but on the other side is the organization itself.
“Leaders must be willing to look in the mirror and say, ‘Am I doing everything I possibly can to make an equal and level playing field?’” she says. “There is a lot that can be done to make sure the burden is not being placed on women. If I am in a senior leadership role, it is my duty to the organization to examine unconscious biases when identifying the right candidate for the right job, and ensuring they’re doing everything they can to create an opportunity where everyone in the organization can be equally successful.” Measuring the impact of inequities To address gender inequality, it’s crucial to identify metrics that not only expose inequities, but add a measurable component to solutions that address these issues, Gandy says – and tech organizations, which thrive on data, tend to have a good handle on metrics. It’s also a way to make the situations and elements that hinder the progress of women more real and tangible to stakeholders.
“It can be something as simple as looking at how many women are represented, from the most junior to the most senior levels of the organization, where specific break points appear, and where an unusual number of people are not making the jump from one level to the next,” she says. “That’s where the conversations start, and not just within leadership level, but with these women themselves.” Here’s where the next phase of data comes in, when women employees share their own perspectives – whether they feel prepared to take on opportunities of greater seniority, or if they have the support to do so. Whether they feel as if they’re missing the skills and learning they need to build the confidence, or whether they need access to the kinds of conversations that lead to high-profile projects, or more opportunities to participate in bigger ways.
“The most fundamental question is, why are women leaving, and why are women not advancing?” Gandy says. “Being willing to listen helps uncover problems that are often hidden and not considered, but are in fact tremendous stumbling blocks.” Some of these issues can be addressed easily, such as replacing a restrictive maternity or parenting policy with more flexible ones, or providing support for the time, money and stress that elder care requires. Pockets of unconscious bias throughout the organization are another frequent culprit, creating barriers in the hiring process that inadvertently weed out women candidates, or meaning that women do not come up in conversations about advancement. And when new skills are required, upskilling and reskilling women can help fill those gaps.
“It’s about making sure that those who are in key decision-making roles recognize some of the things that might hinder them from finding highly qualified, highly motivated women to advance in their organizations,” she says.
The need for upskilling and reskilling The digital transformation that has swept every organization means that employees in every role are being called on to use brand-new technology tools and platforms in sophisticated ways. Reskilling and upskilling is a way for organizations to ensure employees stay abreast of changes and new demands in their roles. Upskilling also means delving into a known talent pool when a new role appears, rather than having to trawl through an incredibly competitive hiring market.
A robust learning program also has to recognize that every role requires a combination of hard and soft skills. An employee must demonstrate their mastery of certain technologies, concepts and frameworks, but organizations need to recognize that these tech skills are not the be-all, end-all. If someone is moving into a role where they’ll have to facilitate cross-functional teams, how do you give them the tools to do that effectively? How does a new manager grow in their role? “It’s about recognizing that it’s the combination of hard skills, like applying concepts and technologies, with soft skills, such as managing and working with people, that make organizations more effective,” Gandy says. “Then putting that together in a package so employees feel like they have access to what they need, in a continuum that makes sense, and helps them advance their careers wherever they are at that point. Plus it also provides opportunities for jumping off points.” To learn more about the competitive advantage of gender equality in an organization, how to uncover and address inequities, build a skilling program and more, don’t miss this VB Live event! Register to watch free on-demand! Agenda The critical need to uncover and address gender inequities now Enabling women with onboarding, reskilling and upskilling to future-proof company strategy Benchmarking to uncover inequities and create a robust learning program Optimizing and implementing equal opportunity curriculums Presenters Kelly Deich, Executive Director, Learning and Development & Chief Learning Officer, ManTech Koma Gandy, VP Head of Tech and Dev Content, Codecademy Carrie Goetz, Amazon Best-Selling Author (moderator)
"
|
2,593 | 2,023 |
"Smart contracts might not be as smart as you think | VentureBeat"
|
"https://venturebeat.com/programming-development/smart-contracts-might-not-be-as-smart-as-you-think"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Guest Smart contracts might not be as smart as you think Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Blockchain technology has piqued the interest of enterprises worldwide. Its advantages, including immutability and transparency, have led legacy companies outside of finance, such as BMW and Bosch, to experiment with smart contracts to create more efficient supply chains and make smarter engineering products.
Smart contracts, which are essentially software coded into a specific blockchain, formalize and execute agreements between multiple parties, removing the need for a trusted third-party intermediary, saving time, and allowing a multi-party consensus-based validation. They can be used across a variety of activities, such as wills, chess games and even transferring deeds.
But despite all the disruptive potential and the highly-touted capabilities blockchain promises, the number of heists targeting smart contracts has risen more than 12-fold over the last two years. If they are so smart, why are we seeing such a massive uptick in heists? To better understand, let’s clarify the relationship between blockchain and smart contracts.
Decentralization Think of a blockchain network like Amazon’s AWS platform and each one of its smart contracts as a server. With blockchain, there isn’t a single centralized server for hackers to exploit, making it more difficult for cybercriminals to use traditional hacking methods, such as Trojan horses, physical attacks and ransomware.
Blockchain counters these by eliminating a network’s single point of failure.
While a blockchain network can’t exactly be hacked, many distributed apps and smart contracts that blockchain facilitates can.
Thanks to the gradually growing success and influence of decentralized finance (DeFi) , large amounts of value are being funneled through smart contracts, making them appealing to hackers. And this threat will likely only grow as more assets move on-chain with the rise in tokenized real-world assets. Hacking poses a serious threat to this burgeoning blockchain sector because assets nicked from smart contracts are extremely difficult to recover.
Threats to smart contracts Like all code, smart contracts are subject to human error. These errors can come in the form of typos, misrepresentations of specifications, or more serious mistakes that can be used to hack or “trick” the smart contract. Unlike the underlying blockchain protocol, there is no guarantee that the contracts have been peer-reviewed or validated.
While faulty coding may be avoided by a smart contract audit, other threats are more complex. The default-visibility vulnerability, for example, is a common mistake that occurs when the visibility of functions is not specified and certain functions are left public. For example, hackers could access the mint function and create billions of relevant tokens. Fortunately, this vulnerability can be prevented by running an audit that ensures all functions are set to private by default.
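To illustrate the idea outside of any particular contract language, here is a minimal Python sketch; the class and method names are invented for illustration and the logic is only an analogy for on-chain visibility rules. The first mint function is effectively "public" and lets any caller inflate the supply, while the guarded version enforces the access control a visibility audit is meant to confirm.

```python
class Token:
    """Toy token ledger used to illustrate the default-visibility mistake."""

    def __init__(self, owner):
        self.owner = owner
        self.total_supply = 0
        self.balances = {}

    # Effectively "public": any caller can mint and inflate the supply.
    def mint_unguarded(self, caller, to, amount):
        self.balances[to] = self.balances.get(to, 0) + amount
        self.total_supply += amount

    # What an audit should enforce: minting restricted to the owner,
    # the rough analogue of correctly restricting function visibility.
    def mint(self, caller, to, amount):
        if caller != self.owner:
            raise PermissionError("only the owner may mint")
        self.balances[to] = self.balances.get(to, 0) + amount
        self.total_supply += amount


if __name__ == "__main__":
    token = Token(owner="deployer")
    token.mint_unguarded(caller="attacker", to="attacker", amount=10**9)
    print(token.total_supply)  # supply inflated by an arbitrary caller
```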
Another more complicated and serious threat caused by coding errors is a reentrancy attack.
This happens when an attacker takes advantage of the smart contract’s external function calls and deploys a malicious smart contract to interact with the one holding the funds.
In 2016 the DAO incident, which occurred in the early days of Ethereum, demonstrated just how dangerous this type of attack can be and, ultimately, led to the creation of Ethereum Classic. Preventing reentrancy attacks isn’t simple, but there are frameworks and patterns that can mitigate the damage, including CEI (checks, effects, interactions), reentrancy guards and more.
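For readers who want to see the mechanics, below is a simplified Python simulation of the pattern; it is a sketch of the logic, not contract code, and all names are invented. The vulnerable vault pays out before updating its ledger, so a malicious callback can re-enter and drain it, while the second version follows the checks-effects-interactions ordering.

```python
class VulnerableVault:
    """Pays out before updating state, so a re-entrant caller can drain it."""

    def __init__(self):
        self.balances = {}

    def deposit(self, user, amount):
        self.balances[user] = self.balances.get(user, 0) + amount

    def withdraw(self, user, send):
        amount = self.balances.get(user, 0)   # check
        if amount == 0:
            return
        send(amount)                          # interaction happens first (the bug)
        self.balances[user] = 0               # effect happens last


class SafeVault(VulnerableVault):
    """Checks-effects-interactions: zero the balance before sending funds."""

    def withdraw(self, user, send):
        amount = self.balances.get(user, 0)   # checks
        if amount == 0:
            return
        self.balances[user] = 0               # effects
        send(amount)                          # interactions last


def drain(vault, user, max_reentries=3):
    """Simulates an attacker's callback that re-enters withdraw()."""
    stolen = []

    def send(amount):
        stolen.append(amount)
        if len(stolen) < max_reentries:
            vault.withdraw(user, send)        # re-enter while the balance is stale

    vault.withdraw(user, send)
    return sum(stolen)


if __name__ == "__main__":
    v = VulnerableVault()
    v.deposit("attacker", 100)
    print(drain(v, "attacker"))   # 300 drained from a 100 deposit

    s = SafeVault()
    s.deposit("attacker", 100)
    print(drain(s, "attacker"))   # 100: the re-entrant call sees a zeroed balance
```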
If you’re competent in smart contract code, reading the code itself is always a massive advantage. Just as reading a contract before moving into a new apartment protects you from any surprises, being able to read a smart contract’s code can reveal flaws, malicious functions, or features that don’t work or make sense.
However, if you are an end user who is not particularly tech-savvy, use only widely adopted smart contracts with publicly accessible code. These are preferable to compiled smart contracts, whose code is hidden and cannot be reviewed.
Addressing smart contract vulnerabilities Let’s not forget that most smart contract administrators leave themselves some admin privileges, usually to make post-launch changes. To access these privileges, the admins need to use their private keys. These private keys are yet another vulnerability, and if they are not custodied correctly (i.e., in an offline cold vault), hackers who somehow gain access can make changes to the smart contract and funnel the funds anywhere they wish.
Recently, the European Parliament mandated that a kill switch mechanism be employed to mitigate damage in the event a smart contract is compromised. While the intention of the regulators was to give people more protection over their own personal data, the act has generated concerns in the Web3 community.
If not implemented correctly, a kill switch could destroy the entire smart contract and any value stored on it. A better implementation would be to activate a pause function which, in the event of a security threat, could freeze the smart contract and reactivate it once the issue is resolved.
Should the pause function be implemented, it’s advised that the admin use two different private keys, because once the private key used to pause the contract goes online, it becomes vulnerable to attack. As mentioned in my article on the mandate, separating the pause and unpause admin keys and storing them offline strengthens the smart contract’s security by eliminating potential points of failure.
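As a rough illustration of that key-separation advice, the Python sketch below keeps distinct credentials for pausing and unpausing, so the key that must come online during an incident is not the one that can resume the contract. The class, the hashing scheme and the key strings are all invented for illustration, not a description of any real wallet or contract framework.

```python
import hashlib


def key_fingerprint(key: str) -> str:
    """Stand-in for verifying a signature made with a specific private key."""
    return hashlib.sha256(key.encode()).hexdigest()


class PausableLedger:
    def __init__(self, pause_key: str, unpause_key: str):
        # Only fingerprints are stored; the keys themselves stay offline.
        self._pause_fp = key_fingerprint(pause_key)
        self._unpause_fp = key_fingerprint(unpause_key)
        self.paused = False
        self.balances = {}

    def pause(self, key: str):
        if key_fingerprint(key) != self._pause_fp:
            raise PermissionError("invalid pause key")
        self.paused = True

    def unpause(self, key: str):
        if key_fingerprint(key) != self._unpause_fp:
            raise PermissionError("invalid unpause key")
        self.paused = False

    def transfer(self, sender, recipient, amount):
        if self.paused:
            raise RuntimeError("contract is paused pending investigation")
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount


if __name__ == "__main__":
    ledger = PausableLedger("pause-key-kept-offline", "unpause-key-kept-offline")
    ledger.balances["alice"] = 50
    ledger.pause("pause-key-kept-offline")      # freeze during an incident
    try:
        ledger.transfer("alice", "bob", 10)
    except RuntimeError as err:
        print(err)
    ledger.unpause("unpause-key-kept-offline")  # resume once the issue is resolved
    ledger.transfer("alice", "bob", 10)
    print(ledger.balances)
```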
As with all technologies, security threats exist in the DeFi and blockchain ecosystems. Smart contracts certainly have their advantages, as we’ve seen with the emergence of DeFi platforms and protocols, but understanding their vulnerabilities, doing diligent research and following the guidelines set forth in this article can help mitigate them. With time, enhanced security protocols will take shape, strengthening smart contract use cases and ushering in a more robust blockchain ecosystem.
Shahar Shamai is CTO and cofounder of GK8.
"
|
2,594 | 2,023 |
"AWS, Meta, Microsoft-backed Overture Maps Foundation releases first 'open map' dataset | VentureBeat"
|
"https://venturebeat.com/data-infrastructure/aws-meta-microsoft-backed-overture-maps-foundation-releases-first-open-map-dataset"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages AWS, Meta, Microsoft-backed Overture Maps Foundation releases first ‘open map’ dataset Share on Facebook Share on X Share on LinkedIn Credit: VentureBeat made with Midjourney Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Overture Maps Foundation (OMF) is a cooperative effort founded in December 2022 by Amazon Web Services (AWS), Meta, Microsoft and TomTom to provide high-quality geolocation data and mapping for use in the various companies’ apps and other enterprises, too. This allows them to break free of Google Maps’ API charges and also to create a new dataset that they can control, rivaling the volunteer, crowdsourced OpenStreetMap , from which they have drawn some of their underlying map data.
Today, the foundation is releasing its first global map dataset, called “Overture 2023-07-26-alpha.0.” The debut includes four unique map layers: Places of Interest (POIs) with nearly 60 million locations. This is “a major, previously unavailable open dataset, mapping everything from big businesses to pop-up street markets worldwide,” said OMF executive director Marc Prioleau in an email to VentureBeat.
A Buildings layer with more than 750 million building footprints.
A Transportation Network layer which includes a “worldwide road network” sourced from OpenStreetMap but upgraded with new features to allow the addition of speed limits and traffic rules.
A Geopolitical Boundaries layer showing borders and political jurisdictions, with translation support for 40-plus languages.
OMF says the data has undergone a rigorous series of quality checks and been validated, and is now available for free public download on OMF’s website under an Open Data Commons Open Database license (ODbL) or CDLA Permissive v2.0 license, depending on the layer. This allows the data to be used for commercial purposes by map builders or location service providers. The foundation is seeking public feedback on Github and at [email protected].
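For developers who want to poke at the release, the layers are distributed as Parquet files, and a first pass at exploring one in Python might look like the sketch below. The local file path is a placeholder, and the commented-out filter uses an assumed column name, so check the schema documented on the Overture site against what you actually download.

```python
import pandas as pd

# Path is illustrative: point it at a Places Parquet file downloaded
# from the Overture Maps release you want to explore.
PLACES_FILE = "overture-2023-07-26-alpha.0-places.parquet"

places = pd.read_parquet(PLACES_FILE)

print(f"{len(places):,} place records")
print(places.columns.tolist())  # inspect the actual schema before filtering

# Example follow-up (column name is an assumption; adjust to the real schema):
# restaurants = places[places["category"].str.contains("restaurant", na=False)]
```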
Prioleau called it “a significant step in establishing a comprehensive, market-grade open map dataset for our constantly changing world.” A new mapping alliance shows power of collaboration Since its founding a year ago, OMF has grown to include more than a dozen mapping, geospatial and technology companies. New members of the collaboration include ESRI, Cyient, InfraMappa, Nomoko, Precisely, PTV Group, SafeGraph, Sanborn and Sparkgeo. The OMF is still seeking new members through its onboarding process.
But why would all these companies partner with one another — especially since some are rivals in certain sectors — to create a shared mapping resource? Because the challenge of maintaining a consistently updated, highly detailed world map is too great for any one company or organization, according to Prioleau. “The costs and complexities of collecting and maintaining global map data [are] well beyond the capability of any single entity,” he wrote to VentureBeat via email.
Prioleau said the collaboration was especially critical in the Places map layer, as it was “built through contributions from Meta and Microsoft, demonstrating the power of collaboration. The ultimate goal for places data is a complete dataset of places in the world with an efficient feedback cycle that can update this dataset as the places in the world change.” Maintaining the most consistently updated, best world map By and large, most people expect the physical, built world to remain largely static. But in practice, that’s hardly the case, and accurate digital maps need to reflect that dynamism if they are to be reliable.
“Approximately 20-25% of businesses turn over every year; old businesses close and new ones open daily,” Prioleau said. “In the COVID years, this number was likely much higher. The key to building an updated places dataset is the constant refreshing of the data through real-time signals.” Prioleau outlined how OMF aims to keep up with this ever-evolving landscape: with user-generated reporting.
“Overture’s underlying quality philosophy is that map data quality improves as map services built on this data are deployed to more users who, in turn, provide feedback on the accuracy and completeness of the data,” he said. “While the absolute number of users is important, so is the variety of use cases. A social media site will use map data for different use cases than a logistics application or a local search application. By establishing a broad base of users, we believe that we can build the best map.” Where the map leads… Going forward, OMF plans to continue updating its map and also to add a new dataset: the Global Entity Reference System or GERS.
This is a way to identify buildings and static infrastructure features, like segments of roads, beyond the address-level layer or latitude-longitude coordinates. Essentially, it is similar to a universally unique identifier (UUID) on a mobile device, according to Prioleau.
No timeline has been given for the addition of a GERS ID to the OMF map, but Prioleau argued that once this information does begin to appear on the map, it will enable a whole new class of applications and experiences for end users of the companies involved.
For example, a company could “combine information about a restaurant (opening hours, credit cards accepted, etc.) with social media content (reviews, ratings) and footfall information that shows the level of activity,” he said. “The GERS ID is the link ensuring that all those types of data refer to the same business. It eliminates ambiguity that can happen when trying to refer to a place on the map.” The alliance and its products are new, so it remains to be seen how well the project holds up over time, but it’s clearly off to a strong and ambitious start on its journey to creating a new, commercial-grade guide to our world.
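To make that linking idea concrete, here is a small hypothetical pandas example: three independent datasets that all carry the same GERS-style identifier can be joined without any fuzzy matching on names or coordinates. The identifiers, column names and values are invented for illustration and do not reflect the actual GERS format.

```python
import pandas as pd

# Hypothetical datasets that each reference the same entity ID.
places = pd.DataFrame({
    "gers_id": ["abc-123", "def-456"],
    "name": ["Luigi's Trattoria", "Harbor Coffee"],
    "hours": ["11-22", "06-18"],
})
reviews = pd.DataFrame({
    "gers_id": ["abc-123", "abc-123", "def-456"],
    "rating": [4, 5, 3],
})
footfall = pd.DataFrame({
    "gers_id": ["abc-123", "def-456"],
    "weekly_visits": [1800, 950],
})

# Because every source shares the identifier, the join is unambiguous.
combined = (
    places
    .merge(reviews.groupby("gers_id", as_index=False)["rating"].mean(), on="gers_id")
    .merge(footfall, on="gers_id")
)
print(combined)
```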
"
|
2,595 | 2,023 |
"Neuralink begins accepting human patients for brain implant trials | VentureBeat"
|
"https://venturebeat.com/automation/elon-musks-neuralink-begins-accepting-human-patients-for-trials-of-its-brain-implant"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Elon Musk’s Neuralink begins accepting human patients for trials of its brain implant Share on Facebook Share on X Share on LinkedIn Credit: VentureBeat made with Midjourney Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Do you want to put an implant designed by Elon Musk ’s company Neuralink — perhaps best known for killing 1,500 test animals — into your brain? Are you at least 22 years old and do you have quadriplegia (loss of function in four limbs) from a spinal cord injury, or amyotrophic lateral sclerosis (ALS)? Then you may qualify to participate in the first-ever volunteer human trials of Neuralink’s first brain-computer interface, which has begun recruitment for participants, as the company announced on its website today.
“The PRIME Study (short for Precise Robotically Implanted Brain-Computer Interface) – a groundbreaking investigational medical device trial for our fully-implantable, wireless brain-computer interface (BCI) – aims to evaluate the safety of our implant (N1) and surgical robot (R1) and assess the initial functionality of our BCI for enabling people with paralysis to control external devices with their thoughts,” explains the blog post.
The company has courted controversy for testing its implant on monkeys that allegedly resulted in their death ( Musk has posted on his social network X , formerly Twitter, that the monkeys were terminally ill, anyway), but that apparently isn’t stopping it from moving forward to try the tech on humans, next, after receiving an exemption from the U.S. Food and Drug Administration in May.
What’s involved in the Neuralink implant human trials? According to the blog, “During the study, the R1 Robot will be used to surgically place the N1 Implant’s ultra-fine and flexible threads in a region of the brain that controls movement intention. Once in place, the N1 Implant is cosmetically invisible and is intended to record and transmit brain signals wirelessly to an app that decodes movement intention. The initial goal of our BCI is to grant people the ability to control a computer cursor or keyboard using their thoughts alone.” In other words: let us use our surgical robot to install this implant in your brain so you can control a computer with your mind — it won’t show up on your head, we promise.
A brochure posted by Neuralink goes into more detail about the PRIME trials, writing: “The N1 Implant records neural activity through 1024 electrodes distributed across 64 threads, each thinner than a human hair.” It also includes an “exploded view” diagram of the device, and it explains that there is an “N1 User App” that Neuralink has created, allowing the user to actually control computers with their thoughts.
The brochure further reveals the trial will last “approximately 6 years” and that participants will need to make themselves available for “regular follow-ups” with Neuralink’s “team of experts.” Some of the follow-ups will occur at clinics, while others will take place at the patients’ homes, including hour-long sessions twice weekly.
Who is eligible? As alluded above, the study is for now only open to those 22 and older who “have quadriplegia due to cervical spinal cord injury or amyotrophic lateral sclerosis (ALS).” However, patients who already have an “active implanted device (pacemaker, deep brain stimulator (DBS), etc),” who have experienced seizures in the past, who need to undergo MRIs or who are receiving transcranial magnetic stimulation (TMS) treatment are not eligible.
As for what patients get out of it — other than potentially gaining a new way of interacting with the world and being part of a pioneering medical study, for better or ill — Neuralink says they “will be compensated for study-related costs (such as travel expenses to and from the study site).” If successful, Neuralink or other competing BCIs could herald new ways for patients with mobility issues to communicate and might one day become the preferred method for controlling computers and digital devices for the general populace.
The day after the announcement, Neuralink founder Musk tweeted that he believed in addition to helping paralyzed people move again, the company also hopes to reduce the risks of AI to human civilization by improving “bandwidth” between humans and AI.
The first human patient will soon receive a Neuralink device. This ultimately has the potential to restore full body movement.
In the long term, Neuralink hopes to play a role in AI risk civilizational risk reduction by improving human to AI (and human to human) bandwidth by… https://t.co/DzqoYI27Ng
"
|
2,596 | 2,023 |
"Why generative AI just hits different – and why organizations need to embrace it now | VentureBeat"
|
"https://venturebeat.com/ai/why-generative-ai-just-hits-different-and-why-organizations-need-to-embrace-it-now"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages VB Event Why generative AI just hits different – and why organizations need to embrace it now Share on Facebook Share on X Share on LinkedIn Generative AI isn’t just a new craze or a shiny toy; it’s set to bring about major macroeconomic effects — including adding $7 trillion to the global GDP and lifting productivity growth by 1.5%. In other words, it’s not going to fall out of popularity — it’s going to become an essential tool for companies across industries.
Sav Khetan, senior director of product strategy at Tealium, spoke at Transform 2023 about why gen AI is important, how it’s making a difference, and how business leaders should be considering it for their own organizations.
“The magic we’re all feeling and experiencing right now is because AI is suddenly able to communicate directly in our language, both in and out,” Khetan said. “It can understand what we say with full context and it can respond with language and images that we can understand. That’s what flipped.” How did this happen, and why now? AI has been integrated into our lives for a long time, but generative AI just hits different. Previously it was very specific and purpose-driven, but large language models (LLMs), the backbone of generative AI, have rewritten the script. The “4” in GPT-4 is a way of describing how much complexity and scale the model can handle. GPT-3 was 175 billion parameters, and the newly available GPT-4 is 170 trillion parameters.
“What this shows you is the scale at which these models are operating,” he said. “This is the reason why the 80-year effort suddenly clicked into place. These models were able to access the internet at large in the last couple of years. It turns out that they needed that much data to figure this out.” They can consume both structured and unstructured data, which is where the game especially changed, since more than 80% of the world’s data is unstructured, primarily video.
The impact on UX and how we interact with technology is profound, he adds. Up until now, UX design has been about translating human commands into action, requiring structured, organized, tagged data and operational code. LLMs can consume data in its current form, without any translation necessary.
“If you play this out, if you look ahead, this has the opportunity to completely change how we interact with data directly,” he said. “What if there was no software in the middle? I realize that LLMs in some way are machines themselves, but our relationship with data that we consume for ourselves – for research, for analysis, for insights – now has the power to completely change.” Where generative AI is headed Back in February, ChatGPT became the fastest growing platform of all time. The pace of generative AI innovation and adoption isn’t slowing down any. Gartner predicts that by 2026, 50% of all sales and marketing providers will incorporate assistants , and 60% of design process by new websites will be by generative AI. At the front end, 30% of HR software will use the assistants. And by 2025, 75% percent of digital marketing communications will have avatars.
But those use cases are almost certainly going to evolve into even more powerful applications as stage two, or wave two, as venture capital firm Andreessen Horowitz (a16z) calls it. We’re still in what Khetan called a “pull world,” in which we’re asking AI for responses. Synthesis or synth AI is when the AI automatically looks at the data and tells us what it sees, and can be set up at any cadence we want.
“Now imagine the power of decisions you could make,” he said. “If you could genuinely consume the data that you want from the research that you want to do, for the analysis and insights that you want, your decisions just get that much better and that much more powerful. That’s why we’re so excited about this future. So yes, please take this seriously. Please do your work.” Getting started with gen AI now “Everybody is asking, what do we do to get started?” Khetan said. “It’s simple. Learn. Explore. Play with it. Spend time with it. Nobody can explain this to you. You have to go experience it for yourself.” Right now all these tools, such as DALL-E, Midjourney, ChatGPT and so on are free, but there’s no guarantee how long that will last. Now’s the time to experiment.
You also need to start preparing your data, your APIs and your systems so that they can connect with these tools and access the data they need. And you cannot embrace AI in your workplace without as much first-party data as possible, because it’s the main input for your AI workflows if you’re going to make them customer-facing.
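As a loose sketch of what "connecting your first-party data to these tools" can mean in practice, the snippet below assembles a prompt from a fictional customer record before handing it to whichever LLM provider you use. The data fields and the call_llm stub are assumptions for illustration, not any specific vendor's API.

```python
# Fictional first-party customer profile pulled from your own systems.
customer = {
    "name": "Jordan",
    "recent_purchases": ["trail running shoes", "hydration vest"],
    "loyalty_tier": "gold",
}


def build_prompt(profile: dict) -> str:
    """Turns first-party data into context an LLM can use."""
    purchases = ", ".join(profile["recent_purchases"])
    return (
        f"You are a retail assistant. The customer {profile['name']} is a "
        f"{profile['loyalty_tier']}-tier member who recently bought {purchases}. "
        "Suggest one relevant product and a one-sentence personalized message."
    )


def call_llm(prompt: str) -> str:
    """Placeholder: swap in your LLM provider's client call here."""
    raise NotImplementedError


if __name__ == "__main__":
    print(build_prompt(customer))
```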
“If you’re not already doing it, prioritize it,” he said. “If you’re not already planning it, start having those meetings. This is the time.”
"
|
2,597 | 2,023 |
"Elevating customer experience: The rise of generative AI and conversational data analytics | VentureBeat"
|
"https://venturebeat.com/ai/the-future-of-personalization-how-generative-ai-is-elevating-customer-experiences"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Elevating customer experience: The rise of generative AI and conversational data analytics Share on Facebook Share on X Share on LinkedIn Illustration by: Leandro Stavorengo Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
This article is part of a VB special issue. Read the full series here: Building the foundation for customer data quality.
The rapid advancement of artificial intelligence (AI) and machine learning (ML) technologies is pushing the boundaries of what can be achieved in marketing, customer experience and personalization. One important development is the ongoing evolution of generative AI (gen AI), which is bringing open-source platforms to the forefront of sales. As the digital-first business landscape grows increasingly complex and fast-paced, these technologies are becoming indispensable tools.
Across industries, engagement models are undergoing significant transformations, as customers expect to access products and services anytime, anywhere and in every possible way. While customers still value a balanced combination of traditional, remote and self-service channels, there is a noticeable surge in their preference for online ordering and re-ordering in the post-pandemic era.
To address these escalating demands, achieve e-commerce excellence across the entire customer journey, and improve hyper-personalization, Big Tech and SMB players alike are making major investments in generative AI innovations.
Producing fresh, original content In contrast to traditional AI approaches that depend on predetermined rules and datasets, generative AI can produce fresh and original content. This cutting-edge technology uses intricate neural networks to discern patterns and generate distinct outputs — a whole new way to generate recommendations and offers.
Using conversational data analytics , businesses gain valuable insights into customer preferences, sentiments and pain points. They can use these insights to further refine products, tailor marketing campaigns and provide better customer support.
In today’s highly competitive and fast-paced digital world, personalization is the preferred strategy for brands seeking to stand out amidst the marketing noise. Effective consumer personalization is the secret ingredient, enabling tailored content and experiences that cater to individual tastes and desires. This amplifies customer experience, enhancing loyalty and retention and increasing return on investment (ROI).
By harnessing generative AI, businesses can rapidly create highly targeted content that resonates with their audiences. A prime example is Spotify. The platform uses gen AI to analyze user listening patterns and preferences, then generates curated playlists and provides personalized music recommendations, ensuring that users remain engaged.
Availability of dynamic offerings According to Beerud Sheth, CEO of AI-based conversational engagement platform Gupshup , companies ranging from Amazon to Netflix have long utilized AI in various forms to provide recommendations based on our past purchasing or viewing history. But the advent of gen AI has created a surge in the availability of dynamic offerings.
“ Generative AI can be used to create and target marketing campaigns based on a variety of factors, such as customer demographics, interests and purchase interactions,” Sheth said. “This can help businesses to reach the right customers with the right message and increase the chances of a conversion.” Likewise, Sreekanth Menon, VP and global leader of AI/ML services at Genpact , said that with generative AI, the landscape of hyper-personalized customer experience (CX) is poised to attain new levels of agility.
“The emergence of cloud-led advanced analytics technologies has allowed enterprises to capture insights from omnichannel customer contact points more efficiently,” Menon told VentureBeat. “Capturing, curating and analyzing sentiment with AI/ML across customer conversations amplifies organizational efforts to reach, react and recalibrate their businesses as per the demands of their customer quickly.” Conversational data analytics for targeted campaigns Integrating generative AI data with conversational data analysis has emerged as another powerful method for businesses to identify intricate patterns and trends.
For example, when a user engages with a brand’s chatbot powered by a large language model (LLM), conversational data is stored in the cloud. Later, this data can be analyzed using sentiment analysis to gain insights and understand consumer preferences and pain points.
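A minimal version of that analysis step might look like the sketch below, which runs a general-purpose sentiment model from the Hugging Face transformers library over stored chat messages and tallies the results. The sample messages are invented, and a production pipeline would use a domain-tuned model and handle languages, sarcasm and context far more carefully.

```python
from collections import Counter

from transformers import pipeline

# Invented sample of stored chatbot messages from customers.
messages = [
    "The checkout kept failing and I gave up.",
    "Thanks, the bot found my order status right away!",
    "Why is shipping to my region so expensive?",
]

sentiment = pipeline("sentiment-analysis")  # loads a default general-purpose model

results = sentiment(messages)
tally = Counter(result["label"] for result in results)

for msg, result in zip(messages, results):
    print(f"{result['label']:<8} {result['score']:.2f}  {msg}")
print("Summary:", dict(tally))
```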
Gupshup’s Sheth added that analyzing conversational AI data enables the identification of common customer questions and concerns. This valuable information can be used to create more comprehensive and informative FAQs or develop chatbots capable of automatically addressing these inquiries.
He highlighted that the data plays a crucial role in tracking customer satisfaction levels and acquiring insights into customer preferences. This process, in turn, enables companies to enhance personalization and create new products that cater to specific customer needs.
Hyper-personalization Gupshup recently worked with the Dubai Electricity and Water Authority (DEWA), whose gen AI chatbot provides 24/7 customer support and assists customers in finding answers to common questions and requests such as billing inquiries, outage information and service requests, Sheth explained.
Likewise, California-based end-to-end video commerce platform Firework recently introduced its generative AI sales assistant to accompany its core video commerce offering. The patent-pending technology allows customers to use the in-video chat feature on an ongoing, on-demand basis.
“Long after a live stream has concluded, shoppers can ask questions about the products or services featured therein, and our proprietary AI engine will provide accurate, real-time responses based on user input, the content of the video and other associated metadata,” Jerry Luk, Firework cofounder and president, told VentureBeat. “Our AI engine makes use of an LLM that can understand and respond in a wide range of languages and can be customized to reflect each brand’s unique voice.” Luk said that with the integration of gen AI and conversational data analysis, his company saw a significant boost in interactions with customers online.
Conversational data analysis combined with generative AI “allows us to analyze conversational data in real time, understand customer needs and preferences, and suggest what the human associate could say next,” Luk explained. “This fusion of human and AI capabilities can facilitate highly personalized and engaging customer interactions and allow associates to handle a wider range of queries, which feel more relatable and less transactional.” Key considerations for adopting generative AI in CX pipelines Luk emphasized that AI’s responses must reflect a particular brand’s voice and values. “The technology should be able to adapt to your brand’s unique tone and communication style. This consistency helps maintain your brand image and identity in AI-driven interactions.” Peter van der Putten, director of AI Lab at low-code AI platform Pega , suggested that granting LLM access to internal documents and data can empower the tool to comprehend brand voice based on historical data. This then enables AI to take appropriate actions.
“By providing consumer-facing AI models with documents and information that are not typically included in generic models or accessible to them, companies can empower their chatbots to offer references to specific services or products,” said Putten.
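One common way to do that, shown in rough form below, is to retrieve the most relevant internal documents for a query and prepend them to the model's prompt. This sketch uses TF-IDF similarity from scikit-learn purely as a stand-in for whatever retrieval and LLM stack a team actually runs; the documents and the query are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented internal documents a generic model would not know about.
documents = [
    "Returns: items may be returned within 45 days with a receipt.",
    "Premium support plan includes 24/7 chat and a 2-hour response SLA.",
    "Store hours: weekdays 9-19, weekends 10-17, closed public holidays.",
]

query = "How long do customers have to return a product?"

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)
query_vector = vectorizer.transform([query])

# Pick the document most similar to the question.
scores = cosine_similarity(query_vector, doc_vectors)[0]
best_doc = documents[scores.argmax()]

prompt = (
    "Answer using only the company information provided.\n"
    f"Company information: {best_doc}\n"
    f"Customer question: {query}"
)
print(prompt)  # hand this prompt to the LLM of your choice
```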
Jonathan Rosenberg, CTO and head of AI at cloud contact center solutions firm Five9 , pointed out that chatbots often tend to hallucinate (make up false information).
“Therefore it is important to include a human in the loop so that it compensates for [that] tendency,” he said. “It also creates a personalized experience for the customer. When they call back, the next agent will be able to know what happened previously.” Mitigating generative AI challenges Likewise, the emergence of generative AI has added complexity to the discussion surrounding AI risks from hallucination, said Menon. He emphasized that even with the utmost caution, chatbots are susceptible to adversarial attacks, including prompt injections.
Consequently, it becomes crucial to establish responsible AI strategies and architectures to mitigate these challenges.
“The significance of responsible AI in this context cannot be underestimated,” said Menon. “For enterprises leveraging generative AI, it is a strategic imperative.” Sheth from Gupshup agreed, highlighting that AI models can sometimes result in discriminatory outcomes. Therefore, businesses must exercise caution and be aware of potential bias in their models. Failure to mitigate bias can make it difficult to interpret the operational processes of these models.
“Given that generative AI models are still in their early stages of development, it can lead to concerns about trust,” said Sheth. “Businesses need to build trust with their customers and stakeholders by being transparent about how they use these technologies and by ensuring that they are used responsibly and ethically.”
"
|
2,598 | 2,023 |
"Squint raises funding to expand its AR-powered technology platform | VentureBeat"
|
"https://venturebeat.com/ai/squint-raises-funding-to-expand-its-ar-powered-technology-platform"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Squint raises funding to expand its AR-powered technology platform Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Squint , a technology platform that uses augmented reality (AR) to optimize factory procedures, today announced the completion of a combined seed and pre-seed funding round, raising $6 million.
Squint’s AR platform provides factory operators with an intuitive mobile experience, offering dynamic and contextual assistance that aligns with their immediate surroundings.
By bringing traditional shop floor instructions to life, Squint delivers an engaging and effective interactive experience that accelerates learning speed and enhances knowledge retention.
“What drew us to Squint was the combination of an outlier founder in [CEO] Devin [Bhushan], whose background makes him a leading expert in enterprise AR, and a compelling value proposition that is resonating with large enterprise customers like Siemens, Volvo, and more,” Jess Lee, partner at Sequoia, said in a written statement. “Squint uses mobile AR, computer vision , and machine learning to replace paper binders, sticky notes, and human trainings. The way it optimizes factory procedures is incredibly powerful and can potentially increase factory operations, transforming the way workers interact with machines and applications beyond the factory floor.” VB Event The AI Impact Tour Connect with the enterprise AI community at VentureBeat’s AI Impact Tour coming to a city near you! According to the company, its next-gen technology enables organizations to seamlessly digitize standard operating procedures, eliminating the need for an IT team and relying solely on a mobile phone. Consequently, routine tasks such as training, operation and maintenance can be completed more efficiently and with improved safety.
“Our solution is currently being used by companies across the manufacturing and energy sectors to help them optimize and scale individualized learning. For example, we are transforming a factory’s standard operating procedures (SOP) with AR,” Squint’s CEO Devin Bhushan told VentureBeat. “Now, instead of relying on paper-based instructions and hours of human training, Squint offers an intuitive, engaging, contextual way for operators to learn how to use the machines safely, effectively and much more quickly.” Bhushan said that Squint’s unparalleled flexibility is one of its distinguishing features. He claims the platform is the only AR solution in the market that embraces an “open world” approach. Unlike other solutions, Squint enables content creation on the spot without needing a QR code or a 3D model to detect objects and align itself.
“Our implementation process is straightforward and self-serve, offering the manufacturing technology sector the fastest time-to-value,” he said. “The results have been fantastic, as our early customers reported that by using Squint, operator training time is reduced by 86%. What used to take weeks and months can now be done in as little as a day.” Using the power of AR to optimize factory procedures According to Bhushan, Squint’s self-guided teaching format empowers operators by allowing them to pause at each step of the procedure and absorb the content through their preferred media, such as reference photos, tutorial videos or written instructions.
In traditional training scenarios, operators might feel embarrassed or hesitant to interrupt a trainer and ask for help or for a demonstration to be repeated.
With Squint, the instructions are always readily available to the operator, minimizing costly errors and enhancing productivity.
Operators can also use the Squint app to create instant digital “sticky notes” that capture equipment-specific knowledge. They can then “pin” these notes in AR for their co-workers to access later.
“It takes just 45 seconds to map a new area into Squint, and it is the only product in the world that can identify a space or an object without a QR code or 3D model,” explained Bhushan. “The benefit to an open world environment without QR codes or 3D models is that we’ve removed all barriers for a new technology solution to be successful and be immediately useful for operators. There are no other solutions or platforms capable of this degree of autonomous knowledge capture and flexibility, which ultimately empowers operators with the information they need to be safe and successful.” The company’s impressive roster of clientele includes industry giants Siemens and The Volvo Group, which have utilized Squint’s AR technology to optimize their factory operations and training.
“Once you see how AR can supplement and transform written SOPs with visual and contextual orientations and demonstrations, you really can’t go back. Squint improves learning speed by giving each operator clear, step-by-step training instructions off their phone or tablet,” said Bhushan. “Squint is always available with an operator, eliminating knowledge or memory gaps, proceeding contextually at the same pace as the operator, and capturing site-specific knowledge that enables an operator to troubleshoot local machine performance.” What’s next for Squint? With the new funding, the company aims to introduce new verticals in its computer vision, AR and AI projects.
Bhushan said operators have consistently provided positive feedback on Squint’s ease of understanding and effectiveness. The platform equips them with the necessary skills and instills confidence to work independently and safely.
“We are particularly excited about new technology we are developing at the intersection of AR and AI that will make it even easier for customers to roll out Squint across the enterprise. It’s in the final stages of development, but we hope to reveal more soon,” he said. “In the next few years, we expect AR to become ubiquitous in the workplace. Just as we’ve seen with the breakthrough of AI applications, AR will break out from gaming and gimmicks. Squint is already helping to drive this change.” Squint’s funding round was led by Arc , the pre-seed and seed-stage catalyst of Sequoia Capital , and Menlo Ventures’ venture studio, Menlo Labs , with participation from several other angel investors.
"
|
2,599 | 2,023 |
"Montreal startup HumanFirst raises $5M to transform organizations' conversational data into no-code AI | VentureBeat"
|
"https://venturebeat.com/ai/montreal-startup-humanfirst-raises-5m-to-transform-organizations-conversational-data-into-no-code-ai-agents"
|
Imagine you run a customer service desk at a company. You quickly notice that your employees spend a good chunk of their day answering the same questions from different customers. Wouldn’t it be nice to automate those responses and let AI answer them, freeing up your human employees’ time to handle trickier questions and situations? But unless you have a programming background, creating such an automated system might seem out of reach. At least, it was until HumanFirst burst onto the scene.
A Montreal-based tech startup, HumanFirst offers a platform for creating new enterprise applications and processes based on a company’s own conversational data — that is, records of conversations between customers and support staff — using a no-code approach, meaning you don’t need to have an advanced knowledge of programming or computer science to use it. “The fastest way to build custom AI you can trust,” is how the HumanFirst website puts it.
The company today announced $5 million in seed funding from three smaller “safe-note” financing rounds, led by Panache Ventures, and joined by Inovia, Real Ventures, BoxOne Ventures, and angels including Lookout founder Kevin Mahaffey.
The new capital injection is set to fuel HumanFirst’s growth and customer acquisition. The company plans to use the funding to double its current team of 15 and roughly triple its customer base of around 33.
HumanFirst’s approach to no-code The no-code movement, according to Gregory Whiteside, cofounder and CEO of HumanFirst, is fundamentally reshaping the software industry. It allows individuals lacking formal programming expertise to develop their own applications.
“HumanFirst stands as a crucial bridge to this new era, unlocking the value of conversational data, which is typically hard to decipher and utilize,” Whiteside said in a press release.
Meanwhile, as interest in generative AI explodes, businesses are flocking to the tech , but many are still hesitant and cautious about deploying it in a meaningful way, in part due to perceived lack of in-house AI knowledge.
Organizations by and large have lots of data that could be leveraged to build AI apps that help them — but they don’t know how to go about it safely and securely. That’s where HumanFirst comes in.
It allows businesses that engage in a high volume of customer interactions to automate those intelligently, using simple but powerful software workflows. In essence, it creates AI trainers to help surface and identify trends in large datasets, then test how the design of the data actually works in real time with natural language understanding (NLU) and large language model (LLM) inputs.
The AI responses are all based on a business’s conversational data, ensuring they are customized to match the recurring issues that come up between the business and its customers. The data that HumanFirst analyzes to allow customer businesses to build their own apps include call transcripts, emails, customer feedback and support tickets.
“We also integrate directly with different conversational databases like Rasa, Dialogflow and more,” Whiteside told VentureBeat in an email.
Panache Ventures partner Scott Loong commended HumanFirst’s tech, which allows businesses to extract and operationalize customer and business insights from large conversational datasets. “HumanFirst’s data-centric approach to simplifying and improving AI models for both technical and non-technical users is unprecedented,” Loong said in a statement.
HumanFirst deploys “AI trainers” within its customer organizations, which smartly cluster and bucket information, allowing employees to observe and discover trends and insights. These trainers merge employees’ human context and perspective with mass labeling and indexing of the growing dataset quickly and effectively, a task traditionally done in spreadsheets.
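HumanFirst has not published the internals of that pipeline, but the pattern it describes, grouping similar customer utterances so trainers can label whole buckets at once, can be illustrated with a short sketch. The embedding model, sample utterances and cluster count below are assumptions for illustration only, not HumanFirst's implementation.

```python
# Illustrative sketch only: cluster support utterances so a human trainer can
# review and label entire groups at once. Not HumanFirst's actual code.
from sentence_transformers import SentenceTransformer  # assumed available
from sklearn.cluster import KMeans

utterances = [
    "Where is my package?",
    "My delivery never arrived",
    "How do I reset my password?",
    "I can't log into my account",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # hypothetical model choice
embeddings = model.encode(utterances)            # one vector per utterance

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(embeddings)

# Group utterances by cluster so a trainer can assign one intent label per bucket
for cluster_id in range(2):
    members = [u for u, c in zip(utterances, kmeans.labels_) if c == cluster_id]
    print(f"cluster {cluster_id}: {members}")
```

In a workflow like the one described, a trainer would then attach an intent label such as "delivery status" to an entire bucket rather than tagging rows one by one in a spreadsheet.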
Early customer win HumanFirst’s solutions are already streamlining business operations for various customers across industries.
“HumanFirst has a great list of enterprise customers including banks, call centers, and companies that trust the product and have built AI/automations that save time, money and uncover richer insights,” Whiteside told VentureBeat via email.
Intelcom, Canada’s last-mile delivery leader, is among the initial batch of clients currently using HumanFirst. Jean-Sébastien Joli, CEO of Intelcom, said that HumanFirst had already transformed his company’s call centers, allowing them to more rapidly identify customer issues and begin solving them without wasting customers’ or support staffs’ time trying to understand one another.
“HumanFirst has revolutionized our ability to understand data at scale. Their data-driven insights are integral in crafting superior customer and product experiences. The incorporation of HumanFirst’s automation recommendations in our call center is projected to reduce repetitive interactions by up to 50% during the initial phase,” he said in a press release.
Whiteside said he believes a human will always remain in the loop, but that HumanFirst can democratize access to building AI and make dealing with large, scaling datasets feasible for more organizations. At the same time, he did not sugarcoat the point that his customers might use his technology to reduce their human staff.
“If we can unlock the potential of richer insights and better collaboration across teams — we think that companies may reduce the number of people in some areas of the business, but also, can empower talented people to observe, launch and build impactful products and experiences for their business in ways they couldn’t dream of before,” Whiteside wrote to VentureBeat.
"
|
2,600 | 2,023 |
"Marketing software Anyword adds integrations to ChatGPT, Notion AI, and Canva AI | VentureBeat"
|
"https://venturebeat.com/ai/marketing-software-anyword-adds-integrations-to-chatgpt-notion-ai-and-canva-ai"
|
Anyword, a marketing copywriting software company, was early with some of the technology trends and buzzwords now gripping Silicon Valley, like natural language processing (NLP) and AI. It has deployed both for years, training its GPT-3-based copywriting AI model on two billion data points from marketing copy across industries. This enables its marketer customers to create copy optimized to perform well and get specific demographics to click.
Little wonder that in 2015, it secured Series A funding from former Google CEO Eric Schmidt and an early customer, The New York Times.
Today, the company is adding more to its in-house AI offerings, bringing its customers new integrations with the buzziest AI models in marketing: OpenAI’s ChatGPT , Notion AI from note-taking and brainstorming startup Notion, and the graphic design web app Canva AI.
(Anyword, now headquartered in New York City, was founded 10 years ago in Israel under a different name: Keywee.) Anyword’s goal is conceptually simple but logistically ambitious: to allow the marketers and creative clients who are already using (or considering using) its web software to be able to use those three aforementioned third-party AI tools while staying aligned with their brand voice and what has previously resonated with their target audiences.
As Anyword puts it in a news release: “ChatGPT and other LLMs are not built for marketing; they don’t know the brand’s voice, product, or customers, or what resonates with their base.” Instead, in Anyword’s vision of AI for marketers, Anyword performs an “instant website scan” on a client’s website and then scores the outputs of a marketer’s prompts to ChatGPT, Notion AI or Canva AI, grading them on a 1-100 scale with how well they align to client’s “tone of voice, brand rules, product and company details, and target audiences.” Anyword customers can then also click “Boost Performance” to further refine their copy to fit the target audience and brand voice, ensuring their marketing emails or advertisements are as resonant and engaging as they can be.
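Anyword has not disclosed how its scoring model works, so as a rough mental model only, one way to grade copy against a brand's existing voice is to compare embeddings of the candidate text with reference copy pulled from the client's site and map the similarity onto a 1-100 range. The reference snippets, embedding model and scaling below are assumptions, not Anyword's method.

```python
# Minimal sketch: score candidate copy by similarity to brand reference copy,
# scaled to 1-100. Purely illustrative; this is not Anyword's scoring model.
import numpy as np
from sentence_transformers import SentenceTransformer  # assumed available

brand_reference = [
    "Ship smarter. Our platform gets your parcels there on time, every time.",
    "No jargon, no hidden fees. Just reliable delivery for growing stores.",
]
candidate_copy = "Fast, honest shipping that keeps your customers coming back."

model = SentenceTransformer("all-MiniLM-L6-v2")  # hypothetical model choice
ref_vecs = model.encode(brand_reference)
cand_vec = model.encode([candidate_copy])[0]

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

similarity = max(cosine(cand_vec, r) for r in ref_vecs)  # best match to brand voice
score = round(1 + 99 * (similarity + 1) / 2)             # map [-1, 1] onto 1-100
print(f"brand alignment score: {score}/100")
```

A production system would also weigh predicted engagement and audience fit, which is where Anyword says its performance data comes in.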
“Users can seamlessly incorporate their brand identity and target audiences into an LLM [large language model] using our integration,” said Yaniv Makover, CEO and cofounder of Anyword, in a statement provided to press. “That means the copy they create with ChatGPT, Notion AI, Canva AI — and soon anywhere they generate or write content — will be on-brand and performance-driven, with predictive analytics that has been shown to increase conversion by 30%.”
"
|
2,601 | 2,023 |
"Jasper launches new marketing AI copilot | VentureBeat"
|
"https://venturebeat.com/ai/jasper-launches-new-marketing-ai-copilot-no-one-should-have-to-work-alone-again"
|
Following layoffs earlier this year, marketing software platform Jasper is recalibrating.
Today, the company announced its new “end-to-end AI copilot for better marketing outcomes,” according to Timothy Young, the new CEO of Jasper as of two weeks ago, in an exclusive interview with VentureBeat.
The features Jasper announced today include new performance analytics to optimize content, a company intelligence hub to align messaging with brand strategy, and campaign tools to accelerate review cycles. The features will start rolling out in beta in November, with additional capabilities planned for Q1 2024.
By outlining the campaign parameters upfront, Jasper’s new copilot can not only generate customized content for each channel but also schedule, distribute, and track the entire effort. This elevates Jasper from tactical content creation to true strategic campaign management.
Crucially, the copilot also closes the feedback loop by integrating performance analytics.
This allows users to demonstrate the real business impact of their campaigns to stakeholders.
Young, a former leader at Dropbox and VMWare, said Jasper is “up leveling the capabilities…from just an individual piece of content to groupings of content that are strategically valuable.” Tapping into company intelligence In a product briefing with VentureBeat, Zach Anderson, Jasper’s VP of Product and Customer Success, unveiled new capabilities coming to market that position Jasper as a true AI marketing copilot.
A core focus is deep personalization through “Company Intelligence.” As Anderson described, marketers can now “upload all relevant documents directly into Jasper where it is analyzed and tagged.” When generating content, Jasper will “draw directly from this company data to ensure everything is perfectly on-brand without any hallucinations.” Executives can feel assured their brand voice and strategic knowledge is authentically captured.
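Jasper has not detailed how Company Intelligence is built, but the description, documents uploaded, analyzed and tagged, then drawn on during generation, loosely matches a retrieval-grounded pattern. The sketch below uses hypothetical documents and an off-the-shelf embedding model purely to illustrate that pattern; it is not Jasper's architecture.

```python
# Loose sketch of retrieval-grounded generation: pull the most relevant company
# snippets into the prompt so generated copy stays on-brand. Not Jasper's actual
# system; the model choice and documents are assumptions.
import numpy as np
from sentence_transformers import SentenceTransformer  # assumed available

company_docs = [
    "Brand voice: plainspoken, optimistic, avoids exclamation points.",
    "Product: Acme Sync keeps field teams and HQ on the same schedule.",
    "Messaging rule: always lead with the customer's problem, not the feature.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # hypothetical model choice
doc_vecs = model.encode(company_docs)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k company snippets most similar to the brief."""
    q = model.encode([query])[0]
    sims = doc_vecs @ q / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q))
    top = np.argsort(-sims)[:k]
    return [company_docs[i] for i in top]

brief = "Write a launch email for Acme Sync aimed at operations managers."
prompt = "Company context:\n" + "\n".join(retrieve(brief)) + f"\n\nTask: {brief}"
# The assembled prompt would then go to whichever language model the platform uses.
print(prompt)
```

Grounding generation in retrieved company material is one common way to reduce the off-brand "hallucinations" Anderson refers to.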
Performance optimization is also a major goal of Jasper’s design. As Anderson stated, marketers will closely track each content piece’s metrics within Jasper. The system “provides suggestions on how to iteratively improve underperforming assets directly within the app.” With each new campaign, Jasper aims to learn how to better achieve its customers’ goals by focusing on real business results.
Evolution in layers Young believes the generative AI space will evolve in layers, much like past technology waves.
“Over the last couple of decades, if you look inside enterprises, the value where they start is in the application layer, and then it moves down into the network stack and into the infrastructure stack,” said Young. This maturation process means the industry is poised for tremendous growth and specialization as different companies find ways to build targeted applications.
“Individual companies… are just going to find ways to build applications that really speak to end users, that are focused on verticals, that are focused on sectors, and all the specific challenges and value that they require,” Young said.
He sees the potential for thousands of companies addressing unique enterprise needs in specialized ways, just as no single tool has ever been sufficient on its own.
While acknowledging “there’s a lot of value in diversity and competition,” Young doesn’t view tech giants dabbling in the space as a major threat long-term. Their general tools won’t replace the specialization that organizations demand.
AI as critical infrastructure Young believes LLMs will become critical digital infrastructure underpinning all knowledge work within enterprises. But how organizations interface with and leverage AI will depend greatly on their unique contexts, needs, and regulatory environments. Some may run customized models internally while others access open models through strategic partners.
While Young couldn’t comment on Jasper’s profitability during the interview, the release highlighted fourfold annual recurring revenue (ARR) growth driven by the enterprise sales team. After a summer that saw layoffs and increasing skepticism about the long-term valuation of companies that build on top of established LLM makers (which could eat their lunch), Young said history has shown that even in saturated markets, targeted applications can thrive alongside dominant players.
Asked about the layoffs directly, Young said: “Jasper was an early leader in generative AI for marketing. As the field evolved, the company also needed to reshape its team and evolve its focus. Unfortunately that did lead to a reduction of staff in July to enable the company to have the right fuel and focus for this next chapter. The company’s customer base has really changed over the last year with larger companies becoming the fastest growing segment. This expansion in our product and the continuous upleveling of our capacity is a sign of that focus.” Examples like Zoom moving into areas like word processing does not faze Young, as he believes organizations will continue demanding specialized, best-of-breed tools tailored to their unique needs. Just as Dropbox has endured against giants, Young is confident Jasper’s verticalized, outcomes-driven approach will allow it to carve out durable niches even amid growing platform encroachment.
Rather than trying to compete feature-for-feature, Jasper will look to embed its AI seamlessly into customers’ existing workflows on various platforms.
By delivering uniquely strategic insights and seamlessly coordinating complex initiatives, the company aims to make its copilot indispensable for optimizing how people work across any environment or application. For Young, strategic partnerships – not direct competition – will be the key to long-term success.
For companies like Jasper to thrive, Young stressed the importance of deeply understanding customers. “Staying close to your customers” and finding specific insights to build specialized value around is key, he said. If software enables customers quickly and earns their trust, durable strategic relationships can emerge where companies help further broader organizational goals.
Securing proprietary knowledge When discussing protecting and safeguarding sensitive customer data, Anderson said the company employs its proprietary “Jasper AI engine” technology to ensure no customer information is exposed externally.
“Everything [customers] upload to Jasper, their prompts, their execution strategy that they’re putting into Jasper stays with Jasper,” Young said.
According to Anderson, some other tools may send direct customer inputs and outputs to large language models. In contrast, the Jasper AI engine allows querying external models for tasks, but does not share or pass along any customer data during this process. All collected intelligence remains internal to Jasper.
For executives wary of competitors accessing strategic assets, the internal data protections of the Jasper AI engine provide confidence. The risk of inadvertently training a rival on a company’s most sensitive plans or innovations due to language model exposure is mitigated. Proprietary knowledge stays fully protected under the customer’s control at all times.
By keeping customer training separate from external AI via its proprietary engine, Jasper establishes an important security standard executives should find reassuring.
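Jasper describes the engine only at a high level, so the snippet below is a generic illustration of one way a proxy layer can consult an external model without exposing identifiable customer details: redact sensitive strings before the call and restore them afterward. The redaction rules and the external_llm stub are assumptions, not Jasper's implementation.

```python
# Generic redact-before-send proxy sketch; not Jasper's actual engine.
import re

def redact(text: str) -> tuple[str, dict]:
    """Swap emails and internal code names for placeholders before an external call."""
    mapping = {}

    def swap(match):
        key = f"<REDACTED_{len(mapping)}>"
        mapping[key] = match.group(0)
        return key

    redacted = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", swap, text)  # email addresses
    redacted = re.sub(r"Project [A-Z][a-z]+", swap, redacted)   # internal code names
    return redacted, mapping

def restore(text: str, mapping: dict) -> str:
    for key, original in mapping.items():
        text = text.replace(key, original)
    return text

def external_llm(prompt: str) -> str:
    """Stub standing in for a third-party model; it only ever sees redacted text."""
    return f"Draft based on: {prompt}"

prompt = "Summarize the launch plan for Project Falcon and email jane@acme.com."
safe_prompt, mapping = redact(prompt)
answer = restore(external_llm(safe_prompt), mapping)
print(answer)
```

The essential point, whatever the mechanics, is that the third-party model never receives the raw customer material.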
Beyond personalization and insights, marketers can generate entire campaign blueprints directly from high-level briefs using Jasper’s new “Campaign Acceleration” features. The tool actively drives the content creation process from ideation to execution and optimization.
Emotional impact key to standing apart For Young, one of the most exciting aspects of Jasper’s technology is the profound emotional impact it can have on knowledge workers. He recounted being moved while witnessing a customer’s exuberant reaction after Jasper generated over 100 product descriptions for her e-commerce site in minutes.
“Those moments where technology really invokes someone deeply at an emotional level, I just think, are incredibly rare,” Young said.
By helping eliminate the friction and loneliness of the creative process, Young believes AI assistants can rekindle that sense of childlike wonder people feel when discovering profoundly useful new tools.
“No one should have to work alone again,” said Young. By serving as a conversational companion that sparks new ideas and surfaces actionable insights from company data, Jasper aims to get people firmly “in the flow of work” through natural dialogue. Young hopes this makes the workday feel less solitary and allows employees to consistently achieve their peak performance.
If Jasper and other AI tools can help more workers feel like “heroes” both on the job and at home, Young believes it will not only boost productivity but profoundly impact peoples’ overall well-being and job satisfaction for the better. For him, technologies that forge deep emotional connections by empowering users are uniquely positioned for long-term success.
"
|
2,602 | 2,023 |
"How generative AI can democratize content creation across the enterprise | VentureBeat"
|
"https://venturebeat.com/ai/how-generative-ai-can-democratize-content-creation-across-the-enterprise"
|
Generative AI for enterprises is rapidly becoming a competitive differentiator. One of its most transformative powers is its impact on content creation and productivity, explained Anna Griffin, CMO at Commvault, in a conversation with May Habib, CEO and co-founder of Writer.com, at VB Transform 2023.
“I realized the power of what ChatGPT was going to do in the name of synthesizing, time to insights, quickly bringing in market points of view to areas like marketing where there could be a lot of time, a lot of money, a lot of energy spent,” Griffin said. “It becomes a very complementary tool to a go-to-market model. I was intrigued instantly. In that moment, I thought that this marketing organization is going to own gen AI in our go-to-market. We’re going to own how to do this and how to do this well and right.” Griffin reached out to Writer in order to implement a company-wide solution that could transform the way Commvault works.
“We knew we would tackle a pretty aggressive series of use cases, because we were launching a new product, repositioning our brand, changing our narrative,” Griffin said. “All of these things produce a massive amount of content.” Democratizing content creation With their generative AI solution, they were able to create the content they needed rapidly, from co-written and custom-written blogs to prospecting emails for the sales force, web pages and more. The underlying model was trained to understand the messages the company wants to convey to the market, synced with the company’s SEO strategy, and helped keep up the cadence of its content creation.
“Even though I deeply believe in the art of advertising, I love the tool’s ability to just generate ideation,” she said. “There’s something so interesting about the tool, particularly when you’re launching and you’re forming your messaging, what you’re going to be, how we’re going to take it to market. The tool ideates with you. You’re feeding it. It’s throwing out suggestions.” And it’s not just a powerful marketing tool — it’s a vital tool for users across the organization. Since it’s able to ensure that the right voice and message is standard across the organization, users without technical writing skills can easily produce the content they need in company style.
It’s also invaluable in the strategy planning phase and customer case studies, able to synthesize content from interviews and discussions across multiple teams, boiling it down to an analysis of the customer, their pain point, and their perspective on the market, the product and competitors.
“You can roll it up into an executive summary of the whole piece,” she said. “That saves hours and hours. It will also inevitably save the industry a ton of money. I still believe in the art of research and researchers, but the price that you pay in synthesizing and analyzing comes in time and money. It was a huge asset there as well.” What’s deeply important is that it be human-led AI, human-trained AI, she added. Though it will replace some jobs, because content has always been king, and this is a tool for content creation at a massive scale, it’s critical that a human controls and supervises the process from start to finish.
"
|
2,603 | 2,023 |
"Hollywood's battle over AI and 3D scanning, explained | VentureBeat"
|
"https://venturebeat.com/ai/hollywoods-strike-battle-over-ai-and-3d-scanning-has-been-decades-in-the-making"
|
Hollywood has been largely shut down for more than 100 days now, after the union representing screenwriters, the Writers Guild of America (WGA), voted to go on strike on May 1.
The writers were soon followed by the actors’ union, the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA), on July 13, marking the first time in 63 years that both major unions were on strike at the same time.
Both unions have objected to contract renewal proposals from the Alliance of Motion Picture and Television Producers (AMPTP). A key sticking point is the use of artificial intelligence (AI) and 3D scanning technology. The producers, and the major movie studios behind them, want a broad license to use the tech however they wish. The writers and actors want an agreement on specific rules for how, when and where it can be used.
While the two sides continue to duke it out through their negotiators, VentureBeat took a close look at the actual tech at issue, and discovered that there is an important distinction to be made if the dueling sides are to come to a mutually satisfactory agreement: 3D scanning is not the same as AI, and most vendors only offer one of the two technologies for filmmaking.
The tech vendors largely also believe actors and writers should be compensated for their work in whatever form it takes, and that the vendors’ business would suffer if actors were replaced with 3D doubles and writers with generated scripts.
But things are changing quickly. VentureBeat learned of plans by an AI vendor, Move.ai, to launch next month a new motion capture app using a single smartphone camera — a development that would radically reduce the cost and complexity of making 3D digital models move. Separately, a 3D scanning company, Digital Domain, shared its intent to use AI to create “fully digital human” avatars powered by AI chatbots.
3D scanning is not the same as AI, and only one is truly new to Hollywood While some 3D scanning companies are pursuing AI solutions for helping them create interactive 3D models of actors — known variously as digital humans, digital doubles, digital twins, or virtual doppelgängers — 3D scanning technology came to Hollywood long before AI was readily available or practical, and AI is not needed to scan actors.
However, if realistic 3D scans are to one day replace working actors — perhaps even in the near future — an additional, separate layer of AI will likely be needed to help the 3D models of actors move, emote and speak realistically. That AI layer largely does not exist yet. But companies are working on tech that would allow it.
Understanding exactly who are some of the tech vendors behind these two separate and distinct technologies — 3D scanning and AI — and what they actually do is imperative if the conflicting sides in Hollywood and the creative arts more generally are to forge a sustainable, mutually beneficial path forward.
Yet in Hollywood, you could be forgiven for thinking that both technologies — AI and 3D scanning — are one and the same.
Duncan Crabtree-Ireland, the chief negotiator for SAG-AFTRA, revealed that the studios proposed a plan in July to 3D-scan extras or background actors and use their digital likenesses indefinitely. This proposal was swiftly rejected by the union. “We came into this negotiation saying that AI has to be done in a way that respects actors, respects their human rights to their own bodies, voice, image and likeness,” Crabtree-Ireland told Deadline.
Meanwhile there have been increasing reports of actors being subjected to 3D scanning on major movie and TV sets, causing unease within the industry.
One social media post captured the mood: “The first week of the strike, a young actor (early 20s) told me she was a BG actor on a Marvel series and they sent her to ‘the truck’ – where they scanned her face and body 3 times. Owned her image in perpetuity across the Universe for $100. Existential, is right.”
The main conflict Though 3D actor scanning has been around for years, Hollywood executives like those at Disney are reportedly excited about the addition of generative AI to it, and about AI’s overarching prospects for new, more cost-effective storytelling. But the increasing availability of the technology has also sparked major concerns from writers and actors as to how their livelihoods and crafts will be affected.
When it comes to Hollywood writers, the recent launch of a number of free, consumer-facing, text-to-text large language model (LLM) applications such as ChatGPT, Claude and LLaMA have made it much easier for people to generate screenplays and scripts on the fly.
Reid Hoffman, a backer of ChatGPT maker OpenAI, even wrote a whole book with ChatGPT and included sample screenplay pages.
Another app, Sudowrite , based on OpenAI’s GPT-3, can be used to write prose and screenplays, but was the target of criticism several months ago from authors who believed that it was trained on unpublished work from draft groups without their express consent.
Sudowrite’s founder denied this.
Meanwhile, voice cloning AI apps like those offered by startup ElevenLabs and demoed by Meta are also raising the prospect that actors won’t even need to record voiceovers for animated performances, including those involving their digital doubles.
Separately, though 3D body-scanning is now making headlines thanks to the actors’ strike, the technology behind it has actually been around for decades, introduced by some of cinema’s biggest champions and auteurs, including James Cameron, David Fincher, and the celebrated effects studio Industrial Light and Magic (ILM).
Now with the power of generative AI, those 3D scans that were once seen as extensions of a human actor’s performance on a set can be repurposed and theoretically used as the basis for new performances that don’t require the actor — nor their consent — going forward. You could even get an AI chatbot like ChatGPT to write a script and have a digital actor perform it. But because of the inherent complexity of these technologies, they are all generally, and improperly, conflated into one, grouped under the moniker du jour, “AI.” The long history of 3D scanning “We’ve been at this for 28 years,” said Michael Raphael, CEO, president and founder of Direct Dimensions , in an exclusive video interview with VentureBeat.
Direct Dimensions is a Baltimore-based 3D scanning company that builds the scanning hardware behind some of the biggest blockbusters in recent years, including Marvel’s Avengers: Infinity War and Avengers: Endgame.
The firm’s first subject in Hollywood was actor Natalie Portman for her Oscar-winning turn in the 2010 psychosexual thriller Black Swan.
Raphael, an engineer by training, founded the company in 1995 after working in the aerospace industry, where he helped develop precision 3D scanning tools for measuring aircraft parts, including an articulating arm with optical encoders in the joints.
However, as the years passed and technology became more advanced, the company expanded its offerings to include other scanning hardware such as laser scanning with lidar (light detection and ranging sensors, such as the kind found on some types of self-driving cars), as well as still photos taken by an array of common digital single-lens reflex (DSLR) cameras and stitched together to form a 3D image, a technique known as photogrammetry.
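Photogrammetry pipelines differ between vendors, but they generally start by finding the same physical points across overlapping photos and then triangulating those points into 3D. The OpenCV snippet below shows only that first matching step; the file names are placeholders and this is not Direct Dimensions' software.

```python
# First step of a typical photogrammetry pipeline: match features between two
# overlapping photos. Illustrative only; the image paths are placeholders.
import cv2

img1 = cv2.imread("view_left.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view_right.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Brute-force matching with a ratio test to keep only confident correspondences
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

print(f"{len(good)} matched points; a full pipeline would triangulate these into 3D")
```

Repeating this across dozens or hundreds of photos, then solving for camera positions and point depths, is what turns a ring of DSLR shots into a 3D model.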
Today, Direct Dimensions works not only on movies, but on imaging industrial parts for aerospace, defense and manufacturing; buildings and architecture; artworks and artifacts; jewelry; and basically any object from the small to the very large. In fact, Hollywood has only ever made up a small portion of Direct Dimensions’ business; most of it is precision 3D scanning for other, less glamorous industries.
“We scan anything you can think of for basically engineering or manufacturing purposes,” Raphael told VentureBeat.
In order to scan small objects, Direct Dimensions created its own in-house hardware: an automated, microwave-sized scanner it calls the Part Automated Scanning System (PASS).
Importantly, Direct Dimensions does not make its own AI software nor does it plan to. It scans objects and turns them into 3D models using off-the-shelf software like Autodesk’s Revit.
The short list of 3D scanners Raphael said Direct Dimensions was only one of about a “dozen” companies around the world offering similar services. VentureBeat’s own research revealed the following names: Avatar Factory (Australia) Digital Domain (United States) Direct Dimensions (United States) Eisko (France) Scan Engine/Unit Image (France) ScanLab Photogrammetry (Canada) Reblika (Netherlands) Lola VFX (United States) One such 3D scanning company, Avatar Factory from Australia, is run by a family of four: husband and wife Mark and Kate Ruff, and their daughters Amy and Chloe.
The company was founded in 2015 and offers a “cyberscanning” process involving 172 cameras mounted around the interior of a truck.
This allows it to provide mobile 3D scanning of actors on locations outside of studios — say, landscapes and exteriors. Like Direct Dimensions, the company also offers prop scanning.
Among the notable recent titles for which Avatar Factory has performed 3D scanning are Mortal Kombat , Elvis and Shantaram (the Apple TV series).
“The Avatar Factory create photo-realistic 3D digital doubles that are used for background replacement, as well as stunt work that is too dangerous to be performed by actual stunt doubles,” explained Chloe Ruff, Avatar Factory’s CEO, chief technology officer (CTO) and head of design, in an email to VentureBeat.
While Ruff said that Avatar Factory had used 3D scanning of multiple extras or background actors to create digital crowd scenes, she also said that without the variety they contributed, it would be detrimental to the work.
“As so much of our work is for background replacement we see hundreds of extras and background actors come through our system on a typical shoot day,” Ruff wrote. “Having extras and background actors be on a film set is fundamental to our business operations and we couldn’t do what we do without them. It would be devastating to the industry and our business if all of those actors were to be replaced by AI, like some studios are suggesting.” AI-assisted 3D scanning is in the works Separately, rival 3D scanning company Digital Domain, co-founded in 1993 by James Cameron, legendary effects supervisor Stan Winston and former ILM general manager Scott Ross, declined to comment for this story on the controversy over scanning background actors.
However, a spokesperson sent VentureBeat a document outlining the company’s approach to creating “digital humans,” 3D models of actors derived from thorough, full-body scans that are “rigged” with points that allow motion. The document contains the following passage: “In most cases, direct digital animation is used for body movements only, while facial animation almost always has a performance by a human actor as the underlying and driving component. This is especially true when the dialog is part of the performance.” The Digital Domain document goes on to note the increasing role of AI in creating digital humans, saying, “We have been investigating the use of generative AI for the creation of digital assets. It’s still very early days with this technology, and use cases are still emerging.” The document also states: “We feel the nuances of an actor’s performance in combination with our AI & Machine Learning tool sets is critical to achieving photo realistic results that can captivate an audience and cross the uncanny valley.
“That said, we are also working on what we call Autonomous Virtual Human technology. Here we create a fully digital human, either based on a real person or a synthetic identity, powered by generative AI components such as chatbots. The goal is to create a realistic virtual human the user can have a conversation or other interaction with. We believe that the primary application of this technology is outside of entertainment, in areas such as customer service, hospitality, healthcare, etc…” Industrial Light and Magic (ILM) was at the forefront How did we get here? Visual effects and computer graphics scholars point to the 1989 sci-fi film The Abyss, directed by James Cameron of Titanic, Avatar, Aliens and Terminator 2 fame, as one of the first major movies to feature 3D scanning tech.
Actors Ed Harris and Mary Elizabeth Mastrantonio both had their facial expressions scanned by Industrial Light and Magic (ILM), the special effects company founded earlier by George Lucas to create the vivid spacefaring worlds and scenery of Star Wars , according to Redshark News.
ILM used a device called the Cyberware Color 3-D Digitizer, Model 4020 RGB/PS-D, a “plane of light laser scanner” developed by a defunct California company for which the device was named. The U.S. Air Force later got ahold of one for military scanning and reconnaissance purposes, and wrote about it thusly : “This Cyberware scanning system is capable of digitizing approximately 250,000 points on the surface of the head, face, and shoulders in about 17 seconds. The level of resolution achieved is approximately 1 mm.” For The Abyss , ILM scanned actors to create the “pseudopod,” a watery shapeshifting alien lifeform that mimicked them. This holds the distinction of being the first fully computer-generated character in a major live-action motion picture, according to Computer Graphics and Computer Animation: A Retrospective Overview , a book from Ohio State University chronicling the CGI industry’s rise, by Wayne E. Carlson.
Raphael also pointed to 2008’s The Curious Case of Benjamin Button , starring Brad Pitt as a man aging in reverse, complete with visual effects accompanying his transformation from an “old baby” into a young elderly person, as a turning point for 3D actor-scanning technology.
“ Benjamin Button pioneered the science around these types of human body scanning,” Raphael said.
Pressing the ‘Benjamin Button’ When making Benjamin Button , director David Fincher wanted to create a realistic version of lead star Brad Pitt both old and young. While makeup and prosthetics would traditionally be used, the director thought this approach would not give the character the qualities he wanted.
He turned to Digital Domain, which in turn looked to computer effects work from Paul Debevec , a research adjunct professor at the University of Southern California’s (USC) Institute for Creative Technologies (ICT), who today also works as a chief researcher at Netflix’s Eyeline Studios.
According to Debevec’s recollection in a 2013 interview with the MPPA’s outlet The Credits , Fincher “had this hybrid idea, where they would do the computer graphics for most of the face except for the eyeballs and the area of skin around the eyes, and those would be filmed for real and they’d put it all together.” In order to realize Fincher’s vision, Digital Domain turned to Debevec and asked him to design a “lighting reproduction” system whereby they could capture light and reflections in Pitt’s eyes, and superimpose the eyes onto a fully digital face.
Debevec designed such a system using LED panels arranged like a cube around the actor, and later, brought in a physical sculpture of Pitt’s head as a 70-year-old man and used the system to capture light bouncing off that.
“Ever since I started seriously researching computer graphics, the whole idea of creating a photo-real digital human character in a movie, or in anything, was kind of this Holy Grail of computer graphics,” Debevec told The Credits.
The approach worked: The Curious Case of Benjamin Button went on to win the 2009 Academy Award for Best Achievement in Visual Effects.
And, the team got closer to Debevec’s “Holy Grail,” by creating a fully CGI human face.
According to Mark Ruff of Avatar Factory, the fact that Benjamin Button achieved such a lifelike representation of Brad Pitt, yet Pitt continues to act in new films, helps explain why 3D scans will not be displacing human actors anytime soon.
“It was conceivable back then that Brad Pitt no longer needed to appear in future films,” Mark told VentureBeat. “His avatar could complete any future performance. Yet, we still see Brad Pitt acting. Even if Brad Pitt were scanned and did not perform himself ever again in a film, I am sure his agent would still acquire a premium for his identity.” Say hello to digital humans Today, many companies are pursuing the vision of creating lifelike 3D actors — whether they be doubles or fully digital creations.
As The Information reported recently, a number of startups — Hyperreal, Synthesia, Soul Machines and Metaphysic — have all raised millions on the promise they could create realistic 3D digital doubles of leading A-list stars in Hollywood and major sports.
This would allow stars to reap appearance fees without ever setting foot on set (while the agents took a cut). In fact, it could create a whole new revenue stream for stars, “renting” out their likenesses/digital twins while they pursue higher-quality, more interesting, but possibly lower-paying passion projects.
In July, VentureBeat reported that Synthesia actually hired real actors to create a database of 39,765 frames of dynamic human motion that its AI would train on. This AI will allow customers to create realistic videos from text, though the ideal use case is more for company training videos, promotions and commercials rather than full feature films.
“We’re not replacing actors,” the company’s CEO, Jon Starck, told VentureBeat. “We’re not replacing movie creation. We’re replacing text for communication. And we’re bringing synthetic video to the toolbox for businesses.” At the same time, he said that an entire movie made out of synthetic data was likely in the future.
The industry is moving fast from the days when deepfake images of Tom Cruise plastered on TikTok creators’ faces (powered by the tech that went on to become Metaphysic) and Bruce Willis renting out his own deepfake were making headlines.
Now, just one or two years later, “many stars and agents are quietly taking meetings with AI companies to explore their options,” according to The Information’s sources.
AI-driven motion capture Of course, creating a digital double is a lot easier said than done. And then, animating that double to move realistically is another ballgame entirely.
Motion capture — the technology that allows human movements to be reproduced in animation or computer graphics — has been around for more than 100 years, but the modern tools didn’t come into use until the 1980s.
And then, for the subsequent two decades, it mostly involved covering actors in tight-fitted bodysuits covered with ping pong-ball like markers, and using specialized cameras to map their movements onto a digital model or “skeleton” that could be turned into a different character or re-costumed with computer graphics.
But today, thanks to advances in AI and software, human motion can be captured with a set of smartphones alone, without the need for pesky suits and markers. One such company taking the “markerless” smartphone route is U.K.-based Move.ai, founded in 2019 to capture athletes’ movements, which has since branched off into video games and film.
“Creating 3D animation might seem like quite a niche market, but it’s actually a huge market, over $10 billion,” said Tino Millar, CEO and cofounder of Move.ai, in a video interview with VentureBeat.
Millar said that in the past, animating the motion of 3D characters was done largely “by hand.” Even those animators using longstanding software such as Blender or Cinema 4D have to spend many hours training and educating themselves on the tools in order to achieve the quality necessary for major films.
The other alternative, the marker and tight-fitted suit approach described above, is similarly time-intensive and requires an expensive studio setup and multiple infrared cameras.
“What we’ve come along and done is, using AI and a few other breakthroughs in understanding human motion in physics and statistics, is that we believe we can make it 100 to 1,000 times cheaper to do than with motion capture suits, while maintaining the quality, and making it much more accessible to people,” Millar said.
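Move.ai's models are proprietary, but the basic idea of markerless capture, estimating body joints directly from ordinary video, can be demonstrated with an off-the-shelf pose estimator. The MediaPipe sketch below pulls rough per-frame 3D joint positions from a single clip; it is a stand-in for illustration, not Move.ai's technology, and the video path is a placeholder.

```python
# Rough markerless-capture sketch: estimate 3D body landmarks for each video frame.
# Illustrative only; this is not Move.ai's system.
import cv2
import mediapipe as mp

pose = mp.solutions.pose.Pose(static_image_mode=False)
capture = cv2.VideoCapture("performer.mp4")  # placeholder clip

frames = []
while True:
    ok, frame = capture.read()
    if not ok:
        break
    results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.pose_world_landmarks:
        # 33 landmarks with x, y, z coordinates relative to the hips
        joints = [(lm.x, lm.y, lm.z) for lm in results.pose_world_landmarks.landmark]
        frames.append(joints)

capture.release()
print(f"captured {len(frames)} frames of skeleton data for retargeting onto a 3D rig")
```

Production systems like the ones Millar describes add multi-camera fusion, physics constraints and cleanup to reach film quality, but the per-frame skeleton above is the raw material.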
In March 2023, Move.ai launched a consumer-facing smartphone app that requires at least two (and up to six) iPhones running iOS 16 to be positioned around a person to capture their motion.
Since then, “it’s being used by top game companies around the world, top film and TV productions, [and] content creators at home creating video for YouTube and TikTok,” Millar said.
Move.ai also supports Android devices in an “experimental” mode, and Millar told VentureBeat the company plans to launch a single-smartphone camera version of its app next month, September 2023, which would further reduce the barrier to entry for aspiring filmmakers.
AI’s increasing availability to consumers stokes fears So, to recap: 3D scanning and improved motion-capture tech has been in the works in Hollywood for years, but has lately become much more affordable and ubiquitous, and AI tech has only recently become publicly available to consumers and Hollywood.
“It’s one thing to have these [3D] assets, and they’ve had these assets for 10 years at least,” said Raphael of Direct Dimensions. “But the fact that you’re adding all this AI to it, where you can manipulate assets, and you can make crowd scenes, parade scenes, audiences, all without having to pay actors to do that — the legality of all this still needs to be worked out.” This trickle-down effect of both technologies has come just as the actors and writers had to renegotiate their contracts with studios, and as the studios have embraced yet another new technology — streaming video.
All of which has concocted a stew of inflated hype, real advances, fear and fearmongering, and mutual misunderstandings that have boiled over into the standoff that has now gone on for more than 100 days.
“I can only speculate,” Millar of Move.ai said. “But AI is much more in popular culture. People are much more aware of it. There is AI in their devices now. In the past, people weren’t aware of it because it was only being used by high-end production companies. The high end will always have the bleeding edge, but a lot of this technology is filtering down to consumers.”
"
|
2,604 | 2,023 |
"First look: Inside Modyfi’s push to build the future of graphic design | VentureBeat"
|
"https://venturebeat.com/ai/first-look-inside-modyfis-push-to-build-the-future-of-graphic-design"
|
Modyfi , a startup founded by Snap and Amazon alums, has launched a public beta of its AI-powered graphic design software and announced on Wednesday a $7 million funding round led by NEA.
The newly-released app is browser-based and looks very similar to the dual-columned interface found in other graphic design applications like Adobe’s Creative Cloud software or Affinity’s suite of programs. The significant difference is found in the new AI-powered command field in the top middle of the UI, and the new approach to adaptive and context-aware pattern and placement tools.
The role of AI in the design space has captured the attention of the largest players , with OpenAI’s very first acquisition being Global Illumination Inc, a studio that builds “creative tools, infrastructure, and digital experiences,” as OpenAI wrote in the blog post announcing the acquisition.
In an exclusive interview with VentureBeat, Modyfi cofounder Joseph Burfitt showed off how easy it was to create eye-catching designs, and explained the company’s vision and future plans. “The three key things that we care about the most [are the] graphic design suite, process and collaboration, and then the AI capabilities,” he said.
Burfitt demonstrated how Modyfi allows designers to quickly conceptualize designs by dragging and dropping images, applying effects and modifiers with natural language commands, and generating variations using its “image-guided generation” model. The tool also enables real-time collaboration so multiple designers can work on a file simultaneously.
Burfitt explained that because familiarity is key, Modyfi was built with the kind of modern interface that design professionals expect to see. The new layer of AI capabilities is intended to reduce some of the ambiguity of the production process. “We also want to bring in elements where it enables the designer to remove more of that process [where] someone says, ‘Hey, can you make this image pop?’ What does [that] actually mean to someone like a graphic designer? “So rather than going backwards and forwards with the client,” Burfitt said, “the AI within the chat window can just say ‘What do you mean by pop? Increase the vibrancy? Change the saturation?’” From stealth mode to rapid scaling Burfitt explained that Modyfi had been working in stealth mode for about 18 months to build out the application before launching the public beta. The company wanted to make sure the product was solid and reliable before widely promoting it, as losing users’ work was not an option.
“We haven’t [yet] massively gone wide right now, [as] we want to make sure that it’s a very solid work and performant. We have to win the trust of our users. We can never lose any kind of content which they create on the platform,” said Burfitt.
Now that the beta is available, Burfitt said Modyfi will start ramping up awareness and growing its user base. But it plans to do so gradually to maintain quality as more people sign up and start using the design tools.
Despite a growing user count, heavy compute requirements have proven less of a concern than they might have been. Burfitt and his team mitigated the application’s computational demands by deploying a distributed service that can draw on GPU resources from multiple cloud providers, which has allowed them to scale to meet demand.
“So when people are asleep in Japan, Australia, we can actually ship our processing overseas,” said Burfitt. This avoids capacity constraints in U.S.-based compute or GPU service availability, he explained.
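The routing idea Burfitt describes can be illustrated with a small scheduler that prefers cloud regions where it is currently overnight, on the assumption that GPU capacity there is most likely to be idle. The sketch below is purely illustrative, with hypothetical region names, and is not Modyfi’s actual scheduler.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Hypothetical GPU regions and their local time zones (illustrative only).
REGIONS = {
    "us-west": "America/Los_Angeles",
    "ap-tokyo": "Asia/Tokyo",
    "ap-sydney": "Australia/Sydney",
}

def is_off_peak(tz_name: str, now_utc: datetime) -> bool:
    """Treat roughly 11pm-6am local time as off-peak for that region."""
    local = now_utc.astimezone(ZoneInfo(tz_name))
    return local.hour >= 23 or local.hour < 6

def pick_region(now_utc: datetime) -> str:
    """Prefer a region that is currently off-peak; otherwise fall back to the first region."""
    for name, tz in REGIONS.items():
        if is_off_peak(tz, now_utc):
            return name
    return next(iter(REGIONS))

# Example: route a render job based on the current time.
print(pick_region(datetime.now(ZoneInfo("UTC"))))
```

A production system would also weigh current queue depth, data-transfer latency and per-region pricing rather than time of day alone.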
He also mentioned that they are using a new web standard called WebGPU, which provides even better GPU performance within browsers than previous technologies like WebGL. By taking advantage of users’ own GPU acceleration, Modyfi is able to perform tasks like background removal — and soon, depth matching and upscaling — much faster when accessing the local hardware.
Funding to support further development Looking ahead, Burfitt pointed to plans to expand Modyfi’s AI capabilities across different design styles, while keeping designers in control. He also emphasized the importance of collaboration and said the browser-based tool could evolve to be more conversational and intuitive over time.
On the business side, Burfitt said the $7 million in funding from NEA will primarily go towards further development and bringing on more engineers to tackle the complex challenges of building a graphic design platform. “Super excited to have them on. It’s incredible to have a caliber of VC like NEA who see our vision as much as we do. So [we’re] very, very excited to have them on and [we’ve] been utilizing that from a developer perspective.” With early traction among top companies, Modyfi aims to push the boundaries of what’s possible at the intersection of design and AI.
“We’ve got thousands of people using this right now and hundreds of companies like Snapchat, Reddit, Stripe — the Nvidia creative team are even using it. So [we’re] pretty, pretty excited about the distribution so far,” said Burfitt.
"
|
2,605 | 2,023 |
"Datadog launches AI helper Bits and new model monitoring solution | VentureBeat"
|
"https://venturebeat.com/ai/datadog-launches-ai-helper-bits-and-new-model-monitoring-solution"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Datadog launches AI helper Bits and new model monitoring solution Share on Facebook Share on X Share on LinkedIn Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Today, New York-based Datadog , which delivers cloud observability for enterprise applications and infrastructure, expanded its core platform with new capabilities.
At its annual DASH conference, the company announced Bits, a novel generative AI assistant to help engineers resolve application issues in real-time, as well as an end-to-end solution for monitoring the behavior of large language models (LLMs).
The offerings, particularly the new AI assistant, are aimed at simplifying observability for enterprise teams. However, they are not generally available just yet. Datadog is testing the capabilities in beta with a limited number of customers and will make them generally available at a later stage.
New way to help with issue detection and remediation When it comes to monitoring applications and infrastructure, teams have to do a lot of grunt work – right from detecting and triaging an issue to remediation and prevention. Even with observability tools in the loop, this process requires sifting through massive volumes of data, documentation and conversations from disparate systems. This can take up hours, sometimes even days.
With the new Bits AI, Datadog is addressing this challenge by giving teams a helper that can assist with end-to-end incident management while responding to natural language commands. Accessible via chat within the company platform, Bits learns from customers’ data — covering everything from logs, metrics, traces and real-user transactions to sources of institutional knowledge like Confluence pages, internal documentation or Slack conversations — and uses that information to provide quick answers about issues and suggest troubleshooting or remediation steps in conversational language.
This ultimately improves the workflow of users and reduces the time required to fix the problem at hand.
“LLMs are very good at interpreting and generating natural language, but presently they are bad at things like analyzing time-series data, and are often limited by context windows, which impacts how well they can deal with billions of lines of logging output,” Michael Gerstenhaber, VP of product at Datadog, told VentureBeat. “Bits AI does not use any one technology but blends statistical analysis and machine learning that we’ve been investing in for years with LLM models in order to analyze data, predict the behavior of systems, interpret that analysis and generate responses.” Datadog uses OpenAI ’s LLMs to power Bits’ capabilities. The assistant can coordinate a response by assembling on-call teams in Slack and keeping all stakeholders informed with automated status updates. And, if the problem is at the code level, it provides a concise explanation of the error with a suggested code fix that could be applied with a few clicks and a unit test to validate that fix.
Notably, Datadog’s competitor New Relic has also debuted a similar AI assistant called Grok.
It too uses a simple chat interface to help teams keep an eye on and fix software issues, among other things.
Tool for LLM observability Along with Bits AI, Datadog also expanded its platform with an end-to-end solution for LLM observability. This offering stitches together data from gen AI applications, models and various integrations to help engineers quickly detect and resolve problems.
As the company explained, the tool can monitor and alert about model usage, costs and API performance. Plus, it can analyze the behavior of the model and detect instances of hallucinations and drift based on different data characteristics, such as prompt and response lengths, API latencies and token counts.
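As an illustration of the kind of signals such a tool collects, the sketch below wraps a generic LLM call and records prompt and response lengths, token counts and latency. It is a hypothetical example and does not use Datadog’s actual APIs or SDKs.

```python
import time
from dataclasses import dataclass

@dataclass
class LLMCallRecord:
    prompt_chars: int
    response_chars: int
    prompt_tokens: int
    completion_tokens: int
    latency_ms: float

def observe_llm_call(call_model, prompt: str) -> tuple[str, LLMCallRecord]:
    """Wrap any model-calling function and capture basic telemetry.

    `call_model` is assumed to return (response_text, prompt_tokens, completion_tokens).
    """
    start = time.perf_counter()
    response, prompt_tokens, completion_tokens = call_model(prompt)
    latency_ms = (time.perf_counter() - start) * 1000
    record = LLMCallRecord(
        prompt_chars=len(prompt),
        response_chars=len(response),
        prompt_tokens=prompt_tokens,
        completion_tokens=completion_tokens,
        latency_ms=latency_ms,
    )
    return response, record

# Example with a stubbed model; a real deployment would forward these records
# to a monitoring backend and alert on drift in lengths, token counts or latency.
fake_model = lambda p: ("ok", len(p.split()), 1)
print(observe_llm_call(fake_model, "summarize the incident timeline")[1])
```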
While Gerstenhaber declined to share the number of enterprises using LLM Observability, he did note that the offering brings together what usually are two separate teams: the app developers and ML engineers. This allows them to collaborate on operational and model performance issues such as latency delays, cost spikes and model performance degradations.
That said, even here, the offering has competition.
New Relic and Arize AI both are working in the same direction and have launched integrations and tools aimed at making running and maintaining LLMs easier.
Moving ahead, monitoring solutions like these are expected to be in demand, given the meteoric rise of LLMs within enterprises. Most companies today have either started using or are planning to use such tools (most prominently those from OpenAI) to accelerate key business functions, from querying their data stack to optimizing customer service.
Datadog’s DASH conference runs through today.
"
|
2,606 | 2,023 |
"AI video creation app Captions bags $25M from top VCs | VentureBeat"
|
"https://venturebeat.com/ai/ai-video-creation-app-captions-bags-25m-from-top-vcs"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages AI video creation app Captions bags $25M from top VCs Share on Facebook Share on X Share on LinkedIn Credit: Captions Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Captions, an AI-powered video creation app for iOS that launched in 2021 and counts more than three million users to date, today announced $25 million in Series B funding led by Kleiner Perkins, with additional funding provided by Sequoia Capital, Andreessen Horowitz (a16z) and SV Angel.
The company has now raised a total of $40 million. It was founded by Gaurav Misra, former head of design engineering at Snap and a former software development engineer (II) at Microsoft.
Captions initially carved its niche as a camera application catering to “talking videos,” where creators engage with the camera directly. In conjunction with its funding announcement, the company unveiled a rebranding and new look, which Misra wrote in a blog post “We believe properly represents the defining characteristics of Captions.” Those characteristics span a number of features, from AI-powered redubbing of a speaker’s words into another language while preserving the sound of their voice, to human-like AI voiceovers, to AI-powered short-clip production from a longer piece of video, to unique, original, AI-created royalty-free music for videos.
Transforming to a holistic creative suite Evolving from its roots, Captions has since become a holistic creative suite, leading the charge in transforming content creation through state-of-the-art AI functionalities, leveraging OpenAI’s GPT-4 large language model (LLM) and an AI Eye Contact feature developed with Nvidia.
“We recognized the need among creators using teleprompters and off-screen scripts and collaborated with Nvidia to develop the eye contact correction feature specifically for this use case,” Gaurav Misra, Captions’ cofounder and CEO, said in a statement provided to VentureBeat via a spokesperson. “Upon its launch earlier this year, it became an instant hit, and now many other companies are developing similar technology inspired by Captions.” The AI adjusts a video host’s gaze in real time, creating the illusion that they are making eye contact with the camera.
Significant traction among business users While Captions is essentially consumer-focused, it has garnered traction among business users as well, especially those involved in social media management, content marketing, and growth marketing.
The platform appeals to a diverse audience including, but not limited to, influencers, realtors, fitness trainers, musicians and sales professionals.
Notable accounts utilizing Captions include the Disney-owned sports network ESPN and its commentator Omar Raja, “Mr. Wonderful” of Shark Tank fame, Twitch’s founder Justin Kan and the influencer Unnecessary Inventions.
Everett Randle, a partner at Kleiner Perkins, lauded Captions, stating: “AI is redefining how digital products are created, and Captions has emerged as the leading AI-powered platform for video content. Millions of users have already leveraged it to tell their stories and engage audiences. Gaurav and the Captions team have deep empathy for and understanding of creators from their time at Snap that we believe is critical in building within this new era of AI-enabled product development. We’re thrilled to lead Captions’ Series B and support the creation of the AI-powered creative studio.” Mike Vernal from Sequoia Capital also heaped praise on Misra and Captions in a statement, writing: “Captions’ use of AI for video production has the potential to revolutionize the studio experience for today’s creator economy. The user enthusiasm and loyalty Captions has experienced this early on is a clear indication of their future success.”
"
|
2,607 | 2,023 |
"Actors' worst fears come true? 4D Gaussian splatting is here | VentureBeat"
|
"https://venturebeat.com/ai/actors-worst-fears-come-true-new-4d-gaussian-splatting-method-captures-human-motion"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Actors’ worst fears come true? New 4D Gaussian splatting method captures human motion Share on Facebook Share on X Share on LinkedIn Credit: VentureBeat made with Midjourney Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
As the Hollywood actors’ strike marches forward towards its 100th day with no resolution in sight , a technological leap has just rendered one of the actors’ biggest complaints even more possible: 3D scanning of human bodies in motion, potentially allowing for actors’ performances and mannerisms to be captured and stored as a 3D model that could be re-used by studios in perpetuity.
Although 3D scanning technology has been around in Hollywood for decades , it has typically involved a complex and time-consuming setup — multiple cameras arranged 360-degrees around an actor’s body, or, in the case of capturing motion, using ping-pong ball like “markers” placed directly on the actor and a tight-fitted bodysuit. Even recent advances using AI, such as the UK startup Move AI , generally rely on multiple cameras (though Move has a new single camera app now in limited, invitation-only release).
But now a new approach has emerged: Gaussian splatting, a rendering technique that has in recent years been used to capture static 3D imagery from a single 2D camera moved in a sequence around an object, has been modified by researchers at Huawei and the Huazhong University of Science and Technology in China to capture dynamic motion in 3D as well, including human body motions.
Their method is called “4D Gaussian splatting,” because time, being the fourth dimension, is the new feature, allowing for the image to change over time.
Why motion is so tricky for Gaussian splatting 3D Gaussian splatting was devised for scanning objects with lasers in 2001 by researchers at MIT, ETH Zurich, and Mitsubishi.
It uses collections of particles to represent a 3D scene, each with its own position, rotation, and other attributes. Each point is also assigned an opacity and a color, which can change depending on the view direction. In recent years, Gaussian splatting has come a long way and can now be rendered in modern web browsers and made from a collection of 2D images on a user’s smartphone.
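A minimal sketch of the per-point attributes described above might look like the following. The exact fields and types vary between implementations; this is an illustrative simplification rather than code from any specific Gaussian splatting library.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class GaussianPoint:
    position: np.ndarray   # 3D center (x, y, z)
    rotation: np.ndarray   # orientation as a quaternion (w, x, y, z)
    scale: np.ndarray      # per-axis extent of the Gaussian
    opacity: float         # how strongly the point contributes when rendered
    sh_coeffs: np.ndarray  # spherical-harmonic color coefficients, so color varies with view direction

# A scene is simply a large collection of these points, rasterized ("splatted") onto the image plane.
scene = [
    GaussianPoint(
        position=np.zeros(3),
        rotation=np.array([1.0, 0.0, 0.0, 0.0]),
        scale=np.full(3, 0.01),
        opacity=0.8,
        sh_coeffs=np.zeros((16, 3)),
    )
]
print(len(scene))
```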
However, as the researchers write in a new paper published October 12 simultaneously on Github and open-access site arXiv.org , “3D-GS [Gaussian splatting] still focuses on the static scenes. Extending it to dynamic scenes as a 4D representation is a reasonable, important but difficult topic. The key challenge lies in modeling complicated point motions from sparse input.” The main challenge is that when multiple Gaussian splats are joined together across different timestamps to create a moving image, each point “deforms” from image to image, creating inaccurate representations of the shapes and volumes of the objects (and subjects) in the images.
However, the researchers were able to overcome this by maintaining only “one set of canonical 3D Gaussians” and training a model to predict where and how each Gaussian moves and deforms from one timestamp to the next.
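Conceptually, the pipeline keeps a single canonical point set and applies a time-conditioned deformation to it before rendering each frame. The toy sketch below illustrates only that structure; in the paper the deformation is a learned neural field, not the placeholder function used here.

```python
import numpy as np

def deform(canonical_positions: np.ndarray, t: float) -> np.ndarray:
    """Placeholder deformation: in the real method this is a learned,
    time-conditioned network predicting per-Gaussian offsets (plus rotation and scale changes)."""
    offsets = 0.05 * np.sin(t + canonical_positions)  # stand-in for the learned field
    return canonical_positions + offsets

# One canonical set of Gaussian centers, reused for every timestamp.
canonical = np.random.rand(1000, 3)

# Rendering a dynamic scene then amounts to deforming the canonical set per frame
# and splatting the result, instead of storing a separate point cloud per timestamp.
frames = [deform(canonical, t) for t in np.linspace(0.0, 1.0, 30)]
print(len(frames), frames[0].shape)
```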
What this looks like in practice is a 3D image of a person cooking on a pan, including chopping and stirring ingredients, as well as a dog moving nearby. Another example shows human hands breaking a cookie in half and yet another opening a toy egg to reveal a nested toy chick inside. In all cases, the researchers were able to achieve a 3D rotational effect, allowing a viewer to move the “camera” around the objects in the scene in 3D and see them from multiple angles and vantage points.
According to the researchers, their 4D Gaussian splatting method “achieves real-time rendering on dynamic scenes, up to 70 FPS at a resolution of 800×800 for synthetic datasets and 36 FPS at a resolution of 1352×1014 in real datasets, while maintaining comparable or superior performance than previous state-of-the-art (SOTA) methods.”
Next steps While the initial results are impressive, the scenes of motion the researchers captured in 3D each take 20 minutes, and last only a few seconds, far from the duration needed to cover an entire feature film, for example.
But for studios looking to capture a few of an actor’s motions and re-use them, it’s a great start. And for video game and XR/VR designers, it’s hard to imagine that this technique will not be useful.
And, as with many promising technological advances, the quality and quantity of what can be captured — over what time frame — is only likely to increase.
As the researchers write at the end of their paper, “this work is still in progress and we will explore higher rendering quality on complex real scenes in the subsequent development.”
"
|
2,608 | 2,023 |
"Anonybit raises $3M to further build out biometric security | VentureBeat"
|
"https://venturebeat.com/security/anonybit-raises-3m-to-further-build-out-biometric-security-platform-genie"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Anonybit raises $3M to further build out biometric security platform Genie Share on Facebook Share on X Share on LinkedIn Credit: VentureBeat made with Midjourney Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
The last time we wrote about Anonybit in 2022, the New York- and Tel Aviv-based biometric security startup had just emerged from stealth with a $3.5 million round, promising to offer enhanced security for enterprises’ employee biometric data by slicing it up into anonymized bits and distributing them through a peer-to-peer network.
The method is designed to prevent hackers from grabbing employee credentials and biometrics from single repositories known as “honeypots”, and using them to impersonate a legitimate employee, gaining illicit access to the company’s systems and data.
Today, Anonybit is announcing another $3 million extension round to its seed, bringing its total raised so far to $8 million.
The latest round was led by JAM FINTOP with participation from Connecticut Innovations, and Anonybit says it will use the cash to continue building out this flagship system for securing and anonymizing identity information, now called Anonybit Genie.
Alarming headlines spur increased interest “We all read the headlines and continue to be alarmed by the rapid pace of increased fraud, which suggests that existing solutions are mere bandaids,” said Frances Zelazny, Anonybit Co-Founder and CEO, in a statement. “When thinking about the root cause, it boils down to two things. Personal data is stored in central honeypots that are impossible to protect and we allow the use of this data to authenticate ourselves. It’s a never-ending cycle. Many players in the industry aim to expand their offerings to address more comprehensive use cases in identity, but only Anonybit can get us out of the current paradigm by making it safe to store biometrics.” Zelazny previously served in the 8200 unit within the Israel Defense Forces and worked on BioCatch and Tapingo.
Anonybit builds upon her previous experience in biometrics and payments, offering a “Decentralized Biometrics Cloud,” the Anonybit Genie system of passwordless authentication for employees built atop this cloud, and a decentralized data vault.
The decentralized biometrics cloud uses multi-party computing and zero-knowledge proofs, according to the company’s website.
Meanwhile, Anonybit Genie enables authentication by checking to see if each of the anonymized bits provide their portion of the requisite security credentials, without ever re-assembling the data into a complete whole. It supports Security Assertion Markup Language (SAML).
“Working across devices and applications, Anonybit returns an authentication response without any of the original biometric data components,” the company states.
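The general idea of matching against distributed shares without ever reconstructing the original template can be illustrated with a toy XOR-based secret-sharing scheme. This is a simplified illustration of the concept only and bears no relation to Anonybit’s actual protocol, which the company describes in terms of multi-party computation and zero-knowledge proofs.

```python
import secrets

def xor_all(chunks: list[bytes]) -> bytes:
    """XOR a list of equal-length byte strings together."""
    out = bytes(len(chunks[0]))
    for c in chunks:
        out = bytes(a ^ b for a, b in zip(out, c))
    return out

def split_secret(secret: bytes, n_shares: int = 3) -> list[bytes]:
    """XOR-based splitting: any n-1 shares reveal nothing about the secret."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n_shares - 1)]
    last = bytes(b ^ r for b, r in zip(secret, xor_all(shares)))
    return shares + [last]

# Enrollment: split a (toy) biometric template and store the shares on separate nodes.
template = b"feature-vector-bytes"
shares = split_secret(template)

# Matching: in a real MPC protocol each node computes on its own share and only the
# final yes/no decision is revealed; here we simply check that recombination is possible.
assert xor_all(shares) == template
print("shares stored:", len(shares))
```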
In addition, the decentralized data vault allows enterprises to store non-biometric yet still sensitive data in another anonymized layer running atop the cloud.
Of course, Anonybit’s tools are all SOC 2 Type II compliant , and the company is also ISO 27001 certified.
New appointments Anonybit also announced the appointment of two new leaders: Limor Elbaz, who previously worked in sales leadership at Imperva and Peerlyst, has been named Chief Revenue Officer. Meanwhile, Al Pascual, a former industry analyst at Javelin Research and co-founder of Breach Clarity, is joining Anonybit’s advisory board.
Armed with its new funding and commitment to enhancing security through anonymization and decentralization of data, Anonybit looks poised to continue growing and gaining a larger foothold in the fast-moving space of enterprise software security.
"
|
2,609 | 2,023 |
"Verkada unveils privacy updates to its security system and cameras | VentureBeat"
|
"https://venturebeat.com/ai/verkada-unveils-privacy-updates-to-its-security-system-and-cameras"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Verkada unveils privacy updates to its security system and cameras Share on Facebook Share on X Share on LinkedIn Credit: VentureBeat made with Midjourney Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
The physical security industry stands at a crossroads.
Video surveillance and analytics have rapidly transitioned to the cloud over the past decade, bringing enhanced connectivity and intelligence. But these same innovations also enable new potential for mass data collection, profiling and abuse.
As one of the sector’s leading cloud-based providers, Verkada , which offers a range of physical security measures including AI-equipped remote monitoring cameras, controllers, wireless locks, and more, is attempting to chart a privacy-first path forward amidst these emerging tensions.
The San Mateo-based company, which has brought over 20,000 organizations into the cloud security era, plans to roll out features focused on protecting identities and validating footage authenticity.
Set to launch today, the updates come at a pivotal moment for society and the way we exist in public and private places. Verkada has drawn significant backlash for past security lapses and controversial incidents.
However, its ability to balance innovation with ethics will reveal how it navigates the turbulent physical security industry.
Obscuring identities, validating authenticity In an interview, Verkada founder and CEO Filip Kaliszan outlined the motivation and mechanics behind the new privacy and verification features.
“Our mission is protecting people and property in the most privacy sensitive way possible,” Kaliszan said. “[The feature release] is about that privacy sensitive way of accomplishing our goal.” The first update focuses on obscuring identities in video feeds. Verkada cameras will gain the ability to automatically “blur faces and video streams” using principles similar to augmented reality filters on social media apps.
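As a rough illustration of automatic face blurring of this kind, the snippet below detects faces in a frame and blurs those regions using OpenCV. It is a generic example and is unrelated to Verkada’s actual pipeline, which presumably relies on more robust on-camera detection models.

```python
import cv2

def blur_faces(frame):
    """Detect faces in a BGR frame and blur each detected region in place."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(frame[y:y + h, x:x + w], (51, 51), 0)
    return frame

# Example usage (hypothetical file/stream): blur every frame before an operator sees it.
# cap = cv2.VideoCapture("camera_feed.mp4")
# ok, frame = cap.read()
# if ok:
#     cv2.imwrite("blurred.jpg", blur_faces(frame))
```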
Kaliszan noted security guards monitoring feeds “don’t really need to see all these details” about individuals until an incident occurs.
Making blurring the “default path” where possible is a priority, with the goal being “most videos washed with identities obfuscated.” In addition to blurring based on facial recognition, Verkada plans to implement “hashing of the video that we’re capturing on all of our devices…So we’re creating, you can think of it like a signature of the contents of the video as it is captured,” Kaliszan explained.
This creates a tamper-proof digital fingerprint for each video that can be used to validate authenticity.
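The tamper-evidence idea can be sketched with ordinary cryptographic hashing: fingerprint each video segment as it is captured, then re-hash later to confirm nothing has been altered. The example below is a generic illustration, not Verkada’s implementation, which may differ in how hashes are chained, signed and stored.

```python
import hashlib

def fingerprint_segments(segments: list[bytes]) -> list[str]:
    """Hash each captured segment, chaining in the previous digest so reordering is also detectable."""
    digests = []
    prev = b""
    for seg in segments:
        h = hashlib.sha256(prev + seg).hexdigest()
        digests.append(h)
        prev = bytes.fromhex(h)
    return digests

# At capture time the camera records the fingerprints (ideally signed and stored separately).
captured = [b"frame-chunk-1", b"frame-chunk-2", b"frame-chunk-3"]
reference = fingerprint_segments(captured)

# At verification time, any edit to the footage changes the digests and the match fails.
tampered = [b"frame-chunk-1", b"edited-chunk-2", b"frame-chunk-3"]
print(fingerprint_segments(captured) == reference)   # True
print(fingerprint_segments(tampered) == reference)   # False
```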
Such a feature helps address growing concerns around generative AI, which makes it easier to fake or alter footage.
“We can say this video is real. It came out of one of our sensors and we have proof of when it was captured and how, or hey there is no match,” Kaliszan said.
For Kaliszan, adding privacy and verification capabilities aligns both with ethical imperatives and Verkada’s competitive strategy.
“It’s a win-win strategy for Verkada because on the one hand, you know, we’re doing what we believe is right for society,” he argued. “But it’s also very wise for us,” in terms of building customer trust and preference, he said.
Questions raised about protecting privacy While Kaliszan positioned Verkada’s new features as a step toward protecting privacy, civil society critics argue the changes do not go nearly far enough.
“If you’re doing it where it can be undone — you can undo it later — you’re still collecting that very intrusive information,” said Merve Hickok, president of the independent nonprofit Center for AI and Digital Policy.
Rather than merely blurring images temporarily, Hickok believes companies like Verkada should embrace a “privacy enhancing approach where you’re not collecting the data in the first place.” Once collected, even obscured footage enables tracking via “location data, license plate readers, heatmapping.” Hickok argued Verkada’s incremental changes reflect an imbalance of priorities. “The security capabilities are so good, so it’s like yeah, go ahead and collect it all, we’ll blur it for now,” she said. “But then the individual rights of the people walking around are not protected.” Without stronger regulations, Hickok believes we are on a “slippery slope” toward ubiquitous public surveillance. She advocated for legal prohibitions on “real time biometric identification systems in public spaces,” similar to those being debated in the European Union.
A collision of perspectives on ethics and tech Verkada finds itself at the center of these colliding perspectives on ethics and technology. On one side, Kaliszan aims to show security can be “privacy sensitive” through features like blurring.
On the other, civil society critics like Hickok question whether Verkada’s business model can ever fully align with individual rights.
The answer holds major implications not just for Verkada, but the broader security industry. As physical security transitions to the cloud, companies like Verkada are guiding thousands of organizations into new technological terrain. The choices they make today around data practices and defaults will ripple far into the future.
That power comes with obligation, Hickok argues. “We’re way closer to enabling the fully surveyed society than we are from a fully private and protected society,” she said. “So I think we do need to have that security measure but maybe the takeaway here is the companies just need to be very cogent.” For Verkada, cogency means advancing security while avoiding mass surveillance. “When all of it comes together, that privacy consideration further increases, right?” Kaliszan said. “And so thinking through how do we maintain privacy, how do we tie identity locally, doing the processing on the edge and not building a mass surveillance system.”
"
|
2,610 | 2,023 |
"These are the top VC locations in the USA for startups — and jobs | VentureBeat"
|
"https://venturebeat.com/programming-development/these-are-the-top-vc-locations-in-the-usa-for-startups-and-jobs"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Sponsored Jobs These are the top VC locations in the USA for startups — and jobs Share on Facebook Share on X Share on LinkedIn It will surprise few that San Francisco is deemed to have the most developed VC ecosystems in the world. That’s according to a new report issued by financial data and software company PitchBook , which also found that the U.S. and Asia account for 85% of the 20 most-developed VC ecosystems, and 65% of the 20 highest-growth VC ecosystems are in Europe or the U.S.
It’s no secret that securing VC funding has become increasingly challenging over the last 18 months. Startups and scaling tech companies have shed staff numbers en masse, as without the guarantee of fresh investment a renewed focus on profitability is seen across the board.
However, there are always locations that buck trends and investment seekers and job hunters in the U.S. would do well to set their sights on San Francisco and New York in particular.
Though a considerable volume and value of VC activity moves through both cities, New York attracted less than half the startup investment that San Francisco did over a six-year period (Q3 2017 – Q2 2023), securing $153.2bn in investment versus San Francisco’s $364.5bn.
Los Angeles ($123.1bn) and Boston ($99.2bn) also ranked highly in terms of development, and joined cities like Beijing, Shanghai, London, Shenzhen, Seoul and Hangzhou in the top ten. Washington DC, Seattle, Austin, San Diego and Denver made up the U.S. contingent of the top 20 most developed VC ecosystems.
Locations to watch The report also examined growth rates of activity. Detroit came second to Dubai in the growth rankings as the city emerged as a tech hub, while Raleigh and Houston completed the top five alongside Berlin.
Additionally Indianapolis, Miami, Philadelphia and St Louis also landed spots in the top 20.
Across overall scores for development and growth, North America nabbed 20 of the top 50 spots, followed by Asia (16), Europe (12) and the Middle East (1), making it the most attractive region for VC investment.
And of course, wherever funding is, jobs are. The VentureBeat Job Board is continuously updated with tech jobs across the U.S., with remote and hybrid opportunities also advertised. Take a look at these three roles and visit the job board for more.
Senior Software Engineer (Java), Account Management Platform (Hybrid), ThousandEyes, San Francisco Acquired by Cisco in 2020, ThousandEyes is a performance monitoring stack for remote and hybrid workers that monitors internet issues that disrupt worker productivity and impact revenue and brand reputation. The organization is seeking a Senior Software Engineer for its Account Management Platform team; you’ll be responsible for maintaining critical platform APIs such as user management, authentication and contract enforcement. Ideally the candidate will have over six years of software development experience, will be comfortable working with newer technologies and will have an expert-level understanding of object-oriented programming languages (Java, C++, etc).
See more here.
Solutions Engineer, Pryon, New York Having recently raised $100m in a funding round led by Thomas Tull’s U.S. Innovative Technology Fund, start-up Pryon is developing an AI-powered platform to analyze enterprise data. Currently seeking a Solutions Engineer who can balance technical expertise with strong interpersonal skills, this role is perfect for someone who is strong in AI/ML/NLP and software development, but also in client-facing situations. The successful candidate will lead technical activities, develop and scope solutions, create custom demos and lead customer trials. Preferred experience includes knowledge of development and deployment across multiple cloud providers such as Amazon Web Services, Microsoft Azure, Google Cloud, VMWare and OpenStack, and at least two years professional experience with Golang, Python, Javascript or other programming languages.
Find out more now.
Applications Engineer, Richmond American Homes, Denver Property developer Richmond American Homes, part of MDC Holdings, Inc., is advertising for a full-time E1 Applications Engineer with experience in the Oracle JDE EOne ERP environment. In this role, the successful candidate will design, develop and maintain applications and systems in line with best practices and standards. By collaborating with JDE EnterpriseOne functional consultants and development teams, this engineer will create integrated solutions that are scalable and adaptable. A bachelor’s degree in computer science, computer engineering, information systems or equivalent experience is ideal, as is 3+ years’ experience with the Oracle JD Edwards E1 development toolset.
Apply here.
Interested in joining a scaling start-up? Check out the VentureBeat Job Board today.
"
|
2,611 | 2,023 |
"How to ask for more money as a software developer | VentureBeat"
|
"https://venturebeat.com/programming-development/how-to-ask-for-more-money-as-a-software-developer"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Sponsored Jobs How to ask for more money as a software developer Share on Facebook Share on X Share on LinkedIn As tech redundancies continue to hit the headlines, it’s not exactly a straightforward time to ask for a salary increase. Throughout 2023, organizations of all sizes have streamlined products and services, cutting developer roles that were previously thought as untouchable.
If 2022 was defined by cuts in customer success, HR, product and talent acquisition, much of 2023 has been hugely focused on engineering teams.
Despite this, software developer roles continue to be in high demand, particularly for those with expertise in data science, AI, machine learning and blockchain technology. The value good software developers can bring is still unarguable.
Whether you’ve taken on more responsibility in a leaner team or you’re in an organization that is fully thriving, it could very well be the right time to ask for more compensation.
Here’s what to ask yourself before scheduling that meeting.
1. Does the organization have a standardized review policy? Some companies like to conduct reviews in a particular month of the year, while for others it’s centered around the time each individual employee started with the organization.
Ask HR if you’re uncertain, and be prepared in time for the designated month or your work anniversary. Asking for an early review will add to your manager’s workload and may not help your case.
2. Why do you deserve a salary increase? You don’t need a snappy 30-second pitch prepared, but do have a list of achievements and positive feedback ready to support your case.
Be able to talk about them conversationally, and if your manager is happy to recommend a salary raise for you, follow up with a thank-you email, and include these bullet points to make their job that bit easier.
3. What are your manager’s pain points? Speaking of making your manager’s job easier, if you meet hesitation or resistance in your review meeting, take this as an opportunity to probe further. You may discover the amazing achievements you’ve focused on are not a current business priority, and there may be an area where you can add more value.
If, say, you could automate a process that normally takes your manager a few hours a month, this could really help further your case and bolster goodwill overall.
4. Are you open to other perks and benefits in lieu of a cash increase? Depending on how your organization is doing, there may be a freeze on salary increases. Think of what perks and benefits you might settle for in lieu of cold hard cash. In start-ups and scale-ups, equity may cost your employer little to give now, but may pay dividends to you in the future.
Or perhaps unlimited PTO, a condensed work week, Summer Fridays or prepaid gift or credit cards may appeal.
5. Have you researched a benchmark? On the VentureBeat Job Board , you can search by job title, keywords and region. Use this to get a sense of what similar jobs are being offered elsewhere, and go in with a realistic and well-deserved figure.
If your efforts to land a salary increase are not rewarded, and you think it’s time to move on, visit the VentureBeat Job Board, bookmark it and check in weekly to see what’s being offered in your area of expertise.
Here are a few! Accounting Data, Reporting & Automation Manager Circle, Chicago, IL/remote Global financial technology firm Circle is seeking an exceptional Accounting Data, Reporting & Automation Manager to be a manager-level individual contributor. Experience in both data automation and accounting operations is required, and the successful candidate will partner with accounting and cross-functional team members to implement automated solutions as the business scales. Identifying data and reporting solutions to automate accounting reconciliation supporting month-end close, stablecoin activity reconciliation and key financial reports are the key responsibilities of this role. Seven-plus years of accounting experience in data analytics within an accounting department is outlined as important, as is experience with database design, an understanding of complex data schema and data extraction. This role is open to remote applicants.
Find out more here.
Software Architect, Rosen Group, Columbus, OH Founded in Switzerland in 1981, Rosen now has bases in Germany and Columbus. A research and product development company that’s privately family-owned, the group is now hiring a Software Architect.
Advertised as “hybrid remote,” the role comes with an expectation to live in Columbus, Ohio, as the successful applicant will be leading the Columbus Creation team with interaction across all North American locations. Creating a vision and solution architecture for the product is key to this role, as is designing for extensibility and solving targeted use cases with optimal technical approaches. Proven experience as a Software Solutions Architect is required, as is a bachelor’s degree in computer science, software engineering or a related field.
See more about this position here.
Solutions Engineer — AI GRC, Holistic AI, Remote Holistic AI helps organizations monitor and manage Governance Risk and Compliance (GRC) for AI, ensuring use of AI is ethical, responsible, and compliant with regulations. Currently experiencing huge growth, this innovative company is seeking a dynamic and talented Solutions Engineer.
This new role is crucial in the sales process, and the successful candidate will combine technical expertise with exceptional communication skills. Working side-by-side with sales, the successful candidate will deliver engaging demonstrations, engineer cutting-edge solutions, and serve as a technical advisor to clients.
Apply for this remote role here.
New jobs are added to the VentureBeat Job Board every week, browse opportunities here.
"
|
2,612 | 2,023 |
"Zoom's AI Companion can summarize meetings for late attendees | VentureBeat"
|
"https://venturebeat.com/virtual/zoom-launches-ai-companion-to-summarize-meetings-for-late-attendees"
|
"Artificial Intelligence View All AI, ML and Deep Learning Auto ML Data Labelling Synthetic Data Conversational AI NLP Text-to-Speech Security View All Data Security and Privacy Network Security and Privacy Software Security Computer Hardware Security Cloud and Data Storage Security Data Infrastructure View All Data Science Data Management Data Storage and Cloud Big Data and Analytics Data Networks Automation View All Industrial Automation Business Process Automation Development Automation Robotic Process Automation Test Automation Enterprise Analytics View All Business Intelligence Disaster Recovery Business Continuity Statistical Analysis Predictive Analysis More Data Decision Makers Virtual Communication Team Collaboration UCaaS Virtual Reality Collaboration Virtual Employee Experience Programming & Development Product Development Application Development Test Management Development Languages Zoom launches AI Companion to summarize meetings for late attendees Share on Facebook Share on X Share on LinkedIn Credit: Zoom Are you ready to bring more awareness to your brand? Consider becoming a sponsor for The AI Impact Tour. Learn more about the opportunities here.
Zoom, the videoconferencing and messaging platform, is getting into the built-in generative AI assistant game , according to a blog post on Tuesday. It said it will rebrand its Zoom IQ as the Zoom AI Companion and users already subscribed to the paid services will have access to the new features right away.
Mapping to the challenges found in a normal workday, Zoom says its “AI Companion” can equip users with contextual and useful intelligence.
What Zoom’s AI Companion offers According to the blog post , the AI assistant can help compose chat responses, saving time and allowing users to focus on important projects. If someone joins a meeting late, AI Companion can catch them up by summarizing the discussions. It can also answer specific questions about meeting content and automatically generate summaries, identify important information, and create next steps for attendees.
Zoom plans to expand AI Companion’s capabilities in the future. Users will be able to interact with the assistant through natural language queries and receive help with various tasks, such as meeting preparation, summarizing chat messages, consolidating meeting summaries, composing emails and finding relevant documents. Additionally, AI Companion will assist in filing support tickets, providing real-time information during meetings, and analyzing phone calls and messages.
As well, the post outlined some new details on how Zoom uses a “federated approach” to its AI model selection. It hopes the AI Companion will deliver improved results by “dynamically incorporating Zoom’s own large language model (LLM) in addition to Meta Llama 2, OpenAI, and Anthropic,” it said.
Because of this strategy, Zoom users avoid the challenge of picking the right model to enjoy the latest features and benefits.
AI assistant training data remains controversial The release of the AI assistant follows public outcry after Zoom updated its terms of services to include details on how it intends on using data collected to train the models which will power the automated processes. Originally, the terms were interpreted to say that Zoom could leverage the data generated by the people who use the platform and there would be no way to opt out.
In a blog post written to respond to the uproar, Zoom said that “as part of our commitment to transparency and user control, we are providing clarity on our approach to two essential aspects of our services: Zoom’s AI features and customer content sharing for product improvement purposes. Our goal is to enable Zoom account owners and administrators to have control over these features and decisions, and we’re here to shed light on how we do that.” Those points were reiterated in the post announcing the AI assistant: “Zoom does not use any of your audio, video, chat, screen sharing, attachments, or other communications like customer content (such as poll results, whiteboard, and reactions) to train Zoom’s or third-party artificial intelligence models,” it read.
Disabled by default In the initial release of the Zoom AI Companion, admins and account operators will be able to turn certain AI-enabled features on and off, and all capabilities will be disabled by default. Even when the features are turned on by account admins, hosts will have further fine-grained controls within each meeting. Participants in meetings will also get to see the status of the in-use AI tools.
Zoom is no stranger to controversies around the use of AI in its products. In April 2022, the company came under fire after saying it might soon include emotion AI features in its sales-targeted products. A nonprofit advocacy group, Fight for the Future, published an open letter to the company saying that Zoom’s possible offering would be a “major breach of user trust,” is “inherently biased,” and “a marketing gimmick.”
"