How To Implement MLSecOps In Your Organization

Forbes · 3 days ago

Neel Sendas is a Principal Technical Account Manager at Amazon Web Services (AWS).
As a cloud operations professional focused on machine learning (ML), I help organizations understand the security challenges of ML systems and develop strategies to mitigate risks throughout the ML lifecycle.
One of the key aspects of solving these challenges is machine learning security operations (MLSecOps), a framework that helps organizations integrate security practices into their ML development, deployment and maintenance.
Let's look at ML systems' unique security challenges and how MLSecOps can help to address them.
Understanding vulnerabilities and implementing robust security measures throughout the ML lifecycle is crucial for maintaining system reliability and performance.
For instance, data poisoning, adversarial attacks and transfer learning attacks pose critical security risks to ML systems. Cornell research shows that data poisoning can degrade model accuracy by up to 27% in image recognition and 22% in fraud detection. Likewise, adversarial attacks (subtle modifications to inputs at inference time) can cause a model to misclassify them entirely. Transfer learning attacks exploit pre-trained models, allowing an attacker to substitute a malicious model during fine-tuning.
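To make the adversarial-attack risk concrete, here is a minimal sketch of the fast gradient sign method, one of the simplest ways an attacker can craft a small, bounded input change that flips a model's prediction. It assumes PyTorch and a placeholder differentiable classifier named `model`; it is an illustration, not a description of any specific production attack.

```python
import torch
import torch.nn.functional as F

def fgsm_perturb(model, x, y, epsilon=0.03):
    """Return an adversarially perturbed copy of the input batch x."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    # Nudge each input in the direction that increases the loss, bounded by epsilon.
    return (x_adv + epsilon * x_adv.grad.sign()).detach()

# Usage (hypothetical): compare model(x) with model(fgsm_perturb(model, x, y))
# to see how a tiny, bounded perturbation can change predictions.
```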
MLSecOps—which relies on effective collaboration between security teams, engineers and data scientists—is an important aspect of addressing these evolving challenges.
This framework protects models, data and infrastructure by implementing security at every stage of the ML lifecycle. The foundation includes threat modeling, data security and secure coding, and it is enhanced by techniques like protected model integration, secure deployment, continuous monitoring, anomaly detection and incident response.
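As one illustration of what protected model integration and secure deployment can look like in practice, the sketch below verifies a model artifact against a trusted registry record before it is allowed into serving. The file path and expected digest are hypothetical placeholders.

```python
import hashlib
import hmac
from pathlib import Path

def verify_model_artifact(path: str, expected_sha256: str) -> bool:
    """Compare the artifact's SHA-256 digest against the trusted registry record."""
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    return hmac.compare_digest(digest, expected_sha256)

# Refuse to promote a model whose digest does not match the registry entry.
# Both values below are hypothetical placeholders.
if not verify_model_artifact("models/fraud-v3.pt", expected_sha256="<digest from registry>"):
    raise RuntimeError("Model artifact failed integrity check; aborting deployment")
```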
Implementing MLSecOps effectively requires a systematic approach to ensure comprehensive security throughout the ML lifecycle.
The process begins with assessing the security needs of ML systems, which involves identifying data sources and infrastructure, evaluating potential risks and threats, conducting thorough risk assessments and defining clear security objectives.
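One lightweight way to make that assessment reviewable is to keep a simple risk register alongside the ML code. The sketch below is illustrative only; every asset, threat and score is a hypothetical example.

```python
from dataclasses import dataclass

@dataclass
class MLRiskEntry:
    asset: str         # data source, model or infrastructure component
    threat: str        # e.g. data poisoning, model theft, adversarial inputs
    likelihood: int    # 1 (rare) to 5 (frequent)
    impact: int        # 1 (negligible) to 5 (severe)
    mitigation: str    # existing or planned control

    @property
    def risk_score(self) -> int:
        return self.likelihood * self.impact

register = [
    MLRiskEntry("training-data pipeline", "data poisoning", 3, 4,
                "provenance checks and outlier screening"),
    MLRiskEntry("public inference API", "adversarial inputs", 4, 3,
                "input validation and rate limiting"),
]

# Review the highest-scoring risks first.
for entry in sorted(register, key=lambda e: e.risk_score, reverse=True):
    print(entry.asset, entry.threat, entry.risk_score)
```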
Working with large organizations, I've found that incorporating MLSecOps into an organization's existing security practices and tools can be complex, requiring a deep understanding of both traditional cybersecurity practices and ML-specific security considerations.
Additionally, certain industries and jurisdictions have specific regulations and guidelines regarding the use of AI and ML systems, particularly in areas like finance, healthcare and criminal justice. Understanding these regulations and ensuring compliance may be challenging for those without MLSecOps expertise.
Next, you'll need to establish a cross-functional security team that combines data scientists, ML engineers and security experts.
Once the team has been established, define comprehensive policies and procedures, including security policies, incident response procedures and clear documentation and communication guidelines. Implementing such policies can be challenging, as it requires orchestrating various teams with diverse expertise and aligning their efforts toward a common goal.
To address this challenge, organizations can develop a clear governance model that outlines the roles, responsibilities, decision-making processes and communication channels for all parties involved. This governance framework should be regularly reviewed and updated as necessary.
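Part of that governance model can be encoded so tooling enforces it rather than relying on people remembering it. The sketch below shows one possible shape; the role names and thresholds are hypothetical and would come from your own policies.

```python
# Roles and thresholds are hypothetical; adapt them to your governance model.
REQUIRED_APPROVALS = {"security", "data-science", "ml-engineering"}
HIGH_RISK_THRESHOLD = 12

def deployment_allowed(approvals: set[str], risk_score: int) -> bool:
    """High-risk models need sign-off from every role; others need security review only."""
    if risk_score >= HIGH_RISK_THRESHOLD:
        return REQUIRED_APPROVALS.issubset(approvals)
    return "security" in approvals

print(deployment_allowed({"security", "data-science"}, risk_score=15))  # False: missing ml-engineering
```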
I recommend that the team then take a step back and adapt MLSecOps to the organization's specific needs.
One way to do this is to study the five pillars of MLSecOps laid out by Ian Swanson in a Forbes Technology Council article (supply chain vulnerability, ML model provenance, model governance and compliance, trusted AI and adversarial ML) and assess how each will affect your organization.
Once you've understood those specific processes, ensure that you've built a secure development lifecycle through integrated measures and secure coding.
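One concrete secure-coding control in an ML development lifecycle is to treat serialized models as untrusted input. The sketch below, in which the allowlist and file name are assumptions, inspects a pickle's imports before anything is deserialized.

```python
import pickletools

ALLOWED_PREFIXES = ("numpy", "sklearn", "collections")  # assumption: tune per project

def suspicious_imports(path: str) -> list[str]:
    """List pickle imports outside the allowlist (empty list means nothing was flagged)."""
    with open(path, "rb") as f:
        data = f.read()
    flagged = []
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":              # "module name" pair (protocols 0-2)
            module = str(arg).split()[0]
            if not module.startswith(ALLOWED_PREFIXES):
                flagged.append(str(arg))
        elif opcode.name == "STACK_GLOBAL":      # protocol 4: conservatively flag for review
            flagged.append("STACK_GLOBAL (manual review)")
    return flagged

if suspicious_imports("model.pkl"):              # hypothetical artifact name
    raise RuntimeError("Untrusted imports found; refusing to load model")
```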
Security monitoring and response activities are also essential: deploy monitoring tools and incident response plans to watch ML workloads and detect threats. Beyond tooling, companies like Netflix use "chaos engineering," deliberately injecting failures into production to validate security controls and the effectiveness of incident response.
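As a minimal sketch of the monitoring side, a population stability index (PSI) check on prediction scores can flag the kind of distribution shift that data poisoning or upstream tampering tends to produce. The metric choice, the 0.2 threshold and the synthetic data are illustrative assumptions, not prescriptions.

```python
import numpy as np

def population_stability_index(baseline, live, bins=10):
    """PSI between a baseline score distribution and live production scores."""
    edges = np.histogram_bin_edges(baseline, bins=bins)
    base_pct = np.histogram(baseline, bins=edges)[0] / len(baseline)
    live_pct = np.histogram(live, bins=edges)[0] / len(live)
    base_pct = np.clip(base_pct, 1e-6, None)   # avoid log(0)
    live_pct = np.clip(live_pct, 1e-6, None)
    return float(np.sum((live_pct - base_pct) * np.log(live_pct / base_pct)))

# Synthetic stand-ins for training-time and production score samples.
rng = np.random.default_rng(0)
baseline_scores = rng.normal(0.30, 0.10, 5_000)
live_scores = rng.normal(0.45, 0.10, 5_000)

psi = population_stability_index(baseline_scores, live_scores)
if psi > 0.2:   # common rule-of-thumb threshold for a significant shift
    print(f"ALERT: prediction drift detected (PSI={psi:.2f})")
```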
Regular audits and assessments will be crucial, but implementing employee training on risks and vigilant practices is one of the most important tasks—and one of the most difficult to achieve.
Securing buy-in for training programs from non-security stakeholders is often complicated. To overcome this, I've found that collaboration with leadership can help position security training as a strategic, shared responsibility. I've also found that tailoring programs to specific roles can make the training more relevant and engaging.
In my experience, implementing comprehensive security throughout the ML lifecycle requires a combination of strategic planning, collaboration across teams, continuous learning and adaptation, and a strong focus on building a security-conscious culture.
Organizations seeking comprehensive MLOps security can achieve end-to-end protection by following established security best practices. This approach safeguards against various threats, including data poisoning, injection attacks, adversarial attacks and model inversion attempts.
As ML applications grow more complex, continuous monitoring and proactive security measures become crucial. This robust security framework enables organizations to scale their ML operations confidently while protecting assets and accelerating growth.

