
Latest news with #ChrisCox

Apiture Introduces Fintech Connector to Accelerate Innovation for Community Banks and Credit Unions

Business Wire

05-06-2025

  • Business

WILMINGTON, N.C.--(BUSINESS WIRE)-- Apiture, a leading provider of digital banking solutions, today announced the launch of Fintech Connector, enabling fintech partners to rapidly integrate their solutions with the Apiture Digital Banking Platform without the need for custom integrations.

In today's dynamic banking landscape, financial institutions are looking to harness fintech innovation to meet the evolving expectations of their account holders. With Fintech Connector, fintech partners can integrate directly with the Apiture Consumer Banking and Business Banking solutions, enabling Apiture clients to select and deploy new services that align with the needs of their account holders and their institution's strategy. Likewise, Fintech Connector empowers financial institutions to deploy features they have built in-house directly into the Apiture Digital Banking Platform. Fintechs taking advantage of Fintech Connector benefit from rapid deployment through a single integration point as well as exposure to hundreds of Apiture community bank and credit union clients.

The first partner to integrate with the Apiture platform through Fintech Connector is InvestiFi, a digital investing provider that enables account holders to buy and sell investments directly from their checking accounts. The fintech eliminates the need to move money to external parties to invest, helping banks and credit unions keep deposits within their institutions. Apiture clients can now access InvestiFi directly from Fintech Connector.

'InvestiFi couldn't be more excited to be the first fintech integrated to Apiture through its new Fintech Connector program,' said InvestiFi CEO Kian Sarreshteh. 'With InvestiFi's patent pending Investing from Checking solution now integrated into Apiture's online and mobile banking solutions, Apiture's extensive ecosystem of banks and credit unions can drive new streams of non-interest income, deposit growth, and digital engagement. It's been inspiring watching our teams collaborate to get an innovative product to market, and we are thrilled to have Apiture as a strategic partner for InvestiFi.'

'At Apiture, we pride ourselves on our ability to quickly deliver meaningful innovation to our clients,' said Apiture Chief Operating Officer Chris Cox. 'Fintech Connector complements our ongoing development efforts, providing a new way for banks and credit unions to tailor the digital banking experience to meet the needs of their communities. We are excited to offer this integration option that benefits both fintechs and financial institutions alike.'

Apiture's Fintech Connector is now live and open to new fintech partnerships.

About Apiture

Apiture delivers award-winning digital banking solutions to banks and credit unions throughout the U.S. Our flexible, highly configurable solutions meet a wide range of financial institutions' needs, from leveling the playing field with larger institutions to supporting growth through innovative data intelligence and embedded banking strategies. With our API-first approach, our clients can maximize the capabilities of their platform while preserving a seamless user experience. Our exclusive focus on digital banking, and a team with hundreds of years of collective experience working at U.S. financial institutions, means we are dedicated to meeting the unique needs of our clients while providing a level of support that is unmatched in the industry. Apiture is headquartered in Wilmington, North Carolina. To learn more, visit

About InvestiFi

InvestiFi, Inc. is the only InvestTech Platform designed to allow for trading to and from deposit accounts, enabling credit unions and community banks to retain more assets and attract new account holders. Through its exclusive funds flow and user-friendly interface, InvestiFi empowers every credit union and community bank to provide their account holders with the ability to navigate the complexities of financial markets with ease from within their current online banking experience. At the heart of InvestiFi's mission is the goal of democratizing investing and supporting community financial institutions, ensuring that wealth-building opportunities are accessible to everyone.

Facebook parent Meta divides its AI teams, chief product officer explains change in internal memo: 'Our new structure aims to...'

Time of India

28-05-2025

  • Business

Meta is reportedly reorganising its artificial intelligence (AI) teams to expedite the development and deployment of new products and features. The internal restructuring, announced in a memo by chief product officer Chris Cox, aims to enhance the company's competitiveness in the evolving AI space, where it faces significant rivalry from ChatGPT-maker OpenAI, Google and Microsoft.

According to a report by Axios, an internal memo sent by Cox on Tuesday (May 27) detailed the new organizational framework. As per the memo, there will be two distinct units: an AI Products team and an AGI Foundations unit. While Connor Hayes will lead the AI Products team, the AGI Foundations unit will be co-led by Ahmad Al-Dahle and Amir Frenkel.

"Our new structure aims to give each org more ownership while minimizing (but making explicit) team dependencies," Cox stated.

The move echoes a previous AI team reshuffle Meta conducted in 2023, which also aimed to expedite development.

How new AI teams at Meta will work

The AI Products team at Meta will be responsible for the Meta AI assistant, Meta's AI Studio, and the integration of AI features across core Meta platforms including Facebook, Instagram and WhatsApp. The AGI Foundations unit will work on a range of underlying technologies, including the company's Llama models, alongside initiatives to enhance AI capabilities in reasoning, multimedia and voice.

Meta's AI research unit, known as FAIR (Fundamental AI Research), will reportedly maintain its independent status outside this new structure. However, a specific team within FAIR focusing on multimedia will transition to the new AGI Foundations team. Company executives confirmed to the publication that no departures or job cuts are associated with these changes, though some leaders from other divisions of Meta have been integrated into the new AI structure.

Exclusive: Meta shuffles AI team to compete with OpenAI and Google

Axios

27-05-2025

  • Business

Meta is restructuring its AI teams to speed up the rollout of new products and features, Axios has learned.

Why it matters: Meta faces stiff competition in the AI race, including from OpenAI and Google as well as Chinese rivals such as TikTok parent ByteDance.

Driving the news: In an internal memo sent Tuesday and seen by Axios, chief product officer Chris Cox laid out the new structure, which will see efforts divided into two teams: an AI products team, headed by Connor Hayes, and an AGI Foundations unit, co-led by Ahmad Al-Dahle and Amir Frenkel. The AI products team will be responsible for the Meta AI assistant, Meta's AI Studio and AI features within Facebook, Instagram and WhatsApp. The AGI Foundations unit will cover a range of technologies, including the company's Llama models, as well as efforts to improve capabilities in reasoning, multimedia and voice. The company's AI research unit, known as FAIR (Fundamental AI Research), remains separate from the new organizational structure, though one specific team working on multimedia is moving to the new AGI Foundations team.

Between the lines: No executives are leaving as part of the changes, nor are any jobs being cut, though the company has moved in some leaders from other parts of the company. Meta hopes that splitting a single large organization into smaller teams will speed product development and give the company more flexibility as it adds additional technical leaders. The company is also seeing key talent depart, including to French rival Mistral, as reported by Business Insider.

What they're saying: "Our new structure aims to give each org more ownership while minimizing (but making explicit) team dependencies," Cox said in the memo.

Moment migrant dinghy trying to cross English Channel diverts Dunkirk flotilla of 'Little Ships' as they commemorated legendary WWII evacuation

Daily Mail

22-05-2025

  • General

This is the moment a migrant dinghy forced a flotilla of 'Little Ships' commemorating the legendary Dunkirk evacuation to make way - flanked by a French navy boat.

Commemorations of the 85th anniversary of the heroic effort to save British, French and other Allied soldiers from incoming German troops kicked off yesterday - but the pleasure cruisers and speedboats had to make way for an unlikely guest. Their 45-mile trip across the English Channel, recreating the noble effort of May 1940, was disrupted by a demand from Border Force and the French navy to create a one-mile exclusion zone through which a migrant boat could pass. Images show the dinghy, packed with people who appeared to be wearing lifejackets, as they were closely followed by a smaller craft and what appeared to be a hulking Loire-class French navy vessel.

Chris Cox, coordinator of the flotilla event commemorating Operation Dynamo, said of the unexpected interruption: 'There was a migrant boat in the water that was being covered by a French naval vessel. As is good proportion, we steered clear and let the authorities look after it. For the people in the small boat, they have never done this before and they don't know what to expect. The last thing you want them to do is to try and make for a pleasure boat or Dunkirk Little Ship, which would not be good.'

A total of 13 boats carrying 825 migrants made the treacherous journey across the Channel on May 21 - with at least two people dying on one of the crossings. The Operation Dynamo recreation, however, was unhindered on its journey from Ramsgate to Dunkirk, save for the slight interruption from border officials and the French. Mr Cox added that it had been a 'perfect day' for a journey that 'couldn't have been smoother'. He added: 'Churchill asked the people to pray for calm conditions, and I think somebody must have been praying this week for us.'

The fleet of 66 vessels set sail from Ramsgate, Kent, at 6am on Wednesday before happening on the new arrivals. The Telegraph reported that sailors were told in a maritime frequency message: 'There is a (French) warship on our head with a migrant (boat) close by. And we've been requested to give one nautical mile distance from that vessel, over.' A French-accented voice, believed to be from the French naval vessel Oyapock, then replied: 'Thank you, sir. Thank you very much.' One observer later said: 'It is one of the most important days in history and they are shoving them out of the way.'

The Association of Dunkirk Little Ships organised the flotilla - depicted in Christopher Nolan's epic 2017 film - to ensure 'the legacy of the Dunkirk little ships continues to inspire future generations'.

It came as two migrants, believed to be a woman and a child, tragically died in the Channel, as the total to have reached Britain since Labour came to power passed 36,000. The dead were pulled from waters off the Calais coast by the French navy after an overloaded dinghy got into difficulties. French officials said most of the rest of the migrants aboard the inflatable refused rescue and carried on to the UK. Refugee charity Utopia 56 said it alerted emergency services to the tragedy, writing on X: '"The boat is broken, two people are dead." This is the information we received during a distress call in the English Channel this morning.'

The Home Office confirmed there were 825 arrivals on Wednesday, bringing the total since the general election to 36,811 - a year-on-year increase of 37 per cent. The total includes 13,569 since the start of this year.

The latest tragedy unfolded in the early hours, a spokesman for France's Maritime prefecture said. The dinghy, which had 80 people aboard, was designed to carry no more than 20, it is understood. The unidentified migrants were 'pulled out by a Navy vessel' and sailors 'performed first aid on the two victims, but they were soon declared dead'. Ten other passengers requested rescue, while about 70 others asked to remain aboard the inflatable, which continued its journey towards the English coast. A French Navy boat and a helicopter with a medical team on board provided emergency cover.

Prosecutors in France have started an enquiry into the fatalities, while judicial police are searching for the people smugglers who arranged the crossing. In total, some 17 people have perished on small boats so far this year. A woman died on Sunday night after a dinghy broke up off the French coast. Last year saw a record 78 deaths.

I attended LlamaCon, Meta's first event for AI developers. It was 'kinda mid.'

Business Insider

02-05-2025

  • Business

On the manicured lawns outside Building 21 on Meta's sprawling Menlo Park headquarters, live llamas meandered with languid indifference, drawing clusters of developers who momentarily abandoned technical discussions for selfies with the stoic, woolly ambassadors of Meta's family of large language models.

Inside Building 21, I shivered. The cavernous auditorium's air conditioning was cranked up high. Mood lighting bathed the space in Meta's signature blue shade, and dance music blasted from speakers, lending a nightclub ambiance to the event that clashed oddly with the earnest, tech-focused agenda. "Rise and shine!" a Meta PR person chirped as I took a seat.

This was LlamaCon, Meta's first-ever conference for AI developers. Its timing felt oddly defensive. Earlier this year, DeepSeek, an open-source AI model from China that delivered groundbreaking performance with computational efficiency, had much of Silicon Valley, including Meta's AI division, panicked. Around the same time, Meta announced that it would spend $65 billion in 2025 to build out AI infrastructure. Weeks after that, the company released Llama 4, the latest version of its LLM family. Mark Zuckerberg called it "the beginning of a new era for the Llama ecosystem." Almost immediately after, Meta was accused of artificially inflating Llama models' performance benchmarks, a claim that executives pushed back against.

LlamaCon, I thought, was Meta's moment to reclaim trust and clarify its AI strategy. Onstage, Meta's Chief Product Officer Chris Cox framed the company's open source strategy as principled rather than reactive: "We were a startup once, too," he said in the keynote. "We built this place on open source." The subtext was clear: Meta wants developers to see Llama as their path to autonomy and flexibility in an increasingly closed AI ecosystem dominated by offerings from OpenAI, Microsoft, and Google.

Llama knelt to competitors

LlamaCon featured several announcements, including the launch of a new Llama API that Meta says will make it easy for developers to integrate its models using familiar tools and interfaces. Some tasks will be possible with just a few lines of code. Meta also announced partnerships with companies to make AI run faster; a security program with AT&T and others to fight AI-generated scams; and $1.5 million in grants to startups and universities around the world using Llama.

Conspicuously absent, however, was what many developers had actually come hoping to see: a new reasoning model to compete with what has rapidly become table stakes in the AI industry, including in Chinese open-source alternatives like DeepSeek and Alibaba's Qwen.

In a conversation with Databricks CEO Ali Ghodsi, Zuckerberg seemed to tacitly acknowledge these shortcomings. "Part of the value around open source is that you can mix and match," he said. "If another model, like DeepSeek, is better, or if Qwen is better at something, then, as developers, you have the ability to take the best parts of the intelligence from different models. This is part of how I think open source basically passes in quality all the closed source [models]…[It] feels like sort of an unstoppable force."

Vineeth Sai Varikuntla, a developer working on medical AI applications, echoed this sentiment when I spoke with him after the keynote. "It would be exciting if they were beating Qwen and DeepSeek," he said. "I think they will come out with a model soon. But right now the model that they have should be on par—" he paused, reconsidering, "Qwen is ahead, way ahead of what they are doing in general use cases and reasoning."

Missing model improvements

The online reaction to LlamaCon reflected similar disappointment across developer communities. On Reddit's r/LocalLLaMA, the top post was titled "No new models in LlamaCon announced." Users compared Meta unfavorably to Qwen 3, which Alibaba strategically released just one day before Meta's event. "Good lord. Llama went from competitively good Open Source to just so far behind the race that I'm beginning to think Qwen and DeepSeek can't even see it in their rear view mirror anymore," wrote one user. Others debated whether Meta had planned to release a reasoning model but pulled back after seeing Qwen's performance.

On Hacker News, a popular forum for developers and tech industry professionals, some criticized the event's focus on API services and partnerships rather than model improvements as "super shallow." And one user on Threads summed up the event simply as "kinda mid." When I asked Meta how they measured the success of the event, they declined to comment.

"It did seem like a bit of a marketing push for Llama," Mahesh Sathiamoorthy, cofounder of Bespoke Labs, a Mountain View-based startup that creates AI tools for data curation and training LLMs, told me. "They wanted to cast a wider net and appeal to enterprises, but I think the technical community was looking for more substantial model improvements."

Still, LlamaCon won praise from Wall Street analysts tracking the company's AI strategy. "LlamaCon was one giant flex of Meta's ambitions and successes with AI," Mike Proulx of Forrester told me. Jefferies analyst Brent Thill called Meta's announcement at the event "a big step forward" toward becoming a "hyperscaler," a term referring to large cloud service providers that offer computing resources and infrastructure to businesses.

Some developers using Llama models were equally enthusiastic about the technology's benefits. For Yevhenii Petrenko of Tavus, which creates AI-powered conversational videos, Llama's speed was crucial. "We really care about very low latency, like very fast response, and Llama helps us use other LLMs," he told me after the event. Hanzla Ramey, CTO of WriteSea, an AI-powered career services platform that helps job seekers prepare résumés and practice interviews, highlighted Llama's cost-effectiveness: "For us, cost is huge," he told me. "We are a startup, so controlling expenses is really important. If we go with closed source, we can't process millions of jobs. No way."

The future's form and function

Toward the end of the day, Zuckerberg joined Microsoft CEO Satya Nadella onstage for a wide-ranging chat about AI's future. One comment stood out. Llama 4, Zuckerberg explained, had been designed around Meta's preferred infrastructure — the H100 GPU, which shaped its architecture and scale. But he acknowledged that "a lot of the open source community wants even smaller models." Developers "just need things in different shapes," he said. "To be able to basically take whatever intelligence you have from bigger models," he added, "and distill them into whatever form factor you want — to be able to run on your laptop, on your phone, on whatever the thing is…to me, this is one of the most important things."

It was a candid admission. For all the pageantry, LlamaCon wasn't a coronation. It was Meta still mid-pivot, trying to convince developers — and maybe itself — that it can build not just models, but momentum.
