
Latest news with #interoperability

This protocol aims to unify the fragmentation of Web3 and unlock Bitcoin DeFi for every chain

Crypto Insight

3 days ago

  • Business


Communication protocol GVNR is preparing to launch its GVNR token following the successful deployment of its proof-of-concept applications that unify Web3.

Navigating crypto in 2025 feels like flying with three different passports: one for Ethereum, another for Bitcoin and a third for a favorite sidechain. Every hop between wallets introduces extra clicks, swap fees, fresh attack surfaces and, most worryingly, the risk of unexpected tax events. The result is a fragmented, clunky experience that frustrates users and keeps newcomers out.

GVNR, a foundational general message passing layer for Web3, believes this detouring is a relic of the early internet era. After 18 months of building, the team is rolling out a universal routing layer that lets any smart contract communicate with any other chain as naturally as web pages link to each other. The project's goal is simple: treat every blockchain as one cohesive runtime. Its protocol passes signed messages among chains, so developers don't need wrapped assets, bridges or custodial middlemen. Instead of constantly switching networks and wallets, users make one move, and GVNR handles the rest behind the scenes. Avoiding bridges and wrapped tokens can also eliminate tax risks; few users realize that bridging assets can trigger a tax liability.

Putting theory into practice

Putting this vision into action, GVNR has already launched three live proof-of-concept apps that show what seamless interoperability can look like:

  • GVNR Portfolio: A dashboard where users and their AI agents can view and control tokens scattered across every connected chain from a single interface.
  • JustPay: A checkout layer that unlocks $500 billion of asset value, letting users spend any token on any chain to pay an invoice on a different chain. For example, an Arbitrum bill can be settled with Bitcoin, or a Solana mint can be covered with USDC on Polygon in a single click.
  • JustSwap: An aggregation layer for decentralized exchanges (DEXs) that lets traders swap tokens on any chain for any other asset across connected ecosystems, plus a swap-and-send function so users can gas new wallets with a single action.

GVNR has already processed more than $450,000 onchain, with over 26,000 users executing more than 60,000 swaps, minting over 35,000 non-fungible tokens (NFTs) across ten chains and logging over 143,000 transactions in total. Each interaction is a live demonstration that GVNR messages can shepherd value anywhere liquidity is needed.

The engine of the ecosystem

With its core technology demonstrated, the project is now centered on the launch of its native token, GVNR. The token is designed as a multifaceted utility asset that powers the entire network: beyond its role in the protocol's decentralized governance, it will also be used for staking and payments. A key aspect of the token's design is its planned integration with the growing network of AI agents, which will be able to use GVNR to complete onchain actions. With a capped supply of 20 million, the token is now available to the public through a sale on Republic.

The GVNR token empowers holders with governance rights through the GVNR DAO. Unlike many projects, there is no entity with "labs" in its name that owns the intellectual property; the decentralized autonomous organization's (DAO) sole purpose is to steer the protocol and drive value back to the token. This is reinforced by a deflationary furnace mechanism, which uses network fees to permanently reduce the token supply, aligning network growth directly with holder value and serving the ultimate vision of mobilizing a new era of crosschain liquidity.

As foundational routing layers like GVNR mature, they abstract away the complexity of the underlying blockchains. With such developments, the industry is gradually shifting from a collection of siloed networks toward a more unified landscape where digital value can move as freely as information, paving the way for a more intuitive and interconnected user experience.

What's next?

Looking ahead, GVNR envisions a new permissionless era for Bitcoin. The team is building a permissionless Bitcoin DeFi loan product named Diamond Hands. Other assets, such as ETH and SOL, have had access to loan products since DeFi began, but Bitcoin has been left behind, forced into wrapping, bridging and, worse, centralized entities. Bridging and wrapping incur tax events, and centralized entities carry default risk; GVNR Diamond Hands will enable non-custodial, native Bitcoin DeFi loans.
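To make the general message passing pattern described above more concrete, here is a minimal, hypothetical sketch of what a routed crosschain call could look like. The interface names, fields and the payInvoiceAcrossChains helper are illustrative assumptions only; GVNR's actual SDK and message format are not documented in the article.

```typescript
// Hypothetical sketch of a general message passing (GMP) call.
// None of these names come from GVNR's SDK; they only illustrate the pattern of
// sending one signed, routed message instead of bridging or wrapping assets.

interface CrosschainMessage {
  sourceChain: string;       // e.g. "bitcoin"
  destinationChain: string;  // e.g. "arbitrum"
  payload: string;           // serialized call data for the destination chain
  nonce: number;             // replay protection
}

interface SignedMessage extends CrosschainMessage {
  signature: string;         // sender's signature over the message
}

// A router abstracts "any contract talks to any chain": the caller hands it a
// signed message and the routing layer handles delivery and settlement.
interface CrosschainRouter {
  send(message: SignedMessage): Promise<string>; // returns a message id / receipt
}

// Example: settle an Arbitrum invoice from a Bitcoin balance, expressed as a
// single routed message rather than bridge-then-swap-then-pay.
async function payInvoiceAcrossChains(
  router: CrosschainRouter,
  signer: (msg: CrosschainMessage) => Promise<string>,
): Promise<string> {
  const message: CrosschainMessage = {
    sourceChain: "bitcoin",
    destinationChain: "arbitrum",
    payload: JSON.stringify({ invoiceId: "INV-001", amount: "0.001 BTC" }),
    nonce: Date.now(),
  };
  const signature = await signer(message);
  return router.send({ ...message, signature });
}
```

The point of the pattern is that the user signs a single message and hands it to the routing layer, rather than managing wrapped assets and each hop manually.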

NVIDIA (NasdaqGS:NVDA) Collaborates With Tech Soft 3D And Trend Micro For AI Solutions

Yahoo

3 days ago

  • Business


NVIDIA recently announced a collaboration with Tech Soft 3D and a partnership with Dell Technologies and Trend Micro, focusing on enhancing interoperability and AI-powered cybersecurity solutions, respectively. These strategic moves likely supported the company's notable 23% price increase over the last quarter. Additional factors, such as the company's Q1 earnings report, which revealed significant revenue and net income growth, may also have bolstered this trend despite a broadly flat market. NVIDIA's proactive expansion in AI and digital innovation aligns with industry growth forecasts, contributing positively to its market performance.

Every company has risks, and we've spotted 1 possible red flag for NVIDIA you should know about. The best AI stocks today may lie beyond giants like Nvidia and Microsoft. Find the next big opportunity with these 27 smaller AI-focused companies with strong growth potential through early-stage innovation in machine learning, automation and data intelligence that could fund your retirement.

The collaborations NVIDIA announced, focused on AI-powered cybersecurity and interoperability solutions, could substantially impact the company's future revenue and earnings potential. These partnerships aim to expand NVIDIA's presence in the cybersecurity and AI sectors, aligning with trends that support growth in data center and AI workloads. The quarterly price increase of 23% reflects these strategic alliances and adds to the company's robust performance over the past five years, during which NVIDIA's shares delivered very large total returns and exhibited phenomenal growth, outpacing much of the broader market. Over the past year, NVIDIA's returns also contrasted with the broader US market, which saw a more modest 9.9% gain.

Analysts anticipate that the partnerships with Tech Soft 3D and Dell Technologies, combined with NVIDIA's expansion into the automotive sector through alliances with Toyota and Uber, will positively influence revenue and earnings forecasts. With revenue at US$148.52 billion and earnings at US$76.77 billion, the projected growth trends appear promising. As analysts predict further growth, the current share price indicates expectations of additional appreciation: based on the consensus analyst price target of US$172.65, the share price trades at a discount, highlighting potential upside. This reflects optimism about the anticipated financial performance, driven by NVIDIA's strategic initiatives and continued innovation across its key sectors. Our valuation report unveils the possibility that NVIDIA's shares may be trading at a premium.

This article by Simply Wall St is general in nature. We provide commentary based on historical data and analyst forecasts only, using an unbiased methodology, and our articles are not intended to be financial advice. It does not constitute a recommendation to buy or sell any stock, and it does not take account of your objectives or your financial situation. We aim to bring you long-term focused analysis driven by fundamental data. Note that our analysis may not factor in the latest price-sensitive company announcements or qualitative material. Simply Wall St has no position in any stocks mentioned. Companies discussed in this article include NasdaqGS:NVDA. This article was originally published by Simply Wall St.
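As a purely illustrative aside on the "discount to the price target" point: the upside implied by an analyst target depends on the current share price, which the article does not state. The sketch below uses a hypothetical current price solely to show the arithmetic; only the US$172.65 target comes from the article.

```typescript
// Illustration only: implied upside if the share price rose to the analyst target.
// The current price below is a hypothetical placeholder, not a quoted market price.

function impliedUpside(currentPrice: number, priceTarget: number): number {
  // Fractional gain from current price to the target price.
  return (priceTarget - currentPrice) / currentPrice;
}

const consensusTarget = 172.65;  // consensus analyst price target (US$), per the article
const hypotheticalPrice = 140.0; // assumed current price, for illustration only

const upside = impliedUpside(hypotheticalPrice, consensusTarget);
console.log(`Implied upside: ${(upside * 100).toFixed(1)}%`); // ≈ 23.3% under this assumption
```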

Is Now an Exciting Time for European Instant Payments Progress?

Finextra

3 days ago

  • Business


While attending EBAday 2025 in Paris, Sheri Brandon, Global Head of New Business at Worldline, joined the FinextraTV studio to talk about how instant payments have evolved over the last year. Describing the European payments landscape as an exciting one, Brandon explained how progress is accelerating, particularly on interoperability in the face of SEPA deadlines. She also shared her predictions for future developments and her advice on staying ahead of rising fraud threats.

Cross-Border Fragmentation: To Innovate Without Risk, It Must Be Interoperable

Finextra

4 days ago

  • Business


Discussing interoperability in the face of fragmentation within cross-border operations, Susana Delgado, Global Head of Market Intelligence & Engagement at Swift, joined the FinextraTV studio at EBAday 2025. Citing a collaborative study by Swift and The Economist, she noted that fragmentation could lead to higher costs and lower global GDP. While acknowledging that diverse approaches can foster innovation, Delgado emphasized that interoperability is critical to managing the associated risks and ensuring sustainable progress.

The Data Tsunami In Today's Interoperable And AI World

Forbes

5 days ago

  • Health


David Lareau is CEO of Medicomp Systems, which makes medical data relevant, usable and actionable.

Artificial intelligence (AI) systems have seemingly taken healthcare, and perhaps the world, by storm, showing great promise for revolutionizing care, research, workflows and more. As the industry evaluates what is AI hype versus true potential, one thing about AI has become clear: it generates a massive amount of data, which for healthcare creates both opportunities and challenges.

Now that interoperability pipes are live and data is flowing between healthcare organizations, providers are managing a tsunami of incoming data. With AI and large language models (LLMs), it has become easier to generate even more data, so the problem has only worsened. And unlike a regular tsunami, healthcare's giant wave of incoming data is not a one-time event; the data will continue to accumulate. What are healthcare organizations to do to keep from drowning in it all?

Despite early hopes that AI might solve data challenges, the bloom is now off that rose, and users are realizing that AI needs to be more reliable, accurate and trustworthy for critical healthcare tasks. Accurate clinical records are vital for patient care, reimbursement and operational efficiency. When physicians create a clinical note, they are responsible for ensuring the documentation is accurate and complete. With interoperability, there is now a wealth of additional clinical information coming from other clinicians, hospitals, labs, health information exchanges (HIEs) and other sources. While more information seems powerful on the surface, clinicians now need to figure out whether the incoming data is accurate and which pieces are relevant for the patient in front of them. In other words, if clinicians are responsible for the incoming tsunami of data, how can they efficiently verify and manage it all?

Consider the impact of AI-assisted documentation tools, which aim to optimize workflows, reduce documentation time and relieve physician burnout. Their output, however, is only as good as the data fed into the systems. If a healthcare organization builds AI initiatives on a repository filled with errors or gaps, the resulting output will be flawed. Transparency is thus essential when it comes to AI data: when documentation is created through AI, healthcare providers need to understand the source of truth behind an assessment or recommendation so they can quickly identify potential errors that might impact patient care.

Another limitation of conversational AI is its inability to create structured data. While these tools may be excellent at producing narrative text, they often fail to create the structured clinical data needed for analytics, regulatory compliance and quality metrics. AI-assisted documentation tools also rely heavily on summarization, which can be helpful when trying to quickly make sense of dozens of multi-page medical records. LLMs, however, are trained to predict the next likely word, with very little reasoning behind them. To drive accuracy, these models should be trained on expertly curated clinical content using advanced algorithms. AI is also imperfect for coding, especially when codes are derived from erroneous or incomplete documentation. Incorrect diagnosis and procedure codes can create downstream problems, including denied claims, inaccurate reimbursement and inappropriate follow-up care.

Healthcare enterprises using AI to generate content that becomes part of a patient's medical record should consider the following:

1. Who is responsible for the final review and approval of AI-generated content that becomes part of the medical record?
2. Where in the workflow does this happen, and how are changes or corrections made and communicated to all members of the care team?
3. Organizations should consult with clinical users to agree on how information is presented, enabling those who sign off on care plans to view the documentation and, if necessary, act on the information presented to them.
4. Organizations should implement a feedback loop so users can alert system and workflow designers to issues with AI-generated content and suggest refinements and improvements.

To effectively manage today's tsunami of clinical data, especially in this era of interoperability and AI, clinicians need tools that can transform huge volumes of clinical data into a source of truth. This requires technologies that leverage trusted, curated clinical content to validate AI-generated outputs. To be useful for clinicians at the point of care, these tools should also sort and filter data and create structured, discrete information that is mapped to standard terminologies, value sets and reporting formats.

Healthcare's data tsunami is not going away. However, by embracing tools that pair with AI outputs to improve data quality, clinicians can gain rapid access to accurate, actionable data that enhances patient care.

Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives. Do I qualify?
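As an illustration of the structured-data point above, here is a minimal, hypothetical sketch of validating AI-extracted findings against curated terminology content. The lookup table, types and function names are assumptions for illustration, not Medicomp's or any vendor's actual API; the two ICD-10-CM codes shown are standard, but a real system would rely on full terminology services and value sets.

```typescript
// Minimal sketch: pair AI-generated narrative with curated terminology content.
// Hypothetical types and data; not a real vendor API.

interface CodedConcept {
  term: string;        // normalized clinical term
  system: "ICD-10-CM"; // target code system for reporting
  code: string;        // structured code for analytics and quality metrics
}

// Tiny stand-in for an expertly curated terminology service.
const curatedTerminology: Record<string, CodedConcept> = {
  "type 2 diabetes mellitus": { term: "Type 2 diabetes mellitus", system: "ICD-10-CM", code: "E11.9" },
  "essential hypertension":   { term: "Essential (primary) hypertension", system: "ICD-10-CM", code: "I10" },
};

interface ValidationResult {
  structured: CodedConcept[]; // findings that mapped to trusted content
  needsReview: string[];      // AI output a clinician must confirm or correct
}

// Findings that map to curated content become structured, discrete data;
// anything that does not is routed to clinician review and approval.
function validateAiFindings(aiFindings: string[]): ValidationResult {
  const structured: CodedConcept[] = [];
  const needsReview: string[] = [];
  for (const finding of aiFindings) {
    const concept = curatedTerminology[finding.trim().toLowerCase()];
    if (concept) structured.push(concept);
    else needsReview.push(finding);
  }
  return { structured, needsReview };
}

// Example: two findings map cleanly; the third is flagged for human review.
const result = validateAiFindings([
  "Type 2 diabetes mellitus",
  "Essential hypertension",
  "possible early heart failure", // vague AI phrasing, not in curated content
]);
console.log(result);
```

The design choice this sketch reflects is the article's: mapped output becomes coded data suitable for analytics and quality reporting, while unmapped output goes back to a clinician for the final review-and-approval step.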
