Latest news with #IBMResearch
Bloomberg
a day ago
- Business
- Bloomberg
IBM Announces Quantum Milestone
Jerry Chow, IBM Fellow and Director of Quantum Systems at IBM Research, explains the company's latest milestone in its effort to build the world's first fault-tolerant quantum computer. Chow speaks with Caroline Hyde on 'Bloomberg Tech.' (Source: Bloomberg)


Business Insider
13-05-2025
- Business
- Business Insider
IBM Announces 'Content Aware' Storage Systems for AI
Tech company International Business Machines (IBM) announced in a blog post that its IBM Research and IBM Storage business units are teaming up to make storage systems 'content aware,' meaning they will be able to automatically understand and process the data they hold. This shift could make it much easier for businesses to access and use the vast amounts of stored data, such as sales reports, manuals, and financial documents, that are currently trapped in formats AI can't easily use.

Today's common method, called retrieval-augmented generation (RAG), involves pulling data out of storage, converting it into searchable vectors, and storing it in a separate vector database (a minimal code sketch of this conventional flow follows the article). However, this approach creates challenges with security, data freshness, cost, and system complexity. To solve this, IBM is introducing content-aware storage (CAS), which combines data processing pipelines, vector databases, and compute systems directly into the storage platform. Instead of acting as passive data containers, these new systems actively help with AI tasks like extracting data and embedding it for use in search or generative AI applications. CAS also improves security and performance by integrating access controls and processing triggers within the storage system itself, so vectors stay updated automatically. For search tasks, IBM is building scalable systems that can process up to 10 billion vectors in real time with 90% accuracy, allowing businesses to find rare and critical documents quickly.

What Is the Target Price for IBM?

Turning to Wall Street, analysts have a Moderate Buy consensus rating on IBM stock based on eight Buys, five Holds, and two Sells assigned in the past three months. Furthermore, the average IBM price target of $259.92 per share implies 3.1% upside potential.
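To make the RAG-versus-CAS contrast concrete, here is a minimal, hypothetical sketch of the conventional flow the article describes: documents are pulled out of storage, turned into vectors, and kept in a separate vector store that must be queried and kept in sync. The toy embed function, sample documents, and cosine-similarity search are illustrative assumptions, not IBM's implementation.

```python
# Illustrative sketch of the conventional RAG pipeline the article contrasts with
# content-aware storage (CAS). The toy embedding, documents, and similarity search
# are assumptions for demonstration only.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words vector (stand-in for a real embedding model)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Step 1: pull documents out of storage and convert them into searchable vectors,
# held in a separate "vector database" (here, just a list).
documents = [
    "Q3 sales report for the EMEA region",
    "maintenance manual for the model X pump",
    "annual financial statement and audit summary",
]
vector_db = [(doc, embed(doc)) for doc in documents]

# Step 2: at query time, embed the question and retrieve the closest documents,
# which would then be passed to a generative model as context (omitted here).
def retrieve(query: str, k: int = 2):
    q = embed(query)
    ranked = sorted(vector_db, key=lambda item: cosine(q, item[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

print(retrieve("where is the sales report"))
```

The separate vector_db copy in this sketch is exactly what the article says CAS is meant to eliminate: by running extraction and embedding inside the storage platform, the vectors stay consistent with the underlying documents without an external pipeline.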


Forbes
29-03-2025
- Science
- Forbes
Next Phase: Intuitive AI That Attempts To Mimic The Human Psyche
Can artificial intelligence eventually mimic human intuition? And is that a good thing? Intuition has fueled many a business or personal life decision, and there is plenty of evidence to suggest that it's a fairly powerful and accurate tool. It taps into and selects from a vast wellspring of information in one's brain. As a recent podcast with neuroscientist Joel Pearson illustrates, intuition involves more than just "tapping into any unconscious information. It's the learned information. So when we go about our lives, our brains are processing thousands of things, and we're only conscious of a tiny bit of that. We have no idea what our brains are processing most of the time."

Intuitive AI, which can sense and respond to many seen and unseen factors, may represent the next phase of the technology. With the advent of machine learning and generative AI, there has been excitement about its productivity potential. The next frontier may be what Ruchir Puri, chief scientist at IBM Research and IBM Fellow, describes as "emotional AI." Human intelligence, he noted, "encompasses multiple dimensions – IQ or intelligence quotient, EQ or emotional quotient, and RQ or relational quotient. So far, AI has primarily only mastered IQ."

"EQ helps humans understand and manage emotions, while RQ shapes how we build relationships," Puri explained. "These are the next frontiers for AI development – systems that recognize, interpret and respond to human emotions beyond just sentiment analysis." Emotional AI may even "become one of the most significant cultural turning points of our time," he continued. "Machines capable of understanding, responding to and generating emotions will reshape how society and businesses function, with AI working alongside humans in a profoundly integrated way."

The IQ of AI will definitely keep growing as well, and we'll soon see AI with an IQ of 1,000,000, as described by Emmy Award-winning producer Ryan Elam, founder and CEO of LocalEyes Video Production. "At some point, AI will reach a level of intelligence so far beyond human cognition that it will no longer be comprehensible to us," Elam predicted. "A machine with an IQ of 1,000,000 wouldn't just solve problems faster; it would perceive and define reality differently. These ultra-intelligent AIs may discover scientific laws we don't even have the cognitive framework to understand, essentially operating as alien minds among us. The challenge won't be building them—it will be figuring out how to interpret their insights."

Wrap this into a future in which "our most intimate signals -- heart rate, body temperature, microexpressions, and subtle voice shifts -- are openly accessible," said Dr. Zulfikar Ramzan, chief technology officer at Point Wild. "In this world, AI, once celebrated for mastering highly analytical domains like Chess, Go, and even protein folding, can elevate – or wreak havoc upon – the concept of emotional intelligence."

Most of the required technology already exists, Ramzan continued. "High-resolution and high-frame-rate cameras, remote photoplethysmography, thermal imaging, radar-based skin conductivity sensing, and sensitive microphones can capture signals that we once thought private: real-time pupil size, subtle color changes in skin caused by blood flow, microexpressions, skin temperature, sweat gland activity from a distance, voice prosody."

AI can merge these data streams "and analyze video, images, and speech to transform ostensibly hidden signals into a cogent narrative about the inner workings of the people around us. We can literally read the room."

Ramzan illustrates how this could work in business settings. "Imagine negotiating a deal when AI notes your counterpart's pupils widen at a specific price point -- signaling non-verbal interest that could pivot the conversation," he said. "Picture delivering a presentation, but getting instant feedback on audience engagement. Suddenly, those who persistently struggle to interpret non-verbal cues are on nearly equal footing with the most preternaturally gifted, empathetic, charismatic social chameleons."

Getting to more intuitive or emotional AI requires a more fluid user interface, to the point where people do not realize they're talking to machines (though, hopefully, they will still be aware that they are).

"Too often, AI impresses in carefully curated demos or cherry-picked case studies, but struggles in real-world use," said founder and CEO Anastasia Georgievskaya. "People end up spending 15 to 20 minutes trying to make it work, or even an hour refining prompts, just to get a decent result." This frustration, she continued, "comes from a fundamental limitation: we're trying to communicate highly complex, contextual thoughts through simple text prompts, which just isn't efficient. Our thoughts are richer, more layered than what we can type out, and that gap between what we mean and what AI understands leads to underwhelming results."

Once we move beyond prompting and text commands, "the real innovation will happen," said Georgievskaya. "I see a future where we can leverage neurotechnology to express intent without language: AI that doesn't wait for us to spell things out, but instead picks up on our thoughts, emotions, and context directly, making interactions far more intuitive."

"Take skincare recommendations. Instead of typing, 'I want something lightweight with vitamin C,' AI could already know," said Georgievskaya. "It could sense your emotional reactions, subconscious preferences, even remember which influencer's review you engaged with. It might recognize that you're drawn to certain textures or packaging – without you needing to say a word. Within a few years, AI may no longer ask what we want – it will simply understand."