Qlik Widens Interoperable Data Platform With Open Lakehouse

Forbes, May 15, 2025

(Photo caption: The people of Inle Lake (called Intha), some 70,000 of them, live in four cities bordering the lake, in numerous small villages along the lake's shores, and on the lake itself. Photo by Neil Thomas/Corbis via Getty Images)
Software comes in builds. In basic terms, when source code is compiled and combined with its associated libraries into an executable format, a build is ready to run. The construction analogy extends directly to the data architecture that the code is composed of and draws upon. Because data architectures today are as diverse as the software applications above them, data integration specialists now have to work across complex data landscapes and remain alert to subsidence, fragility and leakage.
These software and data construct realities drive us towards a point where data integration, data quality control and data analytics start to blend. Key players in this market include Informatica, SnapLogic, Rivery, Boomi, Fivetran, Tibco, Oracle with its Data Integrator service and Talend, the latter now being part of Qlik.
Key differentiators in the data analytics and integration space generally manifest themselves in terms of how complex the platform is to set up and install (Informatica is weighty, but commensurately complex), how flexible the tools are from a customization perspective (Fivetran is fully managed, but less flexible as a result), how natively aligned the service is to the environment it has to run in (no surprise, Microsoft Azure Data Factory is native with Microsoft ecosystem technologies) and how far the data integration and analytics services on offer can be used by less technical businesspeople.
As this vast marketplace also straddles business intelligence, there are wider reputable forces at play here from firms including Salesforce's Tableau, Microsoft's Power BI, Google's Looker and ThoughtSpot for its easy-to-use natural language data visualizations. Where one vendor will tell us its dashboards are simplicity itself, another will stress how comprehensively end-to-end its technology proposition is. Generally cloud-based and often with a good open source heritage, the data integration, data quality and data analytics space is a noisy but quite happy place.
Looking specifically at Qlik, the company is known for its 'associative' data engine, which offers freeform data analytics that highlight relationships between datasets in non-linear directions without the need for predefined queries. It also offers real-time data pipelines and data analytics dashboards. The organization's central product set includes Qlik Sense, an AI-fuelled data analytics platform service with interactive dashboards that also offers 'guided analytics' to align users towards a standard business process or workflow; QlikView, a business intelligence service with dynamic dashboards and reports; and Qlik Data Integration (the clue is in the name) for data integration and data quality controls, with a web-based user interface that supports both on-premises and cloud deployments.
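The associative idea can be illustrated with a small, purely hypothetical sketch (the data and function names below are invented for illustration and are not Qlik code): selecting a value in one field immediately partitions every other field's values into 'associated' and 'excluded' sets, with no predefined query path.

```python
# Toy illustration of an associative data model: a selection in one
# field classifies values in every other field as associated
# (co-occurring with the selection) or excluded. Hypothetical data,
# not Qlik's engine or API.

rows = [
    {"region": "EMEA", "product": "Bikes", "year": 2024},
    {"region": "EMEA", "product": "Cars",  "year": 2023},
    {"region": "APAC", "product": "Bikes", "year": 2023},
    {"region": "APAC", "product": "Boats", "year": 2024},
]

def associate(rows, field, value):
    """Return associated and excluded values for every other field."""
    selected = [r for r in rows if r[field] == value]
    result = {}
    for other in rows[0]:
        if other == field:
            continue
        all_vals = {r[other] for r in rows}
        assoc = {r[other] for r in selected}
        result[other] = {"associated": assoc, "excluded": all_vals - assoc}
    return result

state = associate(rows, "region", "EMEA")
print(state["product"])  # Bikes and Cars associated; Boats excluded
```

Selecting 'EMEA' instantly shows that 'Boats' is excluded in the product field, without anyone writing a join or a WHERE clause up front; that bidirectional highlighting is the behavior the associative engine is known for.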
Qlik champions end-to-end data capabilities, meaning the tools here extend from the raw data ingestion stage all the way through to so-called 'actionable insights' (a term data analytics vendors swoon over), now underpinned and augmented by a new set of AI services. The company's AI-enhanced analytics and self-service AI services enable users to build customized AI models, which help identify key drivers and trends in their data.
Not historically as prominent in open source community contribution as some others (although a keen advocate of open data and open APIs, with news on open source Apache Iceberg updates in the wings), Qlik has been called out for the complexity of its pricing structure. From a wider perspective, the company's associative engine and its more unified approach to both data analytics and data integration (plus its self-service analytics capabilities) are probably the factors that set it apart.
'Qlik's analytics-centric origins and methodical, iterative portfolio development have made it the BI platform for data geeks and data scientists alike, but thankfully haven't made it overly conservative. The company has accelerated its product strategy in the past four years, adding data quality with the Talend acquisition and 'AI for BI' with the AutoML acquisition (originally Big Squid). These, plus modernization capabilities for customers who need it - Qlik Sense for accessibility to broader user bases, Qlik Cloud for an as-a-Service model… and the tools to migrate to them, make Qlik worth watching in today's increasingly data-driven and visualization-driven, AI-empowered enterprise market,' explained Guy Currier, a technology analyst at the Futurum Group.
Looking to extend its data platform proposition right now, Qlik has launched Qlik Open Lakehouse, a new, fully managed Apache Iceberg solution built into Qlik Talend Cloud. As explained here, a data lakehouse combines the structure, management and querying capabilities of a data warehouse with the low-cost benefits of a data lake. Apache Iceberg is an open source table format for managing large datasets in data lakes with data consistency. Designed for enterprises under pressure to scale faster, the company says Qlik Open Lakehouse delivers real-time ingestion, automated optimization and multi-engine interoperability.
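Iceberg's core idea is that a table is a set of immutable data files tracked by versioned metadata snapshots: every commit records a new snapshot, which gives atomic updates and 'time travel' reads of earlier table states. A minimal pure-Python sketch of that bookkeeping, a simplification for illustration rather than the real Iceberg format or API, looks like this:

```python
# Minimal sketch of Iceberg-style snapshot bookkeeping: each commit
# writes new immutable data files and records a snapshot listing the
# files that make up the table at that version. Illustrative only;
# the real Iceberg spec uses manifest files and a metadata tree.

class ToyIcebergTable:
    def __init__(self):
        self.snapshots = []   # one tuple of file names per commit
        self.files = {}       # file name -> rows (immutable once written)

    def commit(self, new_files):
        """Append new data files and record a snapshot atomically."""
        self.files.update(new_files)
        current = self.snapshots[-1] if self.snapshots else ()
        self.snapshots.append(current + tuple(new_files))

    def scan(self, snapshot_id=-1):
        """Read the table as of a given snapshot (time travel)."""
        rows = []
        for name in self.snapshots[snapshot_id]:
            rows.extend(self.files[name])
        return rows

t = ToyIcebergTable()
t.commit({"f1.parquet": [{"id": 1}]})
t.commit({"f2.parquet": [{"id": 2}]})
print(len(t.scan()))   # 2 rows at the latest snapshot
print(len(t.scan(0)))  # 1 row when reading the first snapshot
```

Because readers only ever see a complete snapshot, many engines can query the same table concurrently and consistently, which is what makes the multi-engine interoperability claims possible.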
'Performance and cost should no longer be a tradeoff in modern data architectures,' said Mike Capone, CEO of Qlik. 'With Qlik Open Lakehouse, enterprises gain real-time scale, full control over their data and the freedom to choose the tools that work best for them. We built this to meet the demands of AI and analytics at enterprise scale and without compromise.'
Capone has detailed the company's progression when talking to press and analysts this month. He explained that for many years, Qlik has been known for its visual data analytics services and indeed, the organization still gets customer wins on that basis.
'But a lot has happened in recent times and the conversation with users has really come around to gravitate on data with a more concerted focus. With data quality [spanning everything from deduplication to analysis tools to validate the worth of a team's data model] being an essential part of that conversation - and the old adage of garbage in, garbage out still very much holding true - the icing on the cake for us was the Talend acquisition [for its data integration, quality and governance capabilities] because customers clearly found it really expensive to cobble all the parts of their data estate together. Now we can say that all the component parts of our own technology proposition come together with precision-engineered fit and performance characteristics better than ever before,' said Capone.
Keen to stress the need for rationalized application of technologies so that the right tool is used for the appropriate job, Capone says the Qlik platform enables users to align services with specific tasks; software engineering and data management teams need not use a super-expensive compute function when the use case suits a more lightweight set of functions. He also notes that the company's agentic AI technology pervades 'throughout the entire Qlik platform'; this means that not only can teams use natural language queries to perform business intelligence and business integration tasks, they can also ask natural language questions about data quality to ensure an organization's data model is on target for veracity, timeliness and relevance.
But does he really mean any data tool openness in a way that enables customers the 'freedom to choose the tools' that work best for them?
'Absolutely. If a company wants to use some Tableau, some Informatica and some Tibco, then we think they should be able to work with all those toolsets and also deploy with us at whatever level works for the business to be most successful. Obviously I'm going to tell you that those customers will naturally gravitate to use more Qlik as they experience our functionality and cost-performance advantage without being constrained by vendor lock-in, but that's how good technology should work,' underlined Capone.
Freedom to choose your own big data tools and analytics engines sounds appealing, but why do organizations need this scope and does it just introduce complexity from a management perspective? David Navarro, data domain architect at Toyota Motor Europe, thinks this is 'development worth keenly watching' right now. This is because large corporations like his need interoperability between different (often rather diverse) business units and between different partners, each managing its own technology stack with different data architects, different data topographies and all with their own data sovereignty stipulations.
'Apache Iceberg is emerging as the key to zero-copy data sharing across vendor-independent lakehouses and Qlik's commitment to delivering performance and control in these complex, dynamic landscapes is precisely what the industry requires,' said Navarro, when asked to comment on this recent product news.
Qlik tells us that all these developments are an evolution of modern data architectures in this time of AI adoption. It is a period, the company says, in which the cost and rigidity of traditional data warehouses have become unsustainable. Qlik Open Lakehouse offers a different path: a fully managed lakehouse architecture powered by Apache Iceberg that the company claims delivers 2.5x-5x faster query performance and up to 50% lower infrastructure costs, while maintaining full compatibility with the most widely used analytics and machine learning engines. Qlik Open Lakehouse is built for scale, flexibility and performance, combining real-time ingestion, intelligent optimization and ecosystem interoperability in a single, fully managed platform.
Capabilities here include real-time ingestion at enterprise scale: a customer could, for example, ingest millions of records per second from hundreds of sources (cloud apps, SaaS platforms, ERP suites and mainframes) and plug that data directly into Iceberg tables with low latency and high throughput.
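High-throughput ingestion into a table format like Iceberg typically means buffering incoming records and flushing them as append-only micro-batches, trading a little latency for far fewer, larger files. The sketch below is a hypothetical illustration of that general pattern, not Qlik's actual pipeline:

```python
# Sketch of micro-batch ingestion: records arriving from many sources
# are buffered and flushed as append-only batches once a size
# threshold is reached. Each committed batch stands in for an Iceberg
# table append. Hypothetical pattern, not Qlik's implementation.

class MicroBatchIngester:
    def __init__(self, batch_size=3):
        self.batch_size = batch_size
        self.buffer = []
        self.committed_batches = []   # stand-in for Iceberg appends

    def ingest(self, record):
        """Buffer one record; flush when the batch threshold is hit."""
        self.buffer.append(record)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        """Commit whatever is buffered as one append-only batch."""
        if self.buffer:
            self.committed_batches.append(list(self.buffer))
            self.buffer.clear()

ing = MicroBatchIngester(batch_size=3)
for i in range(7):
    ing.ingest({"source": "erp", "seq": i})
ing.flush()  # flush the final partial batch
print(len(ing.committed_batches))  # 3 batches: sizes 3, 3 and 1
```

In a real pipeline the batch size and flush interval are the latency/throughput dials: smaller batches give fresher data, larger batches give fewer files for query engines to open.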
Qlik's Adaptive Iceberg Optimizer handles compaction, clustering and 'pruning' (removing irrelevant, redundant and often low-value data from a dataset) automatically, with no tuning required. Users can access data in Iceberg tables using a variety of Iceberg-compatible engines without replatforming or reprocessing, including Snowflake, Amazon Athena, Apache Spark, Trino and SageMaker.
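Iceberg makes this kind of optimization possible because each data file carries column-level min/max statistics, so a query can skip whole files whose value ranges cannot match, while compaction merges many small files into fewer large ones. A toy sketch of both ideas (illustrative only; real Iceberg keeps these statistics in manifest files):

```python
# Toy sketch of two Iceberg-style optimizations: file pruning via
# per-file min/max column statistics, and compaction of small files
# into larger ones. Illustrative data and thresholds.

files = [
    {"name": "a", "rows": [{"ts": 1}, {"ts": 3}]},
    {"name": "b", "rows": [{"ts": 5}, {"ts": 7}]},
    {"name": "c", "rows": [{"ts": 9}]},
]

def stats(f):
    """Min/max of the 'ts' column, as a file-level statistic."""
    vals = [r["ts"] for r in f["rows"]]
    return min(vals), max(vals)

def prune(files, lo, hi):
    """Keep only files whose [min, max] range overlaps the query range."""
    return [f for f in files if not (stats(f)[1] < lo or stats(f)[0] > hi)]

def compact(files, target_rows=3):
    """Merge small files until each holds at least target_rows rows."""
    merged, current = [], []
    for f in files:
        current.extend(f["rows"])
        if len(current) >= target_rows:
            merged.append({"name": f"compacted-{len(merged)}", "rows": current})
            current = []
    if current:  # leftover rows become one final (smaller) file
        merged.append({"name": f"compacted-{len(merged)}", "rows": current})
    return merged

print([f["name"] for f in prune(files, 4, 8)])  # only file "b" survives
print(len(compact(files)))                      # 3 small files become 2
```

The optimizer's job is to run this kind of maintenance continuously in the background, so queries touch the minimum number of well-sized files without users hand-tuning anything.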
'Although clearly proficient across a number of disciplines including data integration, analytics and data quality controls, one of the challenges of Qlik and similar platforms is the limited scope for truly advanced analytics capabilities,' said Jerry Yurchisin, senior data science strategist at Gurobi, a company known for its mathematical optimization decision intelligence technology. 'This can mean that users have to take on extra configuration responsibilities or make use of an extended set of third-party tools. Data scientists, programmers, analysts and others really want one place to do all of their work, so it's important for all platforms to move in that direction. This starts with data integrity, visualization and all parts of the analytics spectrum - not just descriptive and predictive, but also prescriptive - which is arguably the holy grail for data management at this level.'
Director of research, analytics and data at ISG Software Research, Matt Aslett spends a lot of time analyzing data lakehouse architectures in a variety of cloud computing deployment scenarios. He suggests that products like Qlik Open Lakehouse, which use open standards such as Apache Iceberg, are 'well-positioned' to meet the growing demand for real-time data access and multi-engine interoperability.
'This enables enterprises to harness the full potential of their data for AI and analytics initiatives,' said Aslett. 'As AI workloads demand faster access to broader, fresher datasets, open formats like Apache Iceberg are becoming the new foundation. Qlik Open Lakehouse responds to this shift by making it effortless to build and manage Iceberg-based architectures, without the need for custom code or pipeline babysitting. It also runs within the customer's own AWS environment, ensuring data privacy, cost control and full operational visibility.'
In line with what currently appears to drive every single enterprise technology vendor's roadmap bar none, Qlik has also tabled new agentic AI functions in its platform this year. Here we find a conversational interface designed to give users an avenue to 'interact naturally' with data. If none of us can ever claim to have had a real-world natural data interaction, in this case the term refers to data exploration with the Qlik engine to uncover indexed relationships across data. The agentic functions on offer work across the Qlik Cloud platform, spanning data integration, data quality and analytics. It's all about giving businesspeople more intuitive visibility into data analytics for decision making.
Also new is an expanded set of capabilities in Qlik Cloud Analytics. These include functions to detect anomalies, forecast complex trends, prepare data faster and take action through what the company calls 'embedded decision workflows'.
'While organizations continue to invest heavily in AI and data, most still struggle to turn insight into impact. Dashboards pile up, but real-time execution remains elusive. Only 26% of enterprises have deployed AI at scale and fewer still have embedded it into operational workflows. The problem isn't access to static intelligence, it's the ability to act on it. Dashboards aren't decision engines and predictive models alone won't prevent risk or drive outcomes. What businesses need is intelligence that anticipates, explains, and enables action without added tools, delays, or friction. Discovery agent, multivariate time series forecasting, write table, and table recipe work in concert to solve a singular problem: how to move from fragmented insight to seamless execution, at scale,' said the company, in a product statement that promises to target 'critical enterprise bottlenecks' and close the gap between data and decisions.
The data integration, data quality, data analytics and AI-powered data services market continues to expand, but we can perhaps pick up on some defining trends here.
An alignment towards essentially open source technologies, protocols and standards is key, especially in a world of open cloud-native Kubernetes. Provision of self-service functionalities is also fundamental, whether they manifest themselves as developer self-service tools or as 'citizen user' abstractions that allow businesspeople to use deep tech… or both. A direct embrace of AI-driven services is, of course, a prerequisite now, as is the ability to provide more unified technology services (all firms have too many enterprise apps… and they know it) that work across as wide an end-to-end span as is physically and technically possible.
Qlik is getting a lot of that right, but no single vendor in this space can get everything absolutely comprehensively perfected it seems, so there will always be a need for data integration, even across and between the data integration space.


