Latest news with #Git


Business Upturn
5 days ago
- Science
- Business Upturn
Top 10 Essential Skills for Aspiring Data Scientists in 2025
In the information age, businesses and governments alike have come to recognise data as one of their most potent resources. Data science methods now drive everything from consumer behaviour prediction to improved healthcare diagnostics. Data science skills are an invaluable career asset, because demand for data professionals shows no signs of slowing as we approach 2025. But what exactly does it take to succeed as a data scientist in today's competitive landscape? Whether you're a student, a working professional planning a career switch, or simply intrigued by the data boom, understanding how to become a data scientist starts with mastering the right blend of technical, analytical, and interpersonal skills. Here are the top ten skills every aspiring data scientist needs to learn:

Programming and Scripting Languages
Programming is a key requirement for every aspiring data scientist. Data scientists spend most of their professional time writing code that processes data, generates statistical analyses, and builds machine learning models. Python and R are the dominant programming languages in the field, and SQL is essential for querying databases. Knowledge of Git, Jupyter Notebooks, and basic software development principles further strengthens your work. Mastering programming is a fundamental requirement rather than a helpful addition: it is the tool you use to extract data from sources, refine it, and develop models.

Mathematics and Statistics
The foundation of data science is mathematical. The field depends heavily on probability, linear algebra, and calculus, which underpin the algorithms used to discover patterns in data. Statistics help in:
- Hypothesis testing
- Sampling
- Regression analysis
- Confidence intervals
Without these tools, viable insights and valid machine learning models are impossible to obtain.

Data Wrangling and Cleaning
Real-world data arrives full of errors, with missing or inconsistent values throughout. Data wrangling is the essential process of transforming raw data into a format ready for analysis. This includes (see the sketch after this list):
- Handling missing values
- Encoding categorical variables
- Normalising datasets
- Dealing with outliers
Even sophisticated models yield poor results when they operate on unclean data. Data wrangling gives your algorithms the strong foundation they require.
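To make those wrangling steps concrete, here is a minimal sketch using pandas. The dataset and column names (age, city, income) are hypothetical, and a real pipeline would tune each step to the data at hand:

```python
import pandas as pd

# Hypothetical raw dataset with the usual real-world problems:
# missing values, a categorical column, and an extreme outlier.
df = pd.DataFrame({
    "age": [25, None, 34, 29, 41],
    "city": ["Pune", "Delhi", "Delhi", None, "Mumbai"],
    "income": [480000, 520000, 6_000_000, 450000, 510000],
})

# Handle missing values: impute the median for numeric columns,
# the most frequent value for categorical ones.
df["age"] = df["age"].fillna(df["age"].median())
df["city"] = df["city"].fillna(df["city"].mode()[0])

# Encode categorical variables as one-hot indicator columns.
df = pd.get_dummies(df, columns=["city"])

# Deal with outliers: clip income to the 1st-99th percentile range.
low, high = df["income"].quantile([0.01, 0.99])
df["income"] = df["income"].clip(low, high)

# Normalise numeric features to the [0, 1] range.
for col in ["age", "income"]:
    df[col] = (df[col] - df[col].min()) / (df[col].max() - df[col].min())

print(df)
```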
Machine Learning and Deep Learning
Data scientists build and deploy machine learning models to solve complicated problems in their field. As automation and predictive analytics become mainstream, proficiency in the following is highly valuable:
- Supervised and unsupervised learning
- Neural networks
- Decision trees
- Support vector machines
- Ensemble methods like random forests and XGBoost
Frameworks such as Scikit-learn, TensorFlow, and PyTorch let you construct and optimise models that translate into practical solutions.

Data Visualisation and Storytelling
Specialised technical ability must be paired with communication skill, which means being proficient at delivering results. Data visualisation tools help translate complex analysis into visuals that stakeholders can understand and act upon. Popular tools include:
- Tableau
- Power BI
- Matplotlib and Seaborn (Python)
The ability to tell stories with data is a fundamental yet underestimated capability, and it is what connects technology-focused teams with organisational leadership. The sketch below brings these two sections together: training an ensemble model and visualising what it learned.
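As a minimal illustration of the two skills above, the following toy example trains a random forest on scikit-learn's built-in Iris dataset and plots its feature importances with Matplotlib; it is a sketch for learning purposes, not a production workflow:

```python
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Load a small, well-known dataset and hold out a test split.
data = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, test_size=0.25, random_state=42
)

# Train an ensemble model (random forest) and check its accuracy.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print(f"Test accuracy: {model.score(X_test, y_test):.2f}")

# Visualise which features drove the model's decisions: this is the
# storytelling step that makes the result legible to stakeholders.
plt.barh(data.feature_names, model.feature_importances_)
plt.xlabel("Feature importance")
plt.title("Random forest feature importances (Iris)")
plt.tight_layout()
plt.show()
```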
Cloud Computing and Deployment
Local machines can no longer keep up with the volume of big data generated today. Modern data infrastructure depends on cloud platforms operated by Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure. Skills to master:
- Using cloud-based notebooks
- Data pipeline creation
- Deploying machine learning models
- Serverless computing
Cloud fluency is a mandatory skill in 2025.

Big Data Technologies
Large-scale data processing is an essential part of the data scientist role. Tools such as:
- Apache Hadoop
- Apache Spark
- Kafka
- Hive
operate efficiently on data volumes reaching terabytes and petabytes. Data scientists who learn these tools can take positions in enterprise environments handling data beyond traditional processing capability.

Domain Knowledge
Technical excellence is of little use without an understanding of the surrounding business context. Whether you're working in healthcare, finance, e-commerce, or manufacturing, understanding the domain:
- Enhances the relevance of models
- Improves decision-making
- Helps in identifying key variables and performance indicators
Knowledge of domain-specific requirements, such as compliance and risk modelling, will distinguish your abilities in fintech applications. Domain knowledge is what turns technical professionals into business problem-solvers.

Communication and Collaboration
Mastering data science skills is no substitute for teamwork: data scientists must collaborate across teams and departments, regularly partnering with product managers, engineers, marketing teams, and executives. Key interpersonal skills include:
- Clear written and verbal communication
- Active listening
- Adaptability
- Presentation skills
Data-driven companies need people who can explain complex models to audiences without technical backgrounds.

Lifelong Learning and Certification
Data science evolves quickly, and today's technological breakthrough soon loses its novelty. Success demands a firm commitment to continuous learning. A structured data science course is one of the best ways to gain systematic knowledge, keep pace with new tools, and build a professional network. Certifications, projects, and hands-on practice help validate your expertise and improve job readiness.

Data Scientist Salary in India
Part of the field's appeal is its generous compensation. Indian companies determine data scientist salaries through a combination of skills and work experience. On average:
- Entry-level data scientists earn ₹6–10 LPA
- Mid-level professionals make ₹10–20 LPA
- Senior data scientists and specialists earn up to ₹25 LPA
Data scientists who specialise in NLP, computer vision, or cloud deployment technologies command the highest salary ranges.

Ready to Launch Your Data Science Career?
Start your learning path at Imarticus Learning and develop into the data scientist the industry demands in 2025. Begin your journey toward success by exploring Imarticus Learning's advanced data science training options today.

Yahoo
13-06-2025
- Business
- Yahoo
Trump's DOGE Initiative Slammed As 'Broke Humpty Dumpty' While Former Insider Sahil Lavingia Spills Shocking Details: No Salary, No Role, Chaotic Work Culture And Abrupt Exit After Just 55 Days
Silicon Valley entrepreneur Sahil Lavingia, founder of Gumroad, provides a rare inside look at the Department of Government Efficiency (DOGE), previously led by Elon Musk.

What Happened: On the Hard Fork podcast, recorded over a week ago, Lavingia discussed his 55-day stint at DOGE, embedded in the Department of Veterans Affairs. In the nearly hour-long podcast, Lavingia highlighted the chaotic and dysfunctional nature of DOGE since its early days. What was pitched as a bold effort to inject Silicon Valley know-how into the federal government instead turned into a disorganized and opaque operation, he says.

'There was no offer letter, no salary details, nothing,' he said. Weeks into the job, he still didn't know how much he was being paid: 'I assume it's zero,' he says. The White House did not immediately respond to Benzinga's request for comment.

Assigned to the VA without clarity on his role or direct reporting line, Lavingia was given a government laptop that couldn't run Python or Git. 'It was like being asked to cook with no equipment,' he said, describing the limits placed on DOGE engineers in federal environments. Despite the chaos, Lavingia said he intended to reduce inefficiencies without harming services to veterans. However, he noted DOGE's primary mandate focused on slashing contracts and reducing headcount, not shipping software or improving user experience.

Lavingia describes DOGE's internal culture as fundamentally at odds with the federal workforce. 'I joked I was here to RIF everybody,' he said, referring to the Reduction in Force initiative, and all he got in return was 'dead silence.' He says his time at the department came to an abrupt end after speaking with a journalist. 'I was ghosted,' he says; his access to GitHub was revoked without explanation. He concludes by saying that the effort was naive but well-intentioned. 'Mistakes were made,' he said. 'Hopefully, I was more the baby than the bathwater,' which is a play on the old saying 'Don't throw the baby out with the bathwater,' meaning one should not discard something valuable while trying to get rid of something undesirable.

Why It Matters: Donald Trump's DOGE initiative has come under heavy criticism in recent months, with the likes of Mark Cuban warning that 'this isn't a corporate turnaround, this is the United States of America.' Cuban argued that while he was in favor of improving efficiencies and cutting waste in government, he would approach it with a plan, and 'ready, fire, aim' is not a plan. Several policy experts, such as Erik Nisbet of Northwestern University, have slammed DOGE as 'very, very harmful,' and even 'extra-governmental.' Nisbet compared its effects to 'a broke Humpty Dumpty' and wondered if its damage could ever be undone. Meanwhile, Musk's brief tenure at DOGE was marked by aggressive job cuts across federal agencies, the dismantling of USAID, and a series of controversial communications, including 'fork in the road' resignation emails.
This article originally appeared on Benzinga.com. © 2025 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.


Techday NZ
12-06-2025
- Business
- Techday NZ
Harness launches IDP 2.0 to boost developer speed & security
Harness has launched version 2.0 of its Internal Developer Portal (IDP) with a suite of updates designed to improve software delivery speed, quality, security, and the overall developer experience at enterprise scale. The latest release builds on the Backstage framework, a Cloud Native Computing Foundation project, and rolls out new features directed at large organisations. The update targets issues seen in previous portals, such as managing complexity, ensuring scalability, and unifying fragmented developer experiences.

Enterprise-focused enhancements
Key additions in the updated Harness IDP include fine-grained Role-Based Access Control (RBAC), which provides tighter security and compliance. The RBAC system allows platform teams to specify exactly who can read, create, edit, delete, or execute particular services or workflows at a granular level. For companies in regulated sectors such as finance or healthcare, this is an essential tool for governance. The release notes state: "In large organisations, access boundaries must be explicit. IDP now supports entity-level granular Role-Based Access Control (RBAC), so platform teams can define exactly who can read, create, edit, delete, or execute a given service or workflow. For companies in regulated industries, like financial services or healthcare, this level of control is essential for compliance and risk management. Imagine a company whose services require restricted access due to regulatory constraints. With RBAC, only the compliance engineering team can modify those entities, while broader developer groups can view, but not alter, related documentation or dependencies."

The updated portal also integrates real-time Git synchronisation, with webhooks triggering immediate updates when YAML configuration files change. This eliminates the need for polling or manual refreshing, while OAuth and central tokens provide flexible authentication across all major Git providers. Harness explains, "For organisations with large, distributed engineering teams, keeping the developer portal in sync with Git can quickly become a bottleneck. Harness IDP now supports real-time updates via webhooks when YAML config files are updated in Git, eliminating the need for polling or manual refreshes. Teams can also make edits directly in the portal UI and push those changes back however they prefer, either directly or through a pull request. Authentication is flexible, with support for OAuth and central tokens, and it works with all major Git providers." A generic sketch of this webhook pattern follows.
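To make the webhook-driven sync pattern concrete, here is a minimal, generic receiver sketch, not Harness's actual implementation. It assumes a GitHub-style push payload (the "repository" and "commits" fields; other Git providers use different shapes) and re-syncs any YAML files touched by a push:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def resync_catalog_entry(repo: str, changed_files: list[str]) -> None:
    # Placeholder: a real portal would re-read the YAML config for the
    # affected entities and update its catalog store immediately.
    for path in changed_files:
        if path.endswith((".yaml", ".yml")):
            print(f"re-syncing {repo}:{path}")

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the Git provider's push-event payload.
        length = int(self.headers.get("Content-Length", 0))
        event = json.loads(self.rfile.read(length) or b"{}")
        repo = event.get("repository", {}).get("full_name", "unknown")
        changed = [
            f
            for commit in event.get("commits", [])
            for f in commit.get("added", []) + commit.get("modified", [])
        ]
        resync_catalog_entry(repo, changed)
        self.send_response(204)  # acknowledge the webhook immediately
        self.end_headers()

if __name__ == "__main__":
    # Point the Git provider's webhook at http://<host>:8080/
    HTTPServer(("", 8080), WebhookHandler).serve_forever()
```

The design point is that the Git provider pushes notifications to the portal, so the catalog never has to poll repositories on a timer.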
Usability for large teams
The new release aligns entity organisation with Harness's native platform hierarchy, mapping catalog entities and workflows to real-world structures such as project, organisation, or account-level groupings. This enables tailored visibility for teams according to their functional or geographic scope, reducing clutter and the risk of error. Harness states, "Harness IDP now aligns with our native platform hierarchy, enabling teams to create catalog entities and workflows at the Project, Organisation, or Account level. This mirrors how engineering teams are actually structured - by product line, business unit, or geography - so developers see only what's relevant to their work."

In addition to architectural changes, the user interface of the portal's catalog has been redesigned for greater clarity and efficiency. Developers can now filter services by what is relevant to them, such as ownership or technology, and view integrated scorecards within the catalog. Metrics include service maturity, security compliance, and readiness for production. On this point, the company shared the comment, "With this release, the IDP catalog has been redesigned for speed, clarity, and scale. Teams can now filter views based on what matters to them, like services they own or APIs used across the organisation. Scorecards are now built directly into the catalog view, giving developers and platform teams immediate visibility into key metrics like service maturity, security standards alignment, and production readiness. Each entity page clearly shows scope, ownership, and references, making it easier for teams to stay organised and aligned."

Onboarding and automation
The update introduces a guided, form-based approach to creating and managing catalog entities, in addition to continued support for YAML-in-Git workflows. The shift is aimed at lowering barriers for engineers unfamiliar with configuration syntax and fostering wider platform adoption. The company remarked, "Catalog entities can now be created and managed directly through the Harness IDP UI using a guided, form-based experience – no YAML required. This removes a major barrier for developers unfamiliar with configuration syntax, making it easier for more teams to get started and contribute. For those who prefer a config-as-code workflow, the traditional YAML-in-Git approach is still fully supported."

For larger organisations that depend on automation, Harness IDP now offers additional APIs enabling automatic catalog entity creation, auto-discovery, CLI integration, and Terraform provider support. Harness noted, "While UI is great for onboarding or making quick updates, large-scale adoption often demands automation. Harness IDP now includes new APIs to create and manage catalog entities, unlocking use cases like auto-discovery, auto-population, CLI integration, and Terraform provider support. The existing Catalog Ingestion API remains unchanged and will continue to function as before." A purely illustrative sketch of such automation appears below.
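As a hypothetical illustration of API-driven catalog automation, the sketch below registers a discovered service against an invented endpoint. The URL, payload fields, token variable, and auth header are all assumptions made for the sketch, not Harness's documented API; consult the vendor's API reference for the real contract:

```python
import os
import requests

# Hypothetical endpoint and credentials; placeholders only.
API_URL = "https://portal.example.com/api/catalog/entities"
TOKEN = os.environ.get("PORTAL_API_TOKEN", "dummy-token")

def register_service(name: str, owner: str, project: str) -> None:
    """Create a catalog entity for a discovered service (illustrative)."""
    payload = {
        "kind": "service",
        "name": name,
        "owner": owner,
        "scope": {"project": project},  # mirrors project-level hierarchy
    }
    resp = requests.post(
        API_URL,
        json=payload,
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    # An auto-discovery job might loop over repositories and register each.
    register_service("payments-api", "team-payments", "checkout")
```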
Backstage plugin compatibility
The IDP continues to extend compatibility with Backstage open-source plugins and supports teams seeking to build bespoke plugins on the framework. The company wrote, "Harness IDP continues to extend the Backstage open-source framework, so teams can keep using the Backstage plugin ecosystem they already know, or build their own custom plugins. Best of both worlds!"

The company added: "With this major release of Harness IDP, we're redefining what an enterprise-grade internal developer portal can be. Extended from the Backstage framework and supercharged for scale, Harness IDP now delivers real-time Git synchronisation, true org-level hierarchy, API-first extensibility, and the most powerful RBAC system in the category. Whether you're supporting 100s or 1,000s of developers, Harness IDP gives platform teams the structure, speed, and control they need to transform developer experience at the enterprise level – without compromise."

The series of updates is positioned to support organisations looking to scale software delivery without increasing risk or operational overhead for developer and platform teams.

Arabian Post
09-06-2025
- Business
- Arabian Post
Canonical Ends Bazaar Hosting on Launchpad
Canonical will cease all Bazaar code hosting on its Launchpad platform in two stages, culminating on 1 September 2025. The legacy version control system, once the backbone of many Ubuntu-related projects, will see its web interface retired soon, followed by full removal of backend functionality. Users of Bazaar, including developers working with Ubuntu Engineering, must migrate to Git or other supported systems ahead of the deadline to preserve continuity.

Bazaar, created by Martin Pool and sponsored by Canonical, never matched Git in popularity. With its last stable release in 2016, it gradually lost traction among open-source communities. Today, Git has become the standard, hosting the vast majority of collaborative software development activity. Canonical itself acknowledged that maintaining Bazaar consumed significant development, operational, and infrastructure resources, resources now better allocated to modernising Ubuntu and Launchpad.

Launchpad's rollback of Bazaar support will begin with the immediate shutdown of the Loggerhead web frontend, used for browsing Bazaar code repositories. Canonical cited declining legitimate traffic, with much of the web interface usage now coming from scrapers and automated bots. At this stage, developers will still be able to interact with repositories via command-line tools, with pushes, pulls, and merges unaffected.

The second phase, starting 1 September 2025, will eliminate the Bazaar backend entirely. After this date, Launchpad will no longer host Bazaar repositories, meaning developers cannot push, pull, merge, or browse code via Bazaar. Canonical has urged all users to migrate their code before this shutdown to avoid service disruption.

Migration instructions have been made available on Ubuntu's Discourse platform and Launchpad's documentation site. The recommended method relies on native Bazaar-to-Git interoperability, using tools such as 'brz push', which converts Bazaar revisions into Git history. Users have reported this process to be slower but more reliable than older export-import methods. A minimal helper along these lines is sketched below.
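As a rough illustration of that recommended path, here is a small helper that wraps the migration commands with Python's subprocess module. It assumes the Breezy CLI ('brz') and Git are installed, and that 'brz push --lossy' into a Git repository converts the history as described in the migration guidance; verify the flag behaviour against your installed Breezy version before relying on it:

```python
#!/usr/bin/env python3
"""Sketch of a Bazaar-to-Git migration helper (illustrative only)."""
import subprocess
from pathlib import Path

def migrate_branch(bzr_branch: Path, git_repo: Path) -> None:
    # Create a bare Git repository to receive the converted history.
    subprocess.run(["git", "init", "--bare", str(git_repo)], check=True)
    # Convert and push the Bazaar revisions into Git history.
    # --lossy asks Breezy to emit plain Git commits rather than
    # round-trippable Bazaar metadata (assumption: check your version).
    subprocess.run(
        ["brz", "push", "--lossy", str(git_repo.resolve())],
        cwd=bzr_branch,
        check=True,
    )

if __name__ == "__main__":
    # Hypothetical local paths; substitute your own branch and target.
    migrate_branch(Path("my-bzr-branch"), Path("my-project.git"))
```

After the push completes, inspect the Git repository's log and tags to confirm the history survived before retiring the Bazaar branch.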
Not all Bazaar users are fluent with Git. In community discussions, one long-time developer lamented, 'I love the simplicity of bazaar/launchpad… I really do not get git.' Another emphasised the invaluable contributions of Jelmer Vernooij, the maintainer of the Breezy fork, describing his work as 'probably the most tangible act of generosity that can be made among strangers in the open source world'.

Ubuntu and Launchpad gained prominence through Bazaar because it was once the only version control system supported for packaging and PPAs. Over time, Git's features, performance, branching model, and ecosystem, spanning GitHub, GitLab, and Bitbucket, made it the clear choice for modern development. Canonical framed its decision to deprecate Bazaar as part of a broader effort to modernise development workflows. By reallocating resources from maintaining outdated infrastructure, Canonical intends to better support Ubuntu's core development and implement improvements to Launchpad as a whole.

Despite the shift, Bazaar will not disappear entirely from the world of open source. Users who wish to continue using Bazaar beyond Launchpad's support cutoff can host their repositories with services like GNU Savannah, which remains committed to Bazaar support. Breezy, the active fork of Bazaar, will also continue to receive maintenance, ensuring the version control system endures for those who prefer it.

The discontinuation of Bazaar on Launchpad marks a significant moment in the history of Ubuntu's development tools. Bazaar was once tightly integrated into Canonical's workflows for building DEBs, PPAs, snaps, and Ubuntu itself; its removal from Launchpad symbolises the retreat of niche version control systems in favour of universally supported tools. It speaks to broader shifts in software development culture, aligning Ubuntu with prevailing industry practices centred on Git.

Canonical has emphasised that Ubuntu Engineering will receive migration support, and developers with unique needs are encouraged to reach out via Launchpad's feedback channels or Matrix. The company aims to collaborate closely to remove reliance on Bazaar-specific integrations used in Ubuntu's engineering systems.

As the 1 September deadline approaches, developers must act swiftly to export their repositories. Migrating preserves revision histories, branches, and tags, ensuring continued project development. Those who delay risk losing remote access to their code and may face complex manual recovery efforts after Bazaar support ends.

Bazaar's sunset also underscores the dominance of Git in open-source workflows. According to the 2024 Stack Overflow developer survey, approximately 98 per cent of developers use Git, from hobbyists to large enterprises. Git's extensive ecosystem of CI/CD tools, integrations, and community support has entrenched it as the developer standard. Despite familiarity with Bazaar among legacy projects, the broader open-source ecosystem has migrated towards Git. Organisations seeking to maintain compatibility with the wider community, attract contributors, and leverage automated development pipelines will find Git essential. Enterprises still relying on Bazaar must re-evaluate their infrastructure and workflows to align with this reality.

Canonical's decision reflects both pragmatic resource allocation and alignment with community norms. By removing Bazaar support, it simplifies the development stack, reduces maintenance burden, and clarifies the path forward for Ubuntu's development ecosystem. While the transition brings uncertainty for long-time Bazaar users, structured migration pathways and continued community support via Breezy and alternative hosts offer continuity.

This move also signals potential future efforts by Canonical to deprecate other legacy services on Launchpad, focusing the platform on components with active developer and user bases. By streamlining services, Canonical may enhance Launchpad's relevance in contemporary software engineering workflows.


Hans India
04-06-2025
- Business
- Hans India
OpenAI Brings Internet-Enabled Codex to ChatGPT Plus Users
OpenAI has expanded the availability of its powerful AI coding assistant, Codex, making it accessible to ChatGPT Plus users. Previously exclusive to Enterprise, Team, and Pro users, Codex is now within reach for individual developers and smaller teams, offering advanced features for coding, debugging, and testing.

One of the most significant upgrades in this rollout is Codex's ability to access the internet during programming tasks. This enhancement enables the AI to perform more dynamic functions like installing dependencies, fetching resources, and interacting with live staging servers. However, in the interest of security, internet access is disabled by default. Users must manually enable it and can customize permissions by specifying which domains Codex may reach and which HTTP methods it can use (a toy illustration of this kind of allowlist appears at the end of this article).

Security remains a top priority. OpenAI has implemented systems to monitor for prompt injection attacks—attempts to trick the AI into performing unintended web actions. These safeguards help ensure that Codex's new online capabilities do not compromise safety or user control.

In a bid to improve accessibility and user convenience, OpenAI has also added voice input support. Developers can now issue spoken instructions to Codex, allowing for hands-free interaction and improved accessibility for users with disabilities or those who prefer voice commands over typing.

Codex has also become smarter in handling Git workflows. Previously, the AI would generate a new pull request with each change. Now, it can update existing ones, which streamlines collaboration and reduces repository clutter during ongoing development work.

Performance-wise, the assistant is also getting a behind-the-scenes boost. OpenAI has optimized setup scripts, enhanced support for iOS, and improved Codex's integration with GitHub. The sign-in process for users with social logins or single sign-on has been simplified by removing the two-factor authentication requirement.

Originally launched in May, Codex is embedded directly into ChatGPT and designed to handle a wide range of development tasks, from generating new features to fixing bugs and answering complex technical questions. Each project runs in an isolated environment (or sandbox) to maintain safety and prevent cross-task interference.

Codex is powered by a specialized version of OpenAI's o3 model, which has been trained using reinforcement learning on real-world software development projects. This allows the AI to generate code that closely mimics human writing styles, particularly in collaborative environments like pull requests.

To start using Codex in ChatGPT Plus, users can access it via the sidebar. Tasks are assigned by selecting the 'Code' option after entering a prompt. There's also an 'Ask' feature that lets users request help with specific parts of their codebase. Each task runs in its own dedicated workspace and typically takes between 1 and 30 minutes to complete, depending on complexity, with live progress updates visible throughout.

OpenAI's decision to open up Codex to a broader audience signals a push to empower more developers with advanced AI tools, making coding more efficient, interactive, and accessible.
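To illustrate the domain-and-method allowlist concept described above in code, here is a generic sketch. It models the idea of such permissions only; it is not OpenAI's actual configuration format or enforcement mechanism, and the class and example domains are invented for the illustration:

```python
from dataclasses import dataclass, field
from urllib.parse import urlparse

@dataclass
class EgressPolicy:
    """Hypothetical allowlist: permitted domains and HTTP methods."""
    allowed_domains: set[str] = field(default_factory=set)
    allowed_methods: set[str] = field(default_factory=lambda: {"GET"})

    def permits(self, method: str, url: str) -> bool:
        host = urlparse(url).hostname or ""
        # Allow exact matches and subdomains of each allowed domain.
        domain_ok = any(
            host == d or host.endswith("." + d) for d in self.allowed_domains
        )
        return domain_ok and method.upper() in self.allowed_methods

# Example: allow read-only access to a package index only.
policy = EgressPolicy(allowed_domains={"pypi.org"}, allowed_methods={"GET"})
print(policy.permits("GET", "https://pypi.org/simple/requests/"))   # True
print(policy.permits("POST", "https://pypi.org/upload/"))           # False
print(policy.permits("GET", "https://example.com/"))                # False
```

The default-deny shape of the check mirrors the article's point: nothing is reachable unless the user explicitly opens a domain and method.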