
Tech Talks Daily

Neil C. Hughes
Latest episode

2336 episodes

  • How Phenom Is Using AI To Transform Hiring And Talent Intelligence

    2026-03-09 | 24 min.
    How can organizations use AI to transform hiring while still protecting the human element at the heart of work?
    In this episode of Tech Talks Daily, I sit down with Mahe Bayireddi, co-founder and CEO of Phenom, to explore how artificial intelligence is reshaping the way companies attract, hire, and develop talent. 
    Our conversation comes at an interesting moment for the company, following the announcement that Phenom has acquired Be Applied, an AI-driven cognitive assessment platform designed to validate candidate and employee capabilities at scale. The move follows an earlier acquisition of Included, an AI-native people analytics platform focused on delivering deeper workforce insights and faster decision making.
    Mahe shares how Phenom's long-term mission to help a billion people find the right job is evolving as AI becomes embedded throughout the HR lifecycle. From candidate discovery to onboarding and internal mobility, organizations are now experimenting with automation, personalization, and intelligent workflows that aim to improve both productivity and employee experience.
    One theme that runs throughout our discussion is how AI adoption in HR varies dramatically depending on geography, regulation, and industry. In Europe, regulatory frameworks are shaping how companies deploy automation. In the United States, state-level policies introduce additional complexity. Meanwhile, organizations across Asia are often approaching AI with entirely different priorities. As a result, many global companies are experimenting carefully, introducing AI into specific business units or regions before rolling it out more broadly.
    We also talk about a challenge that has caught many HR teams by surprise: the growing issue of fraudulent candidates and identity manipulation in the hiring process. As job applications become easier to submit and remote work expands global talent pools, organizations must rethink how they validate candidate identity and credentials. Mahe explains how AI-driven fraud detection tools can help highlight suspicious patterns while still keeping humans in the loop for final decisions.
    Another important point raised in the conversation is the need to preserve humanity in the workplace while introducing intelligent automation. While AI can dramatically improve efficiency across recruiting and workforce planning, Mahe believes HR leaders must be careful to ensure technology strengthens human potential rather than reducing people to data points in a system.
    Looking ahead, we discuss how organizations can begin adopting AI responsibly by starting small, focusing on high-impact areas, and building guardrails that reflect regional regulations and company culture. For many companies, the most successful path forward will involve testing AI within specific workflows, measuring outcomes quickly, and scaling what works.
    So as artificial intelligence becomes a central part of hiring, workforce planning, and employee development, the big question for leaders is this: can organizations use AI to create faster, smarter talent decisions while still keeping people at the center of the workplace experience?
  • How CISOs Can Earn Real Influence In The Boardroom With Rapid7

    2026-03-08 | 28 min.
    How does a CISO turn cybersecurity from a technical conversation into a business conversation that boards actually care about?
    In this episode of Tech Talks Daily, I sit down with Thom Langford, EMEA CTO at Rapid7 and a former CISO, to explore what he calls the second phase of cybersecurity leadership. For years, the industry worked hard to secure a seat at the boardroom table. In many organizations, that mission has largely succeeded. But as Thom explains, gaining access was only the first step. The real challenge now is communicating security in a way that drives meaningful business decisions.
    Thom shares why many CISOs still approach board conversations in the same way they did a decade ago, even though boardroom awareness of cybersecurity has changed dramatically. Today, many boards include members with cybersecurity knowledge or direct security experience. That means security leaders can no longer rely on technical jargon, complex frameworks, or compliance language to make their case.
    One of the most interesting insights from our conversation is the disconnect between how CISOs frame risk and what boards are actually focused on. While security teams often lead with risk reduction, boards tend to think in terms of revenue growth and operational costs. Thom argues that security leaders must learn to translate cybersecurity into the language of profit and loss if they want their message to resonate at the executive level.
    We also explore how traditional security tools such as risk frameworks, audits, and compliance standards can sometimes create distance rather than clarity in board discussions. Instead of helping executives understand security priorities, these models can obscure the real question boards are trying to answer: how secure are we, and what does that mean for the business?
    Another area we discuss is the growing role of tabletop exercises. Thom explains why these simulations are becoming one of the most effective ways for CISOs to demonstrate the real-world impact of security decisions. By walking executives through a realistic incident scenario, security leaders can show how security, operations, legal teams, and business priorities intersect during a crisis.
    Looking ahead, Thom believes the most successful CISOs will increasingly need to think like business leaders rather than purely technical specialists. Communication skills, relationship building, and understanding the organization's financial priorities may prove just as important as deep technical expertise.
    So if cybersecurity leaders have already earned their place in the boardroom, the next question becomes much more interesting. Are they speaking the language the board actually understands, or are they still trying to solve business problems using only security vocabulary?
  • How Shokz Is Leading The Rise Of Open-Ear Headphones

    2026-03-08 | 27 min.
    What if the next big shift in personal audio is not about blocking the world out, but staying connected to it?
    In this episode of Tech Talks Daily, I sit down with Nicole from Shokz to talk about why open-ear headphones are suddenly everywhere, and why this category is moving from niche curiosity to everyday essential. For years, the audio market was obsessed with sealing users off from the outside world. Now the conversation is changing. More people want to hear their music, podcasts, and calls without losing awareness of traffic, fellow commuters, colleagues, or the world happening around them.
    Nicole helps unpack what open-ear audio actually means in simple terms, and why it is resonating with runners, commuters, parents, office workers, and anyone trying to balance comfort, safety, and sound quality. We talk about the cultural shift behind this rise, from growing health and fitness habits to the way hybrid work and always-on lifestyles have changed how people use earbuds throughout the day.
    We also get into why Shokz has become one of the defining brands in this space. Long before open-ear audio became a trend, Shokz was investing in bone conduction, open-ear design, and the kind of product research needed to make this category work in real life. Nicole shares how years of persistence, technical innovation, and consumer education helped the company move from specialist player to category leader.
    During our conversation, we explore how real-world behavior shapes product design. That means thinking beyond audio specs and focusing on how headphones actually fit into daily life. Whether someone is running in the rain, commuting to work, wearing glasses, sitting in an office, or trying to stay aware while walking the dog, those everyday moments are shaping the next generation of audio devices.
    Nicole also talks me through some of Shokz's latest product thinking, including the OpenDots One and the OpenFit Pro. From compact clip-on designs that feel almost like wearable accessories to new approaches around noise reduction in open-ear listening, this episode looks at how the category is becoming more sophisticated and more versatile without losing the awareness that made it appealing in the first place.
    Looking ahead, we discuss whether open-ear audio will live alongside sealed earbuds as part of a two-device lifestyle, or whether it could eventually become the default choice for more people. We also touch on what comes next, from smarter audio experiences to the role AI and even connected glasses could play in the future of listening.
    So if you have been seeing the phrase open-ear audio more often and wondering what all the fuss is about, this conversation will bring it to life. Are open-ear headphones simply having a moment, or are we watching a bigger shift in how people want to hear the world around them?
  • d-Matrix - Ultra-low Latency Batched Inference for Gen AI

    2026-03-07 | 26 min.
    What happens when the real bottleneck in artificial intelligence is no longer training models, but actually running them at scale?
    In this episode of Tech Talks Daily, I sit down with Satyam Srivastava from d-Matrix to explore a shift that is quietly reshaping the entire AI infrastructure landscape. While much of the early AI race focused on training ever larger models, the next phase of AI adoption is increasingly defined by inference. That is the moment when trained models are deployed and used to generate real-world results millions of times a day.
    Satyam brings a unique perspective shaped by years of experience in signal processing, machine learning, and hardware architecture, including time spent at NVIDIA and Intel working on graphics, media technologies, and AI systems. Now at d-Matrix, he is helping design next-generation computing architectures focused on one of the biggest challenges facing the AI industry today: efficiently running large language models without overwhelming data centers with unsustainable power and infrastructure demands.
    During our conversation, we explored why the industry underestimated the infrastructure implications of inference at scale. While training large models grabs headlines, the real operational pressure often comes later when those models must serve millions of queries in real time. That shift places enormous strain on memory bandwidth, energy consumption, and data movement inside modern data centers.
    Satyam explains how d-Matrix identified this challenge years before generative AI exploded into the mainstream. Instead of focusing on training hardware like many AI startups at the time, the company concentrated on inference efficiency. That decision is becoming increasingly relevant as organizations begin to realize that simply adding more GPUs to data centers is not a sustainable long-term strategy.
    We also discuss the growing power constraints surrounding AI infrastructure, and why efficiency-driven design may be the only realistic path forward. With electricity supply, cooling capacity, and semiconductor availability all becoming limiting factors, the industry is being forced to rethink how AI systems are architected. Custom silicon, purpose-built accelerators, and heterogeneous computing environments are now emerging as key pieces of the puzzle.
    The conversation also touches on the geopolitical and economic importance of AI semiconductor leadership, and why the relationship between frontier AI labs, infrastructure providers, and chip designers is becoming increasingly strategic. As governments and companies compete to maintain technological leadership, the question of who controls the hardware powering AI may prove just as important as the models themselves.
    Looking ahead, Satyam shares his perspective on how the role of engineers will evolve as AI infrastructure becomes more specialized and energy-aware. Foundational engineering skills remain essential, but the next generation of engineers will also need to think in terms of entire systems, combining software, hardware, and AI tools to build more efficient computing environments.
    As AI continues to move from research labs into everyday products and services, are organizations prepared for the infrastructure shift that comes with an inference-driven future? And could efficiency, rather than raw computing power, become the defining metric of the next phase of the AI race?
  • How Scale Computing Is Powering The Next Wave Of Edge Infrastructure

    2026-03-07 | 21 min.
    How should businesses rethink infrastructure when applications, data, and users are increasingly spread across thousands of locations?

    In this episode of Tech Talks Daily, I sit down with Mark Cree, President and Chief Operating Officer at Scale Computing, to talk about why the future of enterprise infrastructure is moving closer to where data is actually created.
    This conversation was recorded following the 66th edition of The IT Press Tour, where some of the most interesting conversations in enterprise infrastructure centered on what happens when businesses move away from oversized, monolithic stacks and start focusing on practical, distributed solutions. From retail stores and airports to remote industrial sites, the edge is becoming a critical part of modern IT strategy.
    Mark shares how Scale Computing has spent years building an edge-first platform designed to run critical workloads reliably across everything from a single location to tens of thousands of distributed sites.
    Mark also reflects on his own journey through the technology industry, which includes founding companies acquired by Cisco and NetApp, working as a venture capitalist, and leading major storage initiatives at AWS. That experience gives him a unique perspective on how enterprise infrastructure has evolved, particularly as organizations reconsider the balance between centralized cloud environments and local processing closer to users and devices.
    During our conversation, we explore why edge computing is becoming increasingly important for AI workloads, especially when large volumes of data are generated outside traditional data centers. Mark explains how processing information locally can reduce costs, improve performance, and enable entirely new use cases, from monitoring customer behavior in retail environments to running intelligent systems in remote locations.
    We also talk about the ongoing reassessment happening across enterprise IT teams following major industry shifts, including changes in the virtualization market and growing concerns around vendor lock-in. Mark explains how Scale Computing is positioning itself as a flexible alternative by combining virtualization, containerization, networking, and security into a platform designed specifically for distributed environments.
    Looking ahead, Mark shares his perspective on where enterprise infrastructure is heading over the next five years. As smaller AI models become more capable and organizations seek greater control over their data and systems, the role of edge platforms may become even more important. Instead of relying solely on massive centralized environments, companies may find new value in distributing intelligence closer to the places where real-world activity happens.
    So as organizations rethink how they deploy applications, manage data, and control infrastructure, is the next big shift in enterprise IT happening right at the edge? And how prepared is your organization for that change?

More podcasts in News

About Tech Talks Daily

If every company is now a tech company and digital transformation is a journey rather than a destination, how do you keep up with the relentless pace of technological change? Every day, Tech Talks Daily brings you insights from the brightest minds in tech, business, and innovation, breaking down complex ideas into clear, actionable takeaways. Hosted by Neil C. Hughes, Tech Talks Daily explores how emerging technologies such as AI, cybersecurity, cloud computing, fintech, quantum computing, Web3, and more are shaping industries and solving real-world challenges in modern businesses.

Through candid conversations with industry leaders, CEOs, Fortune 500 executives, startup founders, and even the occasional celebrity, Tech Talks Daily uncovers the trends driving digital transformation and the strategies behind successful tech adoption. But this isn't just about buzzwords. We go beyond the hype to demystify the biggest tech trends and determine their real-world impact. From cybersecurity and blockchain to AI sovereignty, robotics, and post-quantum cryptography, we explore the measurable difference these innovations can make.

Whether improving security, enhancing customer experiences, or driving business growth, we also investigate the ROI of cutting-edge tech projects, asking the tough questions about what works, what doesn't, and how businesses can maximize their investments. Whether you're a business leader, IT professional, or simply curious about technology's role in our lives, you'll find engaging discussions that challenge perspectives, share diverse viewpoints, and spark new ideas. New episodes are released daily, 365 days a year.
Podcast website


