Tech Resources I love!

Preparing for technical interviews

Ace Every Stage of Your Next Technical Interview with these curated resources

Courses on Cloud, Data and AI

Step-by-step courses with hands-on experience and projects

Vertical AI is the New SaaS
Akhil Mohan

Vertical AI is the New SaaS

Sep 3 

Written By Akhil Mohan

How industry-specific AI is reshaping enterprise software just like SaaS transformed computing two decades ago

Twenty years ago, I witnessed the SaaS revolution firsthand. Companies were struggling with expensive on-premise software deployments, complex maintenance cycles, and rigid licensing models. Then came Salesforce, Workday, and ServiceNow—not just offering better software, but fundamentally reimagining how businesses consume technology. They didn't just digitize existing processes; they created entirely new paradigms around subscription models, cloud-native architectures, and continuous delivery.

Today, we're standing at a similar inflection point with Artificial Intelligence. But this time, the transformation isn't about moving from desktop to cloud—it's about moving from horizontal, general-purpose AI to Vertical AI: purpose-built intelligent systems designed for specific industries and workflows.

The Horizontal AI Foundation: Necessary but Not Sufficient

The current AI landscape is dominated by horizontal platforms—ChatGPT, Claude, Gemini, and their enterprise variants. These models are remarkable achievements, capable of reasoning across domains, generating content, and solving problems with unprecedented sophistication. They've democratized access to AI capabilities and proven the transformative potential of large language models.

However, much like the early days of computing when we had powerful but generic mainframes, horizontal AI faces inherent limitations when applied to specialized enterprise use cases:

Data Limitations: General models are trained on publicly available internet data, missing the proprietary datasets that drive real business value—medical records, legal precedents, financial transactions, manufacturing telemetry.

Compliance Gaps: Industries like healthcare (HIPAA), finance (SOX, GDPR), and legal services operate under strict regulatory frameworks that generic AI models weren't designed to navigate.

Workflow Friction: Horizontal AI requires users to adapt their processes to the tool, rather than embedding intelligence seamlessly into existing professional workflows.

Limited Context: Without deep domain knowledge, even the most sophisticated general models struggle with industry-specific nuances, terminology, and decision-making frameworks.


What Is the Vertical AI Revolution?

Vertical AI represents the natural evolution beyond horizontal platforms—AI systems built from the ground up for specific industries, trained on domain-specific datasets, and designed to integrate natively into professional workflows. The market data strongly supports the rising importance of vertical AI solutions: according to CB Insights' "AI 100: The most promising artificial intelligence startups of 2025" report, this year's vertical winners surpassed the other category winners, capturing over $1B in combined funding in 2025 year to date. The healthcare sector exemplifies this growth trajectory, with the global AI in healthcare market estimated at USD 26.57 billion in 2024 and projected to reach USD 187.69 billion by 2030, a CAGR of 38.62% (AI In Healthcare Market Size, Share | Industry Report, 2030).

Key Differentiators of Vertical AI:

1. Specialized Training Data: While horizontal AI models train on broad internet corpora, Vertical AI systems ingest industry-specific datasets such as clinical trial data for healthcare AI, case law databases for legal AI, and financial market data for fintech AI. This specialized training creates models that understand industry context, terminology, and decision-making patterns (a minimal fine-tuning sketch follows this list).

2. Regulatory Compliance by Design: Rather than retrofitting compliance, Vertical AI systems are architected with regulatory requirements as first-class constraints. Healthcare AI models are HIPAA-compliant from inception; financial AI systems are built with SOX controls and audit trails embedded.

3. Workflow-Native Integration: Instead of requiring users to switch contexts, Vertical AI embeds directly into the tools professionals already use—EMR systems for doctors, case management platforms for lawyers, trading platforms for financial professionals.

4. Domain-Specific Performance: By focusing on narrow use cases within specific industries, Vertical AI systems can achieve superhuman performance in their domains while maintaining explainability and auditability.
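To make the first differentiator concrete, here is a minimal, illustrative sketch of adapting a general-purpose checkpoint to a domain-specific classification task with the Hugging Face transformers and datasets libraries. The tiny in-line dataset, the labels, and the distilbert-base-uncased checkpoint are stand-ins for a real proprietary corpus, not a production recipe.

```python
# Minimal sketch: fine-tuning a general model on a (toy) domain-specific dataset.
# The two example records stand in for proprietary data such as clinical notes.
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

domain_data = Dataset.from_dict({
    "text": ["Patient presents with acute dyspnea and chest pain.",
             "Routine follow-up, vitals stable, no new complaints."],
    "label": [1, 0],  # 1 = urgent, 0 = routine (illustrative labels)
})

checkpoint = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=64)

train_ds = domain_data.map(tokenize, batched=True)

args = TrainingArguments(output_dir="vertical-ai-demo", num_train_epochs=1,
                         per_device_train_batch_size=2, logging_steps=1)
Trainer(model=model, args=args, train_dataset=train_ds).train()
```

In practice the gain comes less from the training loop itself than from the data behind it: the curated, domain-specific corpus is what a horizontal model never sees.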

Read More
How AI is Reshaping Consulting
Akhil Mohan

How AI is Reshaping Consulting

The consulting industry has long been the intellectual backbone of business transformation — guiding organizations through complexity using human expertise, frameworks, and playbooks. But today, something unprecedented is underway: the rise of Artificial Intelligence as both a disruptor and an amplifier of consulting.

The headlines are provocative: “Is consulting becoming irrelevant in the age of AI?”
The reality is the opposite. AI isn’t making consulting obsolete — it’s fundamentally reshaping how consultants create value, deliver outcomes, and engage with clients.

The bottom line: Consulting is shifting from a purely advisory service to a technology-enabled, implementation-focused discipline — one that delivers measurable impact faster than ever before.

Read More
5 AI Driven Cloud Defense: A Step-by-Step Guide
Akhil Mohan

5 AI Driven Cloud Defense: A Step-by-Step Guide

As organizations increasingly depend on the cloud, the playbook for cyber defense is undergoing a foundational shift. The sheer scale, complexity, and velocity of today’s threats mean traditional approaches can’t keep up. Artificial intelligence is a true game-changer in cloud security, unlocking smarter, faster, and more agile protections. Here’s how AI is remaking the future of cloud defense, step by step.

The 5 AI-Driven Moves Elevating Cloud Security – From smart detection to rapid resilience, these steps define tomorrow’s digital defenses.


1. Intelligent Threat Detection

AI instantly spots suspicious activity to keep threats out.

Modern cloud security starts with vigilance. AI powerfully analyzes activity across your cloud in real time, using advanced pattern recognition to catch anomalies and emerging threats other tools might miss. This means the earliest possible warning—so breaches can be stopped before they begin.
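As a rough illustration of this kind of pattern-based anomaly detection (not any specific vendor's product), the sketch below trains an Isolation Forest on synthetic cloud-activity features and flags outliers; the feature names and values are assumptions made for the example.

```python
# Illustrative sketch: flagging anomalous cloud activity with an Isolation Forest.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Synthetic baseline traffic: [api_calls_per_min, data_egress_mb, distinct_regions]
normal_activity = rng.normal(loc=[50, 20, 1], scale=[10, 5, 0.2], size=(1000, 3))

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal_activity)

new_events = np.array([
    [55, 22, 1],     # looks like normal traffic
    [900, 5000, 7],  # burst of calls, huge egress, many regions: suspicious
])
print(detector.predict(new_events))  # 1 = normal, -1 = anomaly
```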

2. Rapid AI Driven Response

AI launches defenses the moment danger is detected.

When milliseconds matter, speed is everything. AI doesn’t wait for manual intervention: as soon as a threat is flagged, it can instantly activate automated defenses, from locking accounts to isolating risky environments. This rocket-fast action transforms threat response from minutes to moments.

3. Automated, Real-Time Mitigation

Incidents are contained automatically, slashing response time.

Response isn’t just fast—it’s hands-free. AI-enabled automation triggers real-time containment and mitigation playbooks, stopping malware, reverting compromised changes, and restoring safety without waiting for human hands-on. This relentless automation keeps damage minimal and recovery swift.
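To show the shape of an automated containment playbook, here is a deliberately small dispatcher that maps alert types to mitigation steps. It is purely illustrative; the handler functions are hypothetical placeholders, not real cloud SDK calls.

```python
# Illustrative sketch: dispatching containment playbooks from AI-generated alerts.
# Handlers are hypothetical placeholders; real ones would call your provider's APIs.
from dataclasses import dataclass

@dataclass
class Alert:
    kind: str         # e.g. "credential_abuse", "malware", "data_exfiltration"
    resource_id: str

def lock_account(alert: Alert) -> str:
    return f"Disabled credentials tied to {alert.resource_id}"

def isolate_workload(alert: Alert) -> str:
    return f"Moved {alert.resource_id} to a quarantine network segment"

def revert_changes(alert: Alert) -> str:
    return f"Rolled back recent configuration changes on {alert.resource_id}"

PLAYBOOKS = {
    "credential_abuse": [lock_account],
    "malware": [isolate_workload, revert_changes],
    "data_exfiltration": [isolate_workload, lock_account],
}

def respond(alert: Alert) -> list[str]:
    """Run every containment step registered for this alert type."""
    return [step(alert) for step in PLAYBOOKS.get(alert.kind, [])]

print(respond(Alert(kind="malware", resource_id="vm-1234")))
```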

4. Dynamic Global Defense

AI adapts defenses everywhere—no matter where threats emerge.

Threats don’t respect borders, and neither does advanced cloud security. AI learns from attacks worldwide, continuously updating defenses for your entire cloud ecosystem. Whether risks appear in one region or another, your security posture adapts globally and instantly.

5. Laser-Focused Resilience

AI fortifies your cloud for faster recovery after any attack.

Resilience is everything when the stakes are high. Even if attackers get through, AI helps you bounce back quicker—guiding recovery efforts, restoring key systems, and focusing resources to where they’re needed most. This smart resilience limits impact and builds trust in your cloud journey.

In Summary

AI is changing the rules of cloud security. By combining sharp intelligence, automated speed, global adaptivity, and purposeful resilience, these five moves ensure your defenses are always one step ahead. That’s the power—and the promise—of the AI-driven cloud.

Read More
No Cloud, No AI Agents: How cloud powers AI
Akhil Mohan

No Cloud, No AI Agents: How cloud powers AI

The rise of Agentic AI represents a fundamental shift in how applications operate. No longer are intelligent systems siloed, static, or limited to following pre-baked rules. Today’s most advanced agents can make decisions, learn, adapt, plan, and act autonomously—all thanks to the immense power of the cloud. But how does this new class of Agentic AI actually harness the cloud? Let’s explore, using a “mission control” diagram to unpack the anatomy of autonomous, cloud-native intelligence.

The Mission Control Center: Agentic AI at the Core

At the heart of our diagram—and this new technological era—is the Agentic AI Brain. Think of it as mission control for autonomous cloud applications. It’s where goals are set, data is interpreted, and complex decisions are made in real time.

No Cloud, No AI Agents: Cloud Powers Agentic AI

Agentic AI Brain:
Mission control for autonomous cloud applications, orchestrating every action, memory recall, plan, and collaboration.

Everything else in the agent’s ecosystem is an extension of, or a resource for, its intelligence.

1. Cloud Functions: Empowering Real-World Action

First up on our “mission control” board are Cloud Functions. These are the agent’s toolset—a collection of ready-to-use, infinitely scalable programs that can be triggered on demand to do real work:

  • Action and Automation: Cloud functions translate the agent’s decisions into action—sending emails, analyzing images, or updating records.

  • Operating Environment: Instead of being tethered to a single device or server, these tools can execute anywhere in the cloud, scaling up or down as needed.

In essence: When an agent decides, “It’s time to act,” cloud functions are the hands that make it happen.
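As a concrete, deliberately simplified illustration, here is a minimal AWS Lambda-style handler; the event fields and the notify_user helper are assumptions made for this sketch, not part of the article's design.

```python
# Illustrative sketch: a cloud function that turns an agent's decision into an action.
import json

def notify_user(email: str, message: str) -> None:
    # Hypothetical placeholder; a real function would call an email or SMS service.
    print(f"Sending to {email}: {message}")

def lambda_handler(event, context):
    """Triggered on demand by the agent; 'event' carries the decided action."""
    action = event.get("action")
    if action == "send_reminder":
        notify_user(event["email"], event["message"])
        return {"statusCode": 200, "body": json.dumps({"status": "reminder sent"})}
    return {"statusCode": 400, "body": json.dumps({"status": f"unknown action {action}"})}

# Local usage example (in the cloud, the platform invokes the handler for you):
print(lambda_handler({"action": "send_reminder",
                      "email": "user@example.com",
                      "message": "Stand-up starts in 10 minutes"}, None))
```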

2. APIs & Data Lakes: Memory, Knowledge, Context

Intelligent autonomy demands more than brute force—it needs a memory and a way to learn. In our diagram, that’s what APIs & Data Lakes represent:

  • Knowledge Base: APIs let the agent pull in fresh information—weather reports, user profiles, real-time market data, and more.

  • Long-Term Memory: Data lakes serve as vast repositories where the agent’s experiences, logs, and learned models can be stored and recalled.

  • Integration: The ability to connect and combine disparate data sources is what gives agentic systems true context-awareness.

In practice: This is how an agent “remembers” your preferences, adapts to new info, and generates the right answer—every time.
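A toy sketch of the same idea: pull fresh context from an external API and append it to a simple file-based "data lake" so the agent can recall it later. The endpoint URL and the record layout are illustrative assumptions.

```python
# Illustrative sketch: fetch fresh context via an API, persist it to a "data lake".
import json
import pathlib
import urllib.request
from datetime import datetime, timezone

DATA_LAKE = pathlib.Path("data_lake/observations.jsonl")
DATA_LAKE.parent.mkdir(exist_ok=True)

def fetch_context(url: str) -> dict:
    """Knowledge base: pull fresh information the agent can reason over."""
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read())

def remember(record: dict) -> None:
    """Long-term memory: append the observation to the agent's data lake."""
    record["observed_at"] = datetime.now(timezone.utc).isoformat()
    with DATA_LAKE.open("a") as f:
        f.write(json.dumps(record) + "\n")

# Usage (the endpoint is a placeholder; swap in a real API that returns JSON):
# remember(fetch_context("https://api.example.com/weather?city=Seattle"))
```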

3. Scheduling Tools: Planning and Time Management

Autonomy isn’t just about acting now, but about knowing when (and in what order) to act. Scheduling Tools are the agent’s calendar, planner, and logistician:

  • Optimizing Tasks: They help agents schedule jobs, set reminders, balance workloads, and avoid conflicts—across users, systems, and services.

  • Coordination: Tasks can be rescheduled, chained, or repeated, allowing agents to handle complex workflows over time.

For the agent: This means turning a to-do list into a robust, ever-adaptive action plan.
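A minimal sketch of that planning idea (task names and due times are made up for the example): a priority queue that keeps the agent's to-do list ordered by when each task is due.

```python
# Illustrative sketch: a tiny scheduler that orders and re-plans the agent's tasks.
import heapq
from datetime import datetime, timedelta

now = datetime.now()
tasks = []  # min-heap keyed by due time

def schedule(due: datetime, name: str) -> None:
    heapq.heappush(tasks, (due, name))

schedule(now + timedelta(hours=2), "send weekly report")
schedule(now + timedelta(minutes=15), "refresh market data")
schedule(now + timedelta(days=1), "retrain recommendation model")

# Rescheduling is just pushing a new entry; chained work can enqueue follow-ups.
while tasks:
    due, name = heapq.heappop(tasks)
    print(f"{due:%Y-%m-%d %H:%M} -> {name}")
```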

4. Micro-agents: Collaboration and Specialization

The last piece is all about teamwork: Micro-agents. Instead of trying to be a jack of all trades, Agentic AI can delegate:

  • Specialized Sub-agents: Each micro-agent can focus on a particular function—one handles data cleaning, another books appointments, another negotiates with APIs.

  • Collaboration: The central agent coordinates, while micro-agents execute, report back, and even work together, dynamically forming teams as challenges arise.

The result: A cloud-native hive of intelligence, where expertise and responsibility are distributed for efficiency and resilience.
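Here is a deliberately small sketch of the coordinator-plus-specialists pattern; the agent classes and task types are illustrative, not taken from the article.

```python
# Illustrative sketch: a central agent delegating work to specialized micro-agents.
class DataCleaningAgent:
    def handle(self, payload: str) -> str:
        return payload.strip().lower()

class BookingAgent:
    def handle(self, payload: str) -> str:
        return f"Booked appointment: {payload}"

class Coordinator:
    """Mission control: routes each task to the micro-agent that specializes in it."""
    def __init__(self):
        self.team = {"clean": DataCleaningAgent(), "book": BookingAgent()}

    def delegate(self, task_type: str, payload: str) -> str:
        agent = self.team.get(task_type)
        if agent is None:
            return f"No micro-agent registered for '{task_type}'"
        return agent.handle(payload)

mission_control = Coordinator()
print(mission_control.delegate("clean", "  Quarterly REVENUE data  "))
print(mission_control.delegate("book", "dentist, Tuesday 3pm"))
```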

Why the Cloud is the Agent’s Perfect Home?

The entire architecture radiates outward from the agentic core, empowered at every step by the cloud:

  • Unlimited scalability: New tools, knowledge, and agents can be added seamlessly.

  • Always-on connectivity: Agents can tap global resources and operate 24/7.

  • Modularity: Each function—tools, memory, planning, collaboration—is a plug-and-play cloud service, making development and scaling simple.

Conclusion

As Agentic AI continues its rise, its strength will come not just from smarter algorithms, but from deeper integration with the cloud. The future belongs to self-directed applications—secure, scalable, and endlessly adaptive—operating from a “mission control” at the heart of the cloud.

The image you see above isn’t just a map; it’s a blueprint for the next generation of software.

Read More
Perplexity Comet AI Browser: The AI Agent That Will Change How You Work Online
Priyanka Vergadia

Perplexity Comet AI Browser: The AI Agent That Will Change How You Work Online

Aug 3 

Written By Priyanka Vergadia

Are you ready for a browser that does more than just display web pages? I just got early access to Perplexity's new Comet AI browser and I'm absolutely blown away by what it can do. Unlike Chrome, Comet acts as a true AI agent that can read your emails, schedule meetings, book restaurants, and manage complex workflows—all while you focus on what matters most.

What makes Comet different?

In my new video (watch below!), I show Comet creating LinkedIn posts automatically, turning a chaotic to-do list into smartly scheduled calendar events, and managing an entire video production workflow—all within one intuitive interface.

Read More
What is GitHub Spark: The Full Demo Inside
Priyanka Vergadia

What is GitHub Spark: The Full Demo Inside

Vibe coding on steroids with GitHub Spark. 🚀 GitHub Spark just changed everything about app development: GitHub has launched Spark, an AI-powered coding platform that turns natural language descriptions into fully functional web applications. No coding required, no setup headaches, and one-click deployment to production. This isn't just another AI coding assistant; it's a complete paradigm shift in how we build software. ⚡ What you'll learn: what GitHub Spark is and how it works, a live demonstration of building multiple apps with just natural language, why this matters for developers, designers, and entrepreneurs, an honest breakdown of pricing and limitations, and the future of AI-powered development. 🔥 Key highlights: full-stack applications generated from plain English ("vibe coding"), integration with Claude 3.5 Sonnet, GPT-4o, and other leading AI models, one-click deployment with enterprise-grade hosting, complete GitHub ecosystem integration, and real-time live previews with instant iteration. 💰 Pricing & access: currently available in public preview for GitHub Copilot Pro+ subscribers ($39/month), including 375 Spark messages, unlimited manual editing, hosting, and AI inference. 🛠️ Perfect for: rapid prototyping and MVP development, internal tools and personal projects, learning full-stack development concepts, non-technical founders validating ideas, and experienced developers eliminating boilerplate work.

Read More
Complete Beginners Guide to Hugging Face
Priyanka Vergadia

Complete Beginners Guide to Hugging Face

Hey everyone, and welcome back to my channel where we talk about cloud tech and AI. Today we're diving into a platform you must know about if you're doing anything with AI: Hugging Face. Hugging Face has been called the GitHub of machine learning, and for good reason; it is becoming the community where AI models and creations are shared with everybody. By the end of this video you'll understand exactly what it is and why it matters to you, even if you're not somebody who codes every day. So stick around.

Demo

This is the Hugging Face homepage. The first thing you'll notice is the tagline, "the AI community building the future," which really sums up what they're about: a collaborative platform where people share AI tools, models, datasets, and even AI apps. If you scroll down you can see featured models, trending models, and recently uploaded content, which gives you a taste of what's popular in the AI community. Before we dive deeper, I'd recommend signing up and creating an account. You can browse most of the content without one, but if you want to use the models, save your favorites, and so on, you will need an account.

Going into the Models tab, this is where all the models are found. There are millions of models here, and you can filter them by task categories like natural language processing, classification, audio, and tabular, and also by library, dataset, language, and license.

Let's check out Microsoft's popular Phi-4 reasoning model; I wanted to see how far I could go with it. Each model has its own page with documentation. I clicked Deploy, and it took me straight into the machine learning studio in Azure AI, where I was asked to create a workspace. As soon as I gave it the details, the name and so on, it created that workspace for me and deployed the Phi-4 model from Hugging Face into Azure AI Machine Learning Studio. That was absolutely amazing; it took just a few clicks, and you can see it being created now. Once it is created, I can go to the workspace, click on Endpoints, and go into the Azure OpenAI service, which is where my Phi-4 endpoint lives. If I want to use the endpoint, I click Continue, and that takes me into Azure AI Foundry, where I can experiment with the deployed model. It shows me my target URL, the key it created for me, how to call the endpoint with my API key, and some samples of how to use the model. I can also play with it in the playground and test it out. I gave it a prompt about a dog traveling to the mountains, where he meets a robot that is helping a bird survive in the cold, and they all become friends for life. I was just playing around with a prompt, but the idea is that you can go from looking at a model on Hugging Face to actually deploying that model in Azure AI Foundry and Azure AI Machine Learning Studio. I then clicked to deploy that endpoint as a web application, and that is what you're seeing now: an Azure web application being created as part of this deployment, right from the model and endpoint we just created. This takes seconds, maybe a minute or so, while the deployment assets are created. Once the app is deployed, I can see it in Azure AI Foundry under my web apps section. There it is, the Phi-4 experiment. I click on the app, and there we have it: an entire chat application, built by choosing the Phi-4 reasoning model on Hugging Face and letting Azure AI Machine Learning Studio and Foundry build it out for me as a web application.

Walkthrough

Going back to the Hugging Face interface, let's look at the Datasets tab. This is where all the datasets live; there are thousands of them. You can preview samples, and you can filter datasets by language, task, library, and more. Click on one and you can actually see samples of the dataset and start using it.

The next thing is one of my favorites: the Spaces section of Hugging Face. This is where things get really exciting, especially for non-coders. Spaces are interactive AI applications that anyone can use right within the Hugging Face browser experience; think of them as ready-to-use AI tools. I clicked on one called Describe Anything, created by NVIDIA. Right in the browser, without doing anything else, I can upload an image, type a description, and get descriptions for the regions of my image. This is my dog sitting on a chair in a park. I selected different parts of the image, the tree first and then my dog, and the demo did a really good job of telling me what is in each part. If I wanted, I could take this Space and deploy it for myself, whether locally or in the cloud.

Before we do that, let's look at another example. Back in Spaces, I can browse by category: image generation, 3D modeling, and all the other options up top. I went into Stable Diffusion, one of the most popular image generation models, and tested it right there in Spaces with the prompt "serene lake at sunset with mountains in the background and a golden retriever watching the sunset." I let it generate the images, and there we have it. I don't know if I like the first one, and the second one is okay, but it did what I wanted it to do. I love the third and fourth images; they really do what I asked. The best part, the part I want to show you, is that if I like a Space I can run it myself: run it locally, clone the repo, and start working with it right from there, just like how we deployed the Phi-4 model in Azure AI.

With that, let's look at the Docs. The docs section is your knowledge center; this is where you get deeper technical information. The docs are organized by category: client libraries, deployment, interface, and core ML libraries like Transformers (one of the most famous), Diffusers, Tokenizers, and a lot more, like Gradio.

The last thing I want to talk about is the Community section. This is where people ask questions, share ideas, and learn. The blog part of the community is amazing; you'll see a lot of people contributing posts and what's hot right now. The Learn section is one of my favorites: the LLM course and the Agents course are some of the best courses out there on AI and machine learning right now. The LLM course goes from transformers all the way up to fine-tuning, and the Agents course covers everything from an intro to agents to a lot more.

So that, my friends, was Hugging Face, and we've toured every major section of the platform. Whether you are just curious about AI, want to use existing models, are developing something with AI, or want to contribute, Hugging Face is definitely a platform to check out. Now go explore! If you liked this video and found it helpful, please hit that like and subscribe button for more tech and AI content, and drop a comment if you have questions or suggestions for which AI platform I should cover next. Thank you for watching, and see you next time.
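For readers who want to try a Hub model locally rather than through the Azure deploy flow shown in the video, here is a minimal, illustrative sketch using the Hugging Face transformers pipeline API. The gpt2 checkpoint is just a small stand-in that runs on modest hardware; you can swap in any text-generation model from the Hub (such as the Phi model from the demo) that your machine can handle.

```python
# Illustrative sketch: loading a Hub model locally with the transformers pipeline API.
# "gpt2" is a small stand-in checkpoint; swap in another Hub model if you have the hardware.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "A dog travels to the mountains and meets a robot helping a bird survive the cold."
result = generator(prompt, max_new_tokens=60, do_sample=True)
print(result[0]["generated_text"])
```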


Read More
What is Synthetic Data and how to use it effectively in your AI Projects
Priyanka Vergadia

What is Synthetic Data and how to use it effectively in your AI Projects

Researchers predict we'll exhaust all fresh text data on the internet in less than 30 years. This looming "data cliff" is why synthetic data is becoming the secret sauce of AI development—our escape hatch from running out of training material.

If you're working with AI systems or curious about how modern language models are trained, understanding synthetic data isn't just helpful—it's becoming essential. Let's dive into what it is, how it works, and why it might be the key to AI's future.

Read More
C-Suite’s Guide to Building Lasting ROI with AI Investments
Priyanka Vergadia

C-Suite’s Guide to Building Lasting ROI with AI Investments

How to build lasting ROI on AI investments: Every meeting I am in, the customer executives are asking: “What’s the ROI on my AI project?” The honest answer I have to share is: you won’t see it in a few days—or even a few months. That’s because AI, unlike a traditional technology rollout, is not a one-off project. It’s a habit. And like any habit, it takes time, commitment, and cultural change to form—before the real value emerges.

Traditional project approaches frame AI as a one-time initiative with defined start and end points, typically measured in weeks or months. In contrast, the habit approach recognizes AI as an ongoing process of integration into daily workflows that spans months to years. Research on habit formation indicates that individuals require an average of 66 days to form basic habits, with a range of 18-254 days depending on complexity. For organizations, this timeline extends considerably longer—typically 120 days for organizational changes and up to 365 days for full AI integration.

Read More
AI Created It, But Who Owns It?
Abhishek Sharma

AI Created It, But Who Owns It?

Navigating the complexities of AI copyright can feel like stepping into a legal minefield. As generative AI tools become essential for creators, artists, and businesses, the question of ownership is more critical than ever. Who holds the intellectual property rights to AI-generated content? Is it you, the AI developer, or does it belong to the public domain? Our latest post demystifies the current state of AI copyright, exploring key court cases, the debate around "authorship," and the crucial concept of "fair use." Get the clarity you need to create and innovate with confidence.

Read More
OWASP Top 10 for LLMs and GenAI Cheatsheet
Priyanka Vergadia

OWASP Top 10 for LLMs and GenAI Cheatsheet

The OWASP Top 10 for Large Language Models represents the most critical security risks facing AI applications in 2025. As LLMs become increasingly embedded in applications across industries, understanding and mitigating these risks is crucial for developers and security professionals. In this article let’s go over an AI application architecture covering each of the OWASP Top 10 for LLMs and understand the prevention methods for each.

Read More
What is Model Context Protocol (MCP)?
Priyanka Vergadia

What is Model Context Protocol (MCP)?

To understand Model Context Protocol (MCP), let's start with a familiar concept: APIs in web applications.

Before APIs became standardized, web developers faced a significant challenge. Each time they needed to connect their application to an external service—whether a payment processor, social media platform, or weather service—they had to write custom code for that specific integration. This created a fragmented ecosystem where:

  • Developers spent excessive time building and maintaining custom connectors

  • Each connection had its own implementation details and quirks

  • Adding new services required significant development effort

  • Maintaining compatibility as services evolved was labor-intensive

APIs (Application Programming Interfaces) solved this problem by establishing standardized ways for web applications to communicate with external services. With standardized APIs:

  • Developers could follow consistent patterns to integrate services

  • Documentation became more standardized and accessible

  • Updates to services were easier to accommodate

  • New integrations became significantly faster to implement

MCP addresses the exact same problem, but for AI applications.

Just as APIs standardized how web applications connect to backend services, MCP standardizes how AI applications connect to external tools and data sources. Without MCP, AI developers face the same fragmentation problem that web developers faced before standardized APIs—they must create custom connections for each external system their AI needs to access.

What is MCP?

Model Context Protocol (MCP) is an open protocol developed by Anthropic that enables seamless integration between AI applications/agents and various tools and data sources. Think of it as a universal translator that allows AI systems to communicate with different external tools without needing custom code for each connection.
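To make the "universal translator" idea tangible, here is a deliberately simplified sketch of what a standardized tool interface buys you. It is not the official MCP SDK, just an illustration of one shared contract replacing per-tool custom glue code; the tool names and return values are made up.

```python
# Illustrative sketch (not the official MCP SDK): one standard contract for every tool.
from typing import Callable

TOOLS: dict[str, dict] = {}

def register_tool(name: str, description: str):
    """Every tool is exposed the same way: a name, a description, and a callable."""
    def wrap(fn: Callable):
        TOOLS[name] = {"description": description, "fn": fn}
        return fn
    return wrap

@register_tool("get_weather", "Return current weather for a city")
def get_weather(city: str) -> str:
    return f"72F and sunny in {city}"   # a real server would call a weather API

@register_tool("search_tickets", "Look up open support tickets for a customer")
def search_tickets(customer_id: str) -> list[str]:
    return [f"TICKET-101 for {customer_id}"]

def call_tool(name: str, **kwargs):
    """The AI application discovers and invokes tools through one uniform interface."""
    return TOOLS[name]["fn"](**kwargs)

print([(n, t["description"]) for n, t in TOOLS.items()])
print(call_tool("get_weather", city="Seattle"))
```

The point of the sketch is the shape of the interface, not the implementation: with one discovery-and-invocation contract, adding a new data source no longer means writing a bespoke integration for every AI application that needs it.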

Read More
Latest Large Context Model (LCM) Benchmark Explained: L-CiteEval
Priyanka Vergadia

Latest Large Context Model (LCM) Benchmark Explained: L-CiteEval

Dec 27 

Written By Priyanka Vergadia

As language models continue to evolve, one of the most significant challenges has been handling long-form content effectively. In this article, we'll explore how modern Large Context Models (LCMs) are pushing the boundaries of context windows and what this means for developers working with AI applications.

The Evolution of Context Windows

The landscape of context windows in language models has evolved dramatically:

  • GPT-3.5 (2022): 4K tokens

  • Claude 2 (2023): 100K tokens

  • GPT-4 (2024): 128K tokens

  • Claude 3 (2024): 200K tokens

  • Gemini Ultra (2024): 1M tokens

  • Anthropic Claude (experimental): 1M tokens

This exponential growth in context window sizes represents a fundamental shift in how we can interact with AI systems. For perspective, 1M tokens is roughly equivalent to 750,000 words or about 3,000 pages of text.
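The back-of-the-envelope conversion behind that perspective uses the common rules of thumb of roughly 0.75 words per token and about 250 words per page; both are approximations, not exact figures.

```python
# Rough arithmetic behind the 1M-token comparison (rule-of-thumb ratios, not exact).
tokens = 1_000_000
words_per_token = 0.75      # common approximation for English text
words_per_page = 250        # typical manuscript page

words = tokens * words_per_token     # ~750,000 words
pages = words / words_per_page       # ~3,000 pages
print(f"{words:,.0f} words, about {pages:,.0f} pages")
```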

We’ll explore and understand LCMs with the help of a recent research paper, "L-CiteEval: Do Long-Context Models Truly Leverage Context for Responding?", which introduces a benchmark for testing how well long-context models actually use the context they are given.

Read More
ONE life change I wish I'd made sooner in my tech career
Priyanka Vergadia

ONE life change I wish I'd made sooner in my tech career

There's a lot I accomplished professionally this year – from an amazing job change to completing 4 more terms of my MBA, learning countless new things, and exploring more of the world. While I share my professional journey on LinkedIn, this blog post is different. It's raw, even a bit vulnerable. I'm sharing this story hoping it might inspire even one person. This year taught me a profound truth: health truly is wealth. Here's my journey. Read on..

Read More
RAG Cheatsheet
Priyanka Vergadia

RAG Cheatsheet

Ever wondered why you sometimes get misleading answers from generic LLMs? It's like trying to get directions from a confused stranger, right? This can happen for several reasons: the LLM was trained on data that is out of date, it can't reliably do the math or the calculations, or it simply hallucinates. That is where RAG (Retrieval-Augmented Generation) comes in.
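For a feel of the retrieve-then-generate pattern the cheatsheet covers, here is a toy sketch using TF-IDF retrieval from scikit-learn; the documents, the question, and the final prompt assembly are illustrative stand-ins for a real vector store and LLM call.

```python
# Toy RAG sketch: retrieve the most relevant snippet, then ground the prompt with it.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support hours are 9am to 5pm Pacific, Monday through Friday.",
    "Enterprise plans include a dedicated account manager.",
]

question = "How long do customers have to return a product?"

vectorizer = TfidfVectorizer().fit(documents + [question])
doc_vecs = vectorizer.transform(documents)
q_vec = vectorizer.transform([question])

best_doc = documents[cosine_similarity(q_vec, doc_vecs).argmax()]

# The retrieved context is stuffed into the prompt sent to the LLM (call not shown).
prompt = f"Answer using only this context:\n{best_doc}\n\nQuestion: {question}"
print(prompt)
```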

Read More
What is Agentic RAG? Simplest explanation
Priyanka Vergadia

What is Agentic RAG? Simplest explanation

Traditional RAG systems, while foundational, often operate like a basic librarian - they fetch relevant documents and generate responses based on them. Agentic RAG, on the other hand, operates more like a research team with specialized experts. Let's dive deep into when and why you'd choose one over the other.

Read More
Top 11 AI Coding Assistants in 2024
Priyanka Vergadia

Top 11 AI Coding Assistants in 2024

As a software developer in 2024, you've probably noticed that AI has fundamentally transformed the way we write code. Gone are the days of endlessly googling syntax or scrolling through Stack Overflow for basic implementations. AI coding assistants have emerged as indispensable tools in a developer's arsenal, promising to boost productivity and streamline the coding process.

But with so many options flooding the market, choosing the right AI coding assistant can feel overwhelming. Should you go with the popular GitHub Copilot, or explore newer alternatives? Is the free tier sufficient for your needs, or should you invest in a premium solution?

This blog is my attempt to explore the current landscape of AI coding assistants and help you make an informed decision based on your specific needs and circumstances. There are many more AI coding assistants out there; I am only covering a few of the better-known ones here.

Read More
How Cloudflare Stopped the Largest DDoS Attack in History in 2024
Priyanka Vergadia

How Cloudflare Stopped the Largest DDoS Attack in History in 2024

Two weeks ago, something huge happened in tech! Cloudflare, a cloud platform that offers DNS and DDoS protection services, automatically mitigated a 3.8 Tbps DDoS attack. To put that in perspective, imagine downloading 950 HD movies every single second. That's the kind of digital tsunami Cloudflare was up against. Let’s demystify what goes into mitigating an attack of this magnitude. Before we get there, let me start by sharing how DDoS attacks work.
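The movie comparison follows from a quick conversion, assuming roughly 0.5 GB for an HD movie (an approximation, not a precise figure):

```python
# Rough arithmetic behind the "950 HD movies per second" comparison.
attack_tbps = 3.8
bytes_per_second = attack_tbps * 1e12 / 8        # ≈ 475 GB/s
hd_movie_gb = 0.5                                # assumed size of one HD movie
movies_per_second = bytes_per_second / (hd_movie_gb * 1e9)
print(f"{movies_per_second:.0f} movies per second")   # ≈ 950
```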

Read More