Building AI Agents with Java and Semantic Kernel
Cloud Commute, August 23, 2024
Episode 26
00:30:18 (27.74 MB)


In this episode of Simplyblock's Cloud Commute Podcast, Bruno Borges, Principal Product Manager at Microsoft, discusses the millions of JVM instances running inside Microsoft and the role of the Microsoft Java team. He introduces Semantic Kernel, a Java library for integrating AI capabilities, backed by large language models, into business applications. Tune in to learn how Microsoft is shaping the future of development and AI integration.

In this episode of Cloud Commute, Chris and Bruno discuss:

  • Java’s evolution at Microsoft and OpenJDK initiatives
  • AI integration and the role of the Semantic Kernel in intelligent applications
  • The rise of remote development environments
  • The future of AI and LLMs in software development

Interested in learning more about the cloud infrastructure stack, like storage, security, and Kubernetes? Head to our website (www.simplyblock.io/cloud-commute-podcast) for more episodes, and follow us on LinkedIn (www.linkedin.com/company/simplyblock-io/). You can also check out the detailed show notes on YouTube (www.youtube.com/watch?v=GrS8LmPVolE).

You can find Bruno Borges on X @brunoborges and LinkedIn: /brunocborges.

About simplyblock:

Simplyblock is an intelligent database storage orchestrator for IO-intensive workloads in Kubernetes, including databases and analytics solutions. It uses smart NVMe caching to reduce read I/O latency and speed up queries. A single system connects local NVMe disks, GP3 volumes, and S3, making it easier to manage storage capacity and performance. With the benefits of thin provisioning, storage tiering, and volume pooling, your database workloads get better performance at lower cost without changes to existing AWS infrastructure.

👉 Get started with simplyblock: https://www.simplyblock.io/buy-now

🏪 simplyblock AWS Marketplace: https://aws.amazon.com/marketplace/seller-profile?id=seller-fzdtuccq3edzm


00:00:01
I was running three

00:00:02
different Java binaries

00:00:03
of Microsoft Build

00:00:04
of OpenJDK, one for

00:00:05
Windows, one for

00:00:06
Apple Silicon, and

00:00:07
one for Mac Intel.

00:00:08
You know, just to show

00:00:09
that we want every

00:00:10
developer, no matter

00:00:11
where they are, to be

00:00:13
able to take advantage

00:00:14
of all the technology

00:00:14
that we produce.

00:00:15
I actually compiled an

00:00:16
embedded Linux kernel

00:00:18
in WSL2, wrote it right

00:00:20
to our embedded system.

00:00:21
Embedded Linux built

00:00:22
inside of Docker,

00:00:23
inside of WSL2, inside

00:00:24
of Windows, which

00:00:25
was kind of freaky.

00:00:27
But also pretty cool.

00:00:30
We humans are like that.

00:00:31
We tend to search for

00:00:32
things when we have

00:00:33
a problem to solve.

00:00:35
We tend to learn when we

00:00:37
have the need to learn.

00:00:38
If we can learn

00:00:39
faster and AI and LLMs

00:00:41
allow us to do that,

00:00:43
then we definitely

00:00:43
should leverage that.

00:00:47
You're listening to

00:00:47
Simplyblock's Cloud

00:00:48
Commute Podcast,

00:00:49
your weekly 20

00:00:50
minute podcast about

00:00:51
cloud technologies,

00:00:52
Kubernetes, security,

00:00:54
sustainability,

00:00:54
and more.

00:00:57
Hello everyone.

00:00:58
Welcome back to this

00:00:59
week's episode of

00:01:00
Simplyblock's Cloud

00:01:00
Commute Podcast.

00:01:01
This day, this week,

00:01:04
you know, I have another

00:01:05
incredible guest.

00:01:06
I say that every

00:01:07
time, it's every,

00:01:07
it's true every time.

00:01:09
This time, somebody

00:01:10
I actually know

00:01:11
for a very, very,

00:01:12
very long time.

00:01:14
He's been with Oracle

00:01:16
for many, many years.

00:01:17
He's been with Microsoft

00:01:19
for also quite a

00:01:20
few years so far.

00:01:23
Bruno, welcome

00:01:24
to the show.

00:01:26
Hey Chris, thanks

00:01:27
for having me.

00:01:28
Yeah, and you're,

00:01:29
you're right.

00:01:30
I've been at Oracle

00:01:31
for about five

00:01:33
years and a half.

00:01:34
And then, I was

00:01:37
at Oracle for

00:01:37
about that time.

00:01:38
And then now I've

00:01:39
been with Microsoft

00:01:40
since 2018.

00:01:42
So you're almost,

00:01:43
well, six years now.

00:01:46
Yeah.

00:01:47
It's been-

00:01:48
I've been at Microsoft

00:01:49
more time than I

00:01:50
was at Oracle now,

00:01:51
just completing

00:01:52
my extra year.

00:01:54
So, I can say that I

00:01:57
did my dream of working

00:01:59
as a Java engineer for

00:02:00
the company that does

00:02:01
Java and now I'm at the

00:02:04
company that does the

00:02:06
alternative to Java, but

00:02:08
I'm on the Java team.

00:02:09
So, we can talk

00:02:11
about interesting

00:02:12
things there.

00:02:13
All right.

00:02:14
As long as Microsoft

00:02:15
is never going to

00:02:16
go back to J sharp.

00:02:17
I think we're all

00:02:18
good with that.

00:02:19
No, no, no, no, no, no.

00:02:22
When we announced the

00:02:24
Microsoft Build of

00:02:25
OpenJDK, we had lots

00:02:26
of jokes about that.

00:02:28
We thought about

00:02:28
doing a prank.

00:02:30
April Fool's joke.

00:02:32
We thought about

00:02:32
announcing Microsoft

00:02:33
Build of OpenJDK

00:02:34
on April 1st.

00:02:38
And we didn't do

00:02:39
anything like that,

00:02:41
but one thing that

00:02:42
did happen when we

00:02:44
announced Microsoft

00:02:45
Build of OpenJDK was

00:02:47
that an anonymous

00:02:49
group of Microsoft

00:02:51
employees, and some

00:02:52
of them ex employees,

00:02:54
we don't know who

00:02:55
these people are, but

00:02:58
honestly, I don't know

00:03:00
who these people are,

00:03:01
but, of course, they saw

00:03:02
the news that Microsoft

00:03:05
announced Microsoft

00:03:06
Build of OpenJDK, or

00:03:09
that were, we were

00:03:10
about to announce

00:03:11
that, they published

00:03:13
a blog post called

00:03:16
the Microsoft Coffee.

00:03:18
And it was a prank that

00:03:20
they did back in the

00:03:21
days, where Microsoft,

00:03:24
where these guys

00:03:25
faked software boxes.

00:03:27
You remember when

00:03:28
you used to go to

00:03:29
a library store, a

00:03:30
bookstore, and you can

00:03:31
actually buy software?

00:03:32
Damn.

00:03:33
It's in this

00:03:34
round, mirror-ish

00:03:36
device called a CD.

00:03:38
And we would actually

00:03:40
put it inside the computer

00:03:42
and install software.

00:03:44
They did this prank

00:03:45
back in the days where

00:03:47
they printed software

00:03:48
boxes with the logo

00:03:50
Microsoft Coffee.

00:03:52
And it was a prank

00:03:55
on Java at the time.

00:03:58
And they actually put

00:04:01
dozens of boxes across

00:04:03
different stores in the

00:04:04
Seattle area in the US.

00:04:07
And the local press

00:04:10
saw that and they

00:04:12
said, Microsoft is

00:04:13
announcing software.

00:04:16
And basically the

00:04:18
prank went too far.

00:04:20
Right.

00:04:20
Right.

00:04:21
So, sorry for, for,

00:04:23
for the audience.

00:04:25
Obviously Java

00:04:26
is an island.

00:04:27
It's a coffee.

00:04:28
It's a programming

00:04:29
language.

00:04:29
So it's a running gag.

00:04:31
Just in case

00:04:32
you don't know.

00:04:33
Now, you know.

00:04:33
Yeah, exactly.

00:04:36
So anyhow, if you go, if

00:04:39
you search for Microsoft

00:04:40
Coffee on Google

00:04:41
or Bing, whatever,

00:04:43
whatever you choose,

00:04:45
you're going to learn

00:04:45
about this thing.

00:04:46
It's, there's also

00:04:47
an account on Twitter

00:04:48
called Twitter.com or

00:04:49
X.com/MicrosoftCoffee.

00:04:52
And you're going to

00:04:53
see the blog post

00:04:54
that they published

00:04:55
at the same time that

00:04:56
we announced Microsoft

00:04:57
Build of OpenJDK,

00:04:59
telling the story,

00:05:00
the behind the scenes

00:05:01
story about that prank.

00:05:05
So, there's that.

00:05:06
And it's an interesting

00:05:07
fact that folks

00:05:08
would, would love

00:05:10
to learn about.

00:05:10
Yeah.

00:05:10
Yeah.

00:05:11
I'll try to find that.

00:05:12
If you, if you

00:05:13
have the links,

00:05:13
just send it to me.

00:05:14
I'll drop it in the

00:05:15
show notes because

00:05:16
that sounds amazing.

00:05:17
I did not know

00:05:18
about that.

00:05:19
There is a YouTube

00:05:21
video, showing the

00:05:23
news anchor talking

00:05:26
about the new product,

00:05:27
like it's a prank.

00:05:30
Anyways.

00:05:31
Yeah.

00:05:31
I can't-

00:05:32
To be honest, with a

00:05:34
company like Microsoft,

00:05:35
I can totally see

00:05:36
how this like, goes

00:05:37
way over the top.

00:05:39
Microsoft, they used

00:05:41
to do lots of pranks

00:05:42
back in the days, you

00:05:43
know, like Bill Gates

00:05:44
and Steve Ballmer

00:05:45
video recording, like

00:05:46
dancing, you know.

00:05:47
I mean, everyone knows

00:05:49
that Bill is famous

00:05:51
for his stupid jokes.

00:05:53
So, yeah.

00:05:53
Yeah.

00:05:54
You know, like, but now

00:05:56
we do like AI, AI, AI.

00:05:58
So you're still

00:05:59
doing stupid jokes.

00:06:00
Is that what

00:06:00
you're saying?

00:06:01
Yeah, basically.

00:06:02
Yeah.

00:06:02
Okay.

00:06:03
Okay.

00:06:03
Fair enough.

00:06:03
That's-

00:06:04
I mean, but we do, we do

00:06:07
jokes that are actually,

00:06:08
you know, funny.

00:06:10
All right.

00:06:13
Let's, let's get back to

00:06:14
the AI topic in a few.

00:06:17
Let us know a little

00:06:18
bit more about yourself.

00:06:19
To be honest, I was

00:06:21
kind of surprised.

00:06:22
I thought you've been

00:06:23
with Oracle much longer.

00:06:25
So that was,

00:06:26
that was quite-

00:06:26
Yeah, I joined, I joined

00:06:27
Oracle in 2012 and then

00:06:29
I left in January 2018.

00:06:32
That's when I basically

00:06:33
joined Microsoft.

00:06:34
And then, now

00:06:35
six years later.

00:06:37
So it was, it was an

00:06:39
interesting transition,

00:06:40
you know, like coming to

00:06:43
Microsoft to help them

00:06:44
talk to Java developers

00:06:47
about Azure, about

00:06:48
cloud, about Microsoft

00:06:50
developer tools like

00:06:51
Visual Studio Code.

00:06:52
And now GitHub after

00:06:54
the acquisition.

00:06:55
And then, one of

00:06:57
the interesting things

00:06:58
that happened in the

00:07:00
early days was that I

00:07:02
found out the amount

00:07:04
of Java that runs

00:07:05
inside Microsoft.

00:07:08
And it's not as big,

00:07:10
of course, as many

00:07:12
other companies like

00:07:12
Amazon or even Google,

00:07:14
perhaps, and of course,

00:07:17
Netflix and the likes

00:07:20
of that, the amount of

00:07:21
Java, like even Twitter

00:07:22
has a lot of Java.

00:07:24
But Microsoft runs,

00:07:26
our latest estimates,

00:07:28
it's somewhere between

00:07:29
2.5 to 3 million JVMs,

00:07:32
running in production

00:07:33
inside Microsoft for

00:07:35
internal systems,

00:07:36
and that

00:07:37
does not count Azure

00:07:40
customer workloads,

00:07:42
just the internal stuff.

00:07:43
And we like to say- yep?

00:07:45
And that is all

00:07:46
on, on Azure or

00:07:47
how does that work?

00:07:48
Or is it also like

00:07:49
internal stuff?

00:07:50
No, internal, internal,

00:07:51
internal stuff.

00:07:52
Some stuff is running.

00:07:54
Some Microsoft internal

00:07:55
stuff runs on Azure data

00:07:56
centers, some stuff run

00:07:58
on Microsoft proprietary

00:08:00
data centers, like

00:08:01
Azure is proprietary,

00:08:02
but not inside

00:08:03
Azure data centers.

00:08:05
LinkedIn, for example,

00:08:07
is entirely Java.

00:08:08
The backend is 98,

00:08:10
99 percent Java and

00:08:12
hundreds of thousands of

00:08:13
JVMs run for LinkedIn.

00:08:17
Then we have

00:08:18
Minecraft servers.

00:08:19
We have Bing, the

00:08:22
search engine.

00:08:23
We have Azure Control

00:08:26
Plane, which is another-

00:08:27
It's an internal

00:08:28
technology that helps

00:08:30
us orchestrate the

00:08:31
Azure data centers.

00:08:33
Mm-Hmm.

00:08:34
We do a lot

00:08:34
of messaging.

00:08:35
We like to say

00:08:36
two things.

00:08:37
One, if you click

00:08:38
on your Start Menu

00:08:39
on Windows, that's

00:08:40
powered by Java.

00:08:41
How does that, what

00:08:42
does that mean?

00:08:43
It means that when you

00:08:44
click on Start Menu on

00:08:45
Windows, you see some

00:08:48
of Bing recommendations,

00:08:49
Bing search events,

00:08:51
news, whatever, right?

00:08:52
All of that is

00:08:53
coming from Bing.

00:08:54
And Bing has a lot

00:08:55
of indexes technology

00:08:57
based in Java.

00:08:59
And because of that,

00:09:00
we make that joke

00:09:01
that, you know, the

00:09:02
Start Menu of Windows

00:09:03
is powered by Java.

00:09:05
The second thing

00:09:07
is Azure.

00:09:10
So when you go on

00:09:11
Azure portal and you

00:09:12
create a resource,

00:09:14
let's say you want to

00:09:14
create a database, like

00:09:15
a Postgres database.

00:09:19
The Azure portal

00:09:20
actually sends a message

00:09:21
to Control Plane and

00:09:23
then Control Plane

00:09:24
will send a message

00:09:25
to whatever data

00:09:26
center you asked that

00:09:28
resource to be created.

00:09:31
All the messaging

00:09:32
system is done by a

00:09:33
PubSub scheme that is

00:09:34
implemented in Java.
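The messaging hand-off Bruno describes here (the Azure portal publishes a provisioning request, and the target data center's subscriber acts on it) follows a classic publish/subscribe shape. Below is a minimal in-memory sketch of that pattern in plain Java; it is an illustration only, not Azure's actual Control Plane code, and the topic and message names are made up.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// Minimal in-memory publish/subscribe broker: publishers and subscribers
// only share a topic name, never a direct reference to each other.
public class MiniPubSub {
    private final Map<String, List<Consumer<String>>> subscribers = new HashMap<>();

    public void subscribe(String topic, Consumer<String> handler) {
        subscribers.computeIfAbsent(topic, t -> new ArrayList<>()).add(handler);
    }

    public void publish(String topic, String message) {
        for (Consumer<String> handler : subscribers.getOrDefault(topic, List.of())) {
            handler.accept(message);
        }
    }

    public static void main(String[] args) {
        MiniPubSub broker = new MiniPubSub();
        // A hypothetical "westus" data center listens for provisioning requests.
        broker.subscribe("provision/westus",
                msg -> System.out.println("westus creating: " + msg));
        // The portal side publishes a request to create a Postgres database.
        broker.publish("provision/westus", "postgres-db");
    }
}
```

In production this role is played by durable messaging systems (the episode mentions Kafka a moment later), but the topic-based decoupling is the same.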

00:09:36
Okay.

00:09:36
It's not JMS

00:09:37
or anything.

00:09:38
It's-

00:09:38
No, no, there's a lot

00:09:40
of, there's a lot of,

00:09:41
there's a lot of Kafka.

00:09:42
There's a lot of Hadoop.

00:09:43
There's a lot of PDO.

00:09:45
There's a lot of Spark.

00:09:47
And there's a lot of

00:09:49
Scala as well because

00:09:50
of the big data stuff.

00:09:53
There's a lot of

00:09:54
Android code, Java

00:09:57
for Android or Kotlin.

00:10:00
Yeah, I mean, it's

00:10:02
massive, you know.

00:10:03
At the time that we-

00:10:04
that Log4J

00:10:06
vulnerability happened,

00:10:09
I got a phone call

00:10:11
on Teams, like,

00:10:12
Hey, Bruno.

00:10:14
Do you know where

00:10:14
Log4J is used

00:10:15
across Microsoft?

00:10:16
I was like, I don't

00:10:17
know, probably

00:10:18
everywhere.

00:10:19
Right?

00:10:21
So we went digging

00:10:22
and we found hundreds

00:10:24
and hundreds of

00:10:25
projects inside

00:10:26
Microsoft using Log4J.

00:10:28
Yeah.

00:10:28
And we went, we went

00:10:29
ahead of course,

00:10:30
and cleaned up

00:10:30
everything, helped

00:10:32
customers on Azure

00:10:33
to mitigate as well.

00:10:36
But it was an

00:10:36
interesting thing

00:10:38
where people are

00:10:38
like, Oh, we did not

00:10:40
know we had Log4J.

00:10:41
We didn't even

00:10:41
know we had Java.

00:10:42
And yeah, we do.

00:10:43
And, and it was, it

00:10:44
was one of the biggest

00:10:45
problems, right?

00:10:46
A lot of people didn't

00:10:47
know they have it

00:10:48
as like a transitive

00:10:49
dependency or anything.

00:10:50
Oh, yeah.

00:10:51
For the audience, if you

00:10:52
don't know what Log4J

00:10:53
is, we, by the way, had

00:10:54
an episode with Brian

00:10:55
Vermeer, just like a

00:10:57
few months ago talking

00:10:58
about mostly Log4J.

00:11:02
I'll put it into

00:11:03
the show notes.

00:11:05
Yeah.

00:11:06
So I mean, Microsoft

00:11:08
changed quite a bit

00:11:09
over the last couple of,

00:11:11
well, let's say the last

00:11:12
decade, maybe a decade

00:11:13
and a half or something.

00:11:14
Yeah, the last, the last

00:11:15
15 years have been very

00:11:16
different for Microsoft.

00:11:18
I think so, right?

00:11:19
And Microsoft, well,

00:11:21
you're, you're the best,

00:11:23
best person to say they

00:11:25
changed quite a bit in

00:11:26
terms of Java because-

00:11:27
except for J sharp.

00:11:29
After that, Microsoft

00:11:30
didn't really touch

00:11:30
Java at all, but

00:11:32
they came back.

00:11:33
But same for for

00:11:34
Linux, right?

00:11:35
I think the Azure-

00:11:36
Microsoft creating Azure

00:11:39
and make, trying to make

00:11:40
it big, which I think

00:11:41
they did pretty well,

00:11:44
changed quite a bit.

00:11:47
Would you say that

00:11:49
you actually see that

00:11:51
like still happening?

00:11:53
Are there still

00:11:53
changes happening?

00:11:54
Like how Microsoft

00:11:56
works with open source

00:11:56
communities or stuff?

00:11:58
Yeah, changes are

00:11:59
happening all the time.

00:12:00
When we, when we

00:12:02
published Microsoft

00:12:02
Build of OpenJDK,

00:12:03
it was yet another

00:12:06
transformation, right?

00:12:07
You know, Microsoft

00:12:10
having its own JDK.

00:12:11
I mean, how, how

00:12:12
crazy is that?

00:12:14
And the second thing

00:12:15
is Java support on

00:12:16
Visual Studio Code.

00:12:19
It's amazing how

00:12:21
far it has come.

00:12:23
And thanks to the

00:12:24
partnership with

00:12:25
Red Hat, I must say.

00:12:27
Red Hat drives the

00:12:29
core Java language support

00:12:30
extension, and then

00:12:31
Microsoft provides all

00:12:32
the extensions around

00:12:34
it like Maven support,

00:12:35
JUnit support, Project

00:12:37
support, and so on.

00:12:41
And now with the

00:12:42
integration with GitHub

00:12:43
and GitHub Copilot,

00:12:44
GitHub Codespaces, and

00:12:46
all of that cool stuff,

00:12:48
it shows really like,

00:12:49
how the company and

00:12:52
its subsidiaries think

00:12:53
about the developer.

00:12:55
And when we think

00:12:56
about the developer,

00:12:57
it's not just the C

00:12:58
sharp developer or the

00:12:59
JavaScript developer,

00:12:59
we also think about

00:13:00
the Java developer.

00:13:02
And that in the

00:13:03
past would not

00:13:04
have been possible.

00:13:05
I think, I think the

00:13:06
Microsoft from the

00:13:07
past would probably say

00:13:08
something like we don't

00:13:09
care about any other

00:13:10
language developer.

00:13:11
We only care about

00:13:12
Windows developers.

00:13:13
And Visual Basic

00:13:15
developers and-

00:13:16
C Sharp, come on.

00:13:17
And C sharp and .NET

00:13:18
developers, right?

00:13:19
That's what

00:13:20
we care about.

00:13:20
Right.

00:13:21
But no, I mean, the

00:13:23
facts today show

00:13:24
that, we care

00:13:27
about any developer.

00:13:29
And you might think,

00:13:30
wow, that's obvious,

00:13:32
Bruno, because

00:13:32
Microsoft wants to grow

00:13:33
its cloud business.

00:13:34
Well, in part, yes.

00:13:36
But also because

00:13:38
we want truly to make

00:13:40
every service and every

00:13:41
tool, every project we

00:13:43
produce to be able to be

00:13:45
used by anyone, right?

00:13:47
It's not an

00:13:48
elitist tool set.

00:13:51
It's not an elitist

00:13:52
or, you know, niche

00:13:54
stack of technologies.

00:13:56
We really want anybody

00:13:58
to be able to use that.

00:13:59
So that they can

00:14:00
be as productive as

00:14:01
possible, as efficient

00:14:03
as possible, deliver

00:14:04
with quality, with

00:14:05
speed and everything.

00:14:06
And if that

00:14:08
includes Java

00:14:09
developers, so be it.

00:14:10
We're going to make

00:14:10
sure that everything

00:14:11
that we produce is

00:14:12
equally as good for

00:14:13
Java developers.

00:14:14
I think you're making

00:14:15
an important point here.

00:14:16
I mean, you mentioned

00:14:17
Visual Studio or Visual

00:14:19
Studio Code, sorry,

00:14:20
Visual Studio Code,

00:14:21
which, I think, I don't

00:14:23
know if it's like the

00:14:25
biggest or at least

00:14:26
the second biggest

00:14:27
IDE by now just like

00:14:29
from, from zero to 100.

00:14:32
But also from a

00:14:33
developer's perspective,

00:14:35
WSL or especially

00:14:36
WSL2 made like a

00:14:38
massive leap forward.

00:14:40
Oh, yeah.

00:14:41
I use Windows subsystem

00:14:44
for Linux every day.

00:14:45
It's my default.

00:14:46
When I opened up my

00:14:47
Windows terminal, the

00:14:47
default is Ubuntu.

00:14:49
Not PowerShell.

00:14:50
Yeah.

00:14:51
And that makes sense.

00:14:52
And it's amazing because

00:14:54
it is actually running

00:14:55
a Linux kernel on a,

00:14:56
what is it called?

00:14:56
A micro VM, whatever

00:14:58
you call that.

00:14:59
I actually compiled an

00:15:01
embedded Linux kernel

00:15:03
in WSL2, and build it,

00:15:05
and wrote it right to

00:15:08
our embedded system.

00:15:09
Because it is, well, it

00:15:11
was actually an embedded

00:15:12
Linux build inside of

00:15:14
Docker, inside of WSL2,

00:15:15
inside of Windows, which

00:15:17
was kind of freaky,

00:15:19
but also pretty cool.

00:15:23
It is.

00:15:24
I've done, I've

00:15:25
compiled the OpenJDK

00:15:26
a few times on the

00:15:28
Windows Subsystem for Linux.

00:15:29
And-

00:15:30
That makes sense.

00:15:31
It works just fine.

00:15:32
I produce a Linux

00:15:33
binary off the JDK in

00:15:34
my Windows computer.

00:15:36
But I do want to point

00:15:37
out that I am probably

00:15:40
the weirdest Microsoft

00:15:42
employee because my main

00:15:44
computer is a Windows

00:15:45
PC and then I have

00:15:49
a Mac, an Intel Mac.

00:15:52
Right.

00:15:53
And I have a third

00:15:54
computer, which is

00:15:55
another Mac, the M1 Mac.

00:15:58
It's not here in my

00:15:58
office, but I have

00:16:00
three computers in that.

00:16:02
I was, two years ago, I

00:16:04
was doing an experiment

00:16:05
about JVM ergonomics.

00:16:08
And I set up a lab on

00:16:11
my office where I had

00:16:14
a Java backend running

00:16:16
on my Windows computer,

00:16:18
a database running on

00:16:20
Mac and the workload

00:16:23
generator to stress

00:16:25
test on my Intel Mac.

00:16:27
So I was running

00:16:28
three different Java

00:16:30
binaries of Microsoft

00:16:31
Build of OpenJDK.

00:16:32
One for Windows, one for

00:16:33
Mac on Apple Silicon,

00:16:34
and one for Mac Intel.

00:16:36
So, you know, just to

00:16:38
show that we want every

00:16:41
developer, no matter

00:16:43
where they are, to be

00:16:44
able to take advantage

00:16:46
of all the technologies

00:16:47
that we produce.

00:16:48
All right, cool.

00:16:50
So let's, let's quickly

00:16:52
go back to, or quickly,

00:16:53
let's go back to the

00:16:54
AI part, because

00:16:55
I think there's one

00:16:56
very specific and

00:16:58
very interesting open

00:16:59
source project that

00:17:00
you were fairly deeply

00:17:02
involved in, which was

00:17:04
the Semantic Kernel

00:17:05
for Java, right?

00:17:08
So maybe, maybe

00:17:09
just give a quick

00:17:10
introduction for the

00:17:11
people that haven't

00:17:12
heard about that.

00:17:14
So Semantic Kernel

00:17:16
is a Java library to

00:17:17
help developers build

00:17:21
intelligent apps.

00:17:23
You imagine like,

00:17:24
okay, I have Spring,

00:17:26
I have Quarkus, I have

00:17:28
Express.js, I have

00:17:30
Django for Python.

00:17:33
These are all frameworks

00:17:34
that help developers

00:17:36
build business

00:17:37
applications in general.

00:17:42
Semantic Kernel is a

00:17:44
library, but also sort

00:17:46
of like a framework

00:17:47
that integrates with

00:17:49
your application to help

00:17:51
you build intelligent

00:17:53
features that will

00:17:54
use, of course, LLMs,

00:17:57
large language models,

00:17:58
and AI services

00:17:59
behind the scenes.

00:18:01
So that allows

00:18:02
the developer to

00:18:03
not only do prompt

00:18:05
engineering, but also

00:18:06
do AI orchestration.

00:18:08
So, you provide

00:18:10
plug ins, you provide

00:18:12
functions, you

00:18:13
provide capabilities

00:18:16
in the application.

00:18:17
Then you tell Semantic

00:18:18
Kernel, these are

00:18:19
my capabilities.

00:18:21
And then you can

00:18:22
allow a prompt to

00:18:26
say, I want to do

00:18:26
this, this, and this.

00:18:28
That's the prompt.

00:18:29
Imagine that's what

00:18:30
you tell ChatGPT.

00:18:32
But that prompt will

00:18:34
make the AI behind the

00:18:35
scenes leverage the

00:18:37
capabilities of your

00:18:38
own application to

00:18:39
trigger those things

00:18:41
as part of the flow.

00:18:43
So that allows you

00:18:45
to say, Hey, give me

00:18:48
the last financial

00:18:49
results and email

00:18:51
that to my boss.

00:18:53
Okay, you have a

00:18:54
function that extracts

00:18:56
the latest, or extracts

00:18:58
financial results from

00:18:59
the database, and you

00:19:00
have another function

00:19:01
that sends email.

00:19:02
But you also have

00:19:03
a function that

00:19:04
finds the email

00:19:05
address of employees.

00:19:09
But your prompt is just

00:19:10
something like, get

00:19:12
the latest results and

00:19:13
send to my manager.

00:19:14
Okay, the system

00:19:15
knows who the user is.

00:19:17
The system has a

00:19:18
function that extracts

00:19:19
the employee directory.

00:19:21
So the system will

00:19:22
know who is your boss.

00:19:24
And the system will

00:19:25
be able to find the

00:19:26
email of your boss.

00:19:27
And the system also

00:19:28
has the capability

00:19:28
to send an email.

00:19:30
So, Semantic Kernel

00:19:31
will chain all of that

00:19:34
and execute the request

00:19:37
as per the prompt.
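The flow just described (register capabilities, then let a request drive which ones get chained) can be sketched in plain Java. To be clear, this is not the real Semantic Kernel API: in the actual library an LLM plans the function chain from the prompt and the registered functions' descriptions, whereas here the plan is hard-coded so the sketch stays self-contained, and all function and variable names are hypothetical.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

// Sketch of the AI-orchestration idea: the app registers named "functions"
// (plugins/capabilities), and a planner chains them to satisfy a request.
public class OrchestrationSketch {
    private final Map<String, Function<String, String>> functions = new HashMap<>();

    public void register(String name, Function<String, String> fn) {
        functions.put(name, fn);
    }

    // Execute a plan: each step receives the previous step's output.
    public String runPlan(List<String> plan, String input) {
        String result = input;
        for (String step : plan) {
            result = functions.get(step).apply(result);
        }
        return result;
    }

    public static void main(String[] args) {
        OrchestrationSketch kernel = new OrchestrationSketch();
        // Hypothetical capabilities mirroring the episode's example.
        kernel.register("getFinancialResults", user -> "Q4 results for " + user);
        kernel.register("sendEmailToManager", body -> "sent to manager: " + body);

        // "Get the latest results and send to my manager" becomes a chain.
        String outcome = kernel.runPlan(
                List.of("getFinancialResults", "sendEmailToManager"), "alice");
        System.out.println(outcome); // prints "sent to manager: Q4 results for alice"
    }
}
```

The value Semantic Kernel adds over this sketch is precisely that the `List.of(...)` plan is generated by the model from the natural-language prompt rather than hard-coded.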

00:19:39
So Semantic Kernel

00:19:41
means, or that, that

00:19:43
means it basically

00:19:44
helps you with the like

00:19:45
agents, like the AI

00:19:48
agents that everyone

00:19:49
is talking about.

00:19:50
Okay.

00:19:50
Yes, exactly.

00:19:51
That makes sense.

00:19:52
And there are other

00:19:54
alternatives, of

00:19:54
course, LangChain4j

00:19:56
is the main one.

00:19:58
The Spring folks

00:19:59
have something called

00:20:00
Spring AI, also

00:20:01
another library that

00:20:02
does similar things.

00:20:05
I believe Spring AI

00:20:06
also has integration

00:20:08
with LangChain4j.

00:20:09
And we are also

00:20:11
considering Spring

00:20:12
integrations for

00:20:13
Semantic Kernel, Quarkus

00:20:14
integrations, and

00:20:15
Micronaut, and all the

00:20:17
other Java frameworks,

00:20:18
application frameworks.

00:20:20
But, in essence, that's

00:20:21
what Semantic Kernel is.

00:20:22
It allows developers to

00:20:24
more easily integrate AI

00:20:27
capabilities into their

00:20:28
business applications.

00:20:29
And I guess it intro

00:20:31
or in the backend, it

00:20:33
uses all the typical

00:20:34
LLM services like

00:20:36
Microsoft AI or-

00:20:39
No, you choose, right.

00:20:41
The library is

00:20:42
just a library.

00:20:43
So you have to

00:20:43
tell, okay, this

00:20:44
is my AI service

00:20:45
that I want to use.

00:20:47
So, right now we have

00:20:48
support for Azure

00:20:49
OpenAI, of course.

00:20:51
We have also support

00:20:52
for OpenAI, which is

00:20:53
a different company.

00:20:54
Yep.

00:20:55
We have support

00:20:57
for Hugging Face.

00:20:59
And I believe, I'm not

00:21:01
sure if we have that

00:21:02
support yet, but we do

00:21:04
have a ticket on GitHub

00:21:07
about Gemini, about

00:21:08
Google's AI service.

00:21:11
I mean, we are open to

00:21:14
include any AI service

00:21:15
into the framework.

00:21:18
And if Google folks

00:21:20
are asking for that,

00:21:22
we can evaluate and

00:21:23
maybe collaborate with

00:21:24
them to make sure.

00:21:26
The interesting thing

00:21:26
about Semantic Kernel

00:21:29
versus LangChain4j is

00:21:31
some customers like the

00:21:33
idea of having a company

00:21:35
behind a library, right?

00:21:38
LangChain4j is great.

00:21:40
There's lots

00:21:40
of engineers.

00:21:41
I know Liz is one of

00:21:43
the engineers behind it.

00:21:44
She's amazing.

00:21:44
And she's doing

00:21:45
an amazing job

00:21:46
on LangChain4j.

00:21:47
But it's a community

00:21:49
project and that

00:21:51
by itself can be

00:21:52
a blessing, but it

00:21:54
can also be a curse.

00:21:55
And sometimes some

00:21:57
companies may think,

00:21:58
you know, I don't

00:21:59
want to bet on a

00:22:00
community library.

00:22:02
So, I really prefer

00:22:04
to use a library

00:22:04
that is maintained

00:22:05
by a major company.

00:22:07
And that's one of the-

00:22:08
That's one of the

00:22:10
value propositions

00:22:10
for Semantic Kernel.

00:22:11
Yeah, that makes sense.

00:22:12
Especially when you look

00:22:13
into bigger companies,

00:22:14
lawyers tend to get

00:22:17
a little bit wary when

00:22:19
you just want

00:22:21
to use some open source

00:22:22
project, because they know

00:22:23
they can't hold anyone

00:22:28
in check if

00:22:29
stuff happens, right.

00:22:33
So you said all the

00:22:36
services, could I

00:22:37
also use Llama?

00:22:39
Building my own

00:22:40
local thing.

00:22:42
If not, there

00:22:42
would be a ticket

00:22:43
for that, I guess.

00:22:46
Hey, it's, you know,

00:22:47
it's a GitHub project.

00:22:49
Go send a pull request.

00:22:51
I can just imagine that

00:22:52
a lot of people would

00:22:53
love to play with that.

00:22:55
But also love to use

00:22:56
some local service

00:22:57
to try it out.

00:22:59
I think, I think

00:23:00
we have that.

00:23:01
We do have that in mind.

00:23:03
I think, I wonder if I

00:23:05
have, if we have that

00:23:07
as a work in progress

00:23:09
or if it's still

00:23:09
just in the back, I

00:23:10
have to double check.

00:23:12
It's just that Semantic

00:23:13
Kernel is not the

00:23:14
only project that

00:23:15
I look at and help

00:23:18
coordinate at Microsoft.

00:23:19
We have a Build of

00:23:19
OpenJDK, we have a

00:23:20
migration tooling,

00:23:21
we have some of our

00:23:23
marketing initiatives

00:23:24
and campaigns, and

00:23:26
Azure features for Java

00:23:27
developers, VS Code,

00:23:29
Java features, there's

00:23:30
a bunch of other stuff

00:23:31
that I participate

00:23:33
and contribute to.

00:23:34
So, that's why I'm,

00:23:35
you will hear me a lot.

00:23:36
Yeah, I'm not sure.

00:23:38
We'll double check.

00:23:39
To be honest, I think

00:23:40
that it's a great

00:23:41
trait to be able to

00:23:43
say that because you

00:23:44
can't know everything.

00:23:45
It's always important

00:23:46
that you know where

00:23:47
to look it up, right?

00:23:48
So we crossed the

00:23:49
20 minutes mark like

00:23:50
two minutes ago.

00:23:52
So last question, like

00:23:54
when you look into the

00:23:55
future, what do you

00:23:56
think is like the next

00:23:57
big thing for Microsoft,

00:23:59
for database, for AI,

00:24:00
for Java, for whatever

00:24:02
you can think of?

00:24:04
I truly believe

00:24:05
in two things.

00:24:05
One, remote development

00:24:10
environments

00:24:11
are become the-

00:24:12
will become the norm.

00:24:15
Whether that remote

00:24:16
environment is in

00:24:17
the cloud or not,

00:24:18
that's irrelevant.

00:24:19
What is relevant is

00:24:20
that for the developer,

00:24:22
they have access to

00:24:23
a remote environment

00:24:25
that is more powerful

00:24:26
than the machine that

00:24:27
they currently have,

00:24:28
and will allow them

00:24:30
to do all sorts of

00:24:32
integration tasks,

00:24:33
because the complexity

00:24:35
of systems these

00:24:36
days has been growing

00:24:38
exponentially, you know.

00:24:39
And that requires

00:24:42
environments that

00:24:43
are more complex to

00:24:45
be able to develop

00:24:46
and most importantly,

00:24:48
test those systems.

00:24:50
So the need for proper

00:24:52
environments that mimic

00:24:55
production environments

00:24:56
will only grow.

00:24:58
And I know there is

00:25:01
Minikube and, you

00:25:03
know, K3s and all these

00:25:05
sort of, you know,

00:25:07
capabilities to have

00:25:09
a similar environment

00:25:10
in your laptop.

00:25:10
But the reality is on

00:25:11
your laptop, you'll

00:25:12
also have Chrome,

00:25:14
Slack, Discord,

00:25:16
Spotify, and all these

00:25:18
other things that are

00:25:18
eating CPU and memory.

00:25:20
So, you need a dedicated

00:25:22
developer environment

00:25:23
if you need a database,

00:25:24
an AI LLM service.

00:25:27
You need Kubernetes,

00:25:30
you need a backend,

00:25:31
you need Key Vault, all

00:25:34
these other services

00:25:35
that your application

00:25:36
relies on, you're

00:25:37
going to need that.

00:25:38
So emulating all of that

00:25:40
on your local computer

00:25:41
will be impossible

00:25:43
moving forward.

00:25:44
So I think remote

00:25:45
development environments

00:25:46
will become a big thing.

00:25:48
And the second thing

00:25:48
is developers will

00:25:52
have to adapt to using

00:25:55
AI as a tool, not

00:25:58
as a capability for

00:25:59
their applications.

00:26:00
I mean, developers will

00:26:02
have to build AI or

00:26:03
intelligent capabilities

00:26:05
into their systems.

00:26:06
That's certain

00:26:08
to happen.

00:26:10
But not for

00:26:11
every system.

00:26:12
Not every system needs

00:26:13
an intelligent feature.

00:26:15
Tell that to the world,

00:26:17
please say it again.

00:26:18
Not every system needs

00:26:21
an intelligent feature,

00:26:23
but every developer

00:26:25
will need to leverage AI

00:26:29
in some shape or form.

00:26:30
Okay.

00:26:30
In the past, I remember,

00:26:34
I remember my old days.

00:26:35
We went as software

00:26:36
engineers, we

00:26:37
learned how to

00:26:38
program using books.

00:26:40
Then the internet

00:26:41
came and we continued

00:26:43
to learn through

00:26:44
books, just that, you

00:26:46
know, online books.

00:26:47
And then very quickly,

00:26:49
we transitioned to

00:26:50
articles and tutorials

00:26:52
and user groups and

00:26:54
online communities.

00:26:55
And then we switched

00:26:57
to videos, a lot of

00:26:58
video on the internet,

00:27:00
where we are capable

00:27:02
of, you know, extracting

00:27:05
information faster

00:27:05
by someone just

00:27:06
telling, you know, just

00:27:07
getting to the point.

00:27:09
Now, we are digitizing

00:27:12
and

00:27:13
aggregating all that

00:27:14
information in a way

00:27:16
that you can just

00:27:16
ask the internet.

00:27:17
You know, Isaac Asimov,

00:27:21
famous writer, said

00:27:23
he predicted that the

00:27:25
internet would exist.

00:27:26
Not the internet, he

00:27:27
predicted that a system

00:27:28
would allow anyone to

00:27:31
just ask a question

00:27:32
and this computer

00:27:33
thing would just

00:27:34
tell you the answer.

00:27:37
Today, we thought

00:27:39
it was the internet,

00:27:41
but the internet got,

00:27:42
let's say abused.

00:27:48
The answer to that

00:27:49
is not the internet.

00:27:51
The answer to that

00:27:52
is Gen AI, but it's

00:27:55
thanks to the data

00:27:56
that the internet

00:27:58
produces, that

00:28:00
Gen AI is capable of

00:28:01
answering the question.

00:28:02
So, I believe LLMs

00:28:04
are the ultimate answer

00:28:06
to Isaac Asimov's quote.

00:28:10
So we developers have to

00:28:12
be able to, I believe,

00:28:14
use this as a tool so

00:28:16
we can develop faster,

00:28:17
better, and, you

00:28:22
know, find, maybe

00:28:24
find that joy again

00:28:25
in programming.

00:28:26
You know, I learned

00:28:27
how to program C++.

00:28:29
Now, did I

00:28:30
actually learn how

00:28:31
to program C++?

00:28:32
No.

00:28:33
Any C++ developer

00:28:34
will say no,

00:28:35
he didn't learn.

00:28:35
You learned how to ask

00:28:37
good questions on

00:28:37
ChatGPT and copy-paste.

00:28:40
And I was like, yeah.

00:28:42
Isn't that what

00:28:43
programming is at

00:28:44
the end of the day?

00:28:45
You know, the majority of

00:28:46
developers do that.

00:28:47
They go on Google and

00:28:48
they find an article.

00:28:49
Then they go and

00:28:50
read the article.

00:28:51
They understand

00:28:51
the article.

00:28:52
They copy-paste.

00:28:52
It works.

00:28:53
Done.

00:28:54
Next problem.

00:28:55
We humans are like that.

00:28:56
We tend to search for

00:28:58
things when we have

00:28:59
a problem to solve.

00:29:00
We tend to learn when we

00:29:04
have the need to learn.

00:29:06
So, if we can learn

00:29:08
faster, and AI and LLMs

00:29:11
allow us to do that,

00:29:13
then we definitely

00:29:14
should leverage that.

00:29:15
So those are my

00:29:15
two predictions.

00:29:16
Remote development

00:29:17
environments and Gen AI

00:29:18
or LLMs will be a daily

00:29:21
tool for developers.

00:29:22
Okay, cool.

00:29:25
I have my own thoughts

00:29:26
on that, but for

00:29:27
the sake of time-

00:29:30
my- #MyOpinion.

00:29:32
Exactly.

00:29:35
Exactly.

00:29:35
All right.

00:29:36
I like that.

00:29:37
I think that is

00:29:38
a perfect way

00:29:39
to end the show.

00:29:40
My opinion.

00:29:41
Stop that.

00:29:42
#MyOpinion.

00:29:44
All right.

00:29:45
Bruno, thank you

00:29:46
for being here.

00:29:47
It was awesome

00:29:48
to get the chance

00:29:49
to talk to you.

00:29:49
It's been a while.

00:29:50
And for the audience,

00:29:52
thank you for

00:29:53
being here as well.

00:29:54
And you know where

00:29:55
to find us next week.

00:29:56
So see you again.

00:30:00
The Cloud Commute

00:30:01
Podcast is sponsored by

00:30:02
Simplyblock, your own

00:30:03
elastic block storage

00:30:04
engine for the cloud.

00:30:05
Get higher IOPS and

00:30:06
low, predictable latency

00:30:08
while bringing down your

00:30:09
total cost of ownership.

00:30:10
www.simplyblock.io