In this episode of The Episodikal Podcast, we embark on a journey through the meteoric rise of AI, revisiting the moment it transitioned from a distant dream to a profound reality.
We delve into AI's impact on creative professions, highlighting its dual role as both an enhancer of and a challenge to human creativity. Further, we explore the imperative shift needed to leverage AI against the pressing threats of climate change, advocating for sustainable technology as a scientific priority. Reflecting on the double-edged nature of AI, this episode serves as a clarion call for collective action and unity in the face of a looming climate apocalypse. Join us in what is not just a conversation but a rally for change, urging all to heed the call and act towards forging a better future.
Global Crisis. This Already Affects Everyone | International Online Conference 24.07.2021
Global Crisis. The Responsibility | International Online Forum | EDITED VERSION
Global Crisis. The Responsibility | December 2, 2023, simultaneous interpreting into 100 languages
We love receiving your feedback ❤️ Drop us a line anywhere you happen to come across our posts 🙂
We are @episodikal on Instagram, Facebook, Twitter, Telegram, TikTok, and LinkedIn, or email us at ask@episodikal.com
Alexey: Okay guys, we are live again, two months more or less
00:00:06
since we recorded last time.
00:00:08
So it's not for nothing that we called the podcast
00:00:11
Episodikal, because it is episodical. We try to be
00:00:15
as consistent as possible, but life gets in the way.
00:00:18
Nevertheless, it's already the 27th episode that we are recording.
00:00:23
It's been quite some time, but several years ago, in July 2021,
00:00:29
during the International online conference "Global Crisis.
00:00:32
This Already Affects Everyone", we were discussing AI
00:00:36
technologies, and, most importantly, how they will
00:00:39
change the landscape for people who work in various industries.
00:00:43
Contrary to the popular belief that robots will replace
00:00:47
the menial tasks, we've been saying that, guys,
00:00:50
it's creative workers, and programmers
00:00:56
in the first place, who will start losing their jobs.
00:00:59
And this is what we've been seeing increasingly
00:01:03
these last years since the conference took place.
00:01:06
We've been exchanging on this topic with several guys who
00:01:10
participated in the conference.
00:01:12
And what we see is that the things
00:01:15
we've been saying are coming true.
00:01:15
I think maybe you, Taliy, have more information on this,
00:01:20
given that you have closer ties to Silicon Valley.
00:01:23
Taliy: Yes, friends.
00:01:23
So it's definitely a pleasure being here with you again.
00:01:27
27th episode.
00:01:28
So we are really, and thanks to Alexey and his involvement,
00:01:34
largely using AI to produce this podcast to cut out
00:01:39
all the unnecessary parts to make it sound smoother.
00:01:44
I know Alexey is using a lot of AI tools, and we were really
00:01:47
ahead of the world in predicting these things
00:01:52
in 2021 at "Global Crisis.
00:01:54
This Already Affects Everyone." We did not give exact
00:01:56
time estimates, because we knew it would take a
00:02:00
very short period of time.
00:02:02
What we said is that it would probably take about a
00:02:06
couple of years to completely remove all human involvement
00:02:12
in the coding process and in the production of videos.
00:02:16
And we spoke about big films.
00:02:19
So when, last year, there were the first protests by background
00:02:23
actors in Hollywood, who were saying that we cannot allow this
00:02:27
artificial intelligence to just create background actors instead
00:02:32
of real people, that this is unfair, that
00:02:34
people are going to be left without jobs.
00:02:36
We were looking at it like, wow, we were talking about it, and
00:02:41
nobody wanted to listen to us.
00:02:43
I remember we interviewed people who would brag, "We
00:02:46
coined the term artificial general intelligence.
00:02:51
We are the very pioneers of the whole field of artificial intelligence."
00:02:55
And they were saying, like, guys, what you were talking about at
00:02:58
this conference was too much.
00:03:00
It's never gonna happen this fast.
00:03:02
It sounds like doom and gloom.
00:03:04
It's, it's impossible.
00:03:06
And guys, only three years later, Sora from OpenAI is already
00:03:11
showing you video quality where there
00:03:16
are still minor things to be fixed, but it's very impressive,
00:03:20
and this is something that you could not expect.
00:03:23
And what we spoke about at "Global Crisis.
00:03:24
This Already Affects Everyone" is the exponential growth of AI.
00:03:29
Something that people, even involved in the
00:03:32
field, do not recognize.
00:03:33
That every month it's doubling:
00:03:37
its capacity, its size, its possibilities, and that's
00:03:42
why, if you take, a year ago, that video of Will Smith eating
00:03:46
pasta out of a bowl, it looked ridiculous, it
00:03:50
looked like a very unrealistic video, to put it lightly.
00:03:53
But today, when you see that video of a lady, I think
00:03:58
that was one of the videos they proposed for us to
00:04:01
check out on their website,
00:04:01
a lady walking through Tokyo.
00:04:03
And you can see reflections, you can see all the
00:04:05
lights, neon lights, you can see people around.
00:04:09
Everything looks very realistic, and you could be tricked, if
00:04:12
you don't pay attention to small, tiny details, but I
00:04:14
think it's gonna take very little time until
00:04:18
we're gonna see the first film completely generated by AI.
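A quick aside on the "doubling every month" claim made earlier: taken at face value, the arithmetic of monthly doubling is easy to check. This is a throwaway Python sketch of that arithmetic, not anything from the episode itself:

```python
def growth_factor(doubling_periods: int) -> int:
    """Total multiplier after the given number of doublings."""
    return 2 ** doubling_periods

# Doubling every month for a year is a ~4096x increase.
print(growth_factor(12))  # 4096
```

Whether AI capability actually doubles monthly is the speakers' claim; the sketch only shows what such a rate would compound to.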
00:04:23
And that's very interesting.
00:04:24
They might create some hype by presenting it as an AI film,
00:04:27
but they can also just trick us into believing it is a
00:04:31
real video, and we're not gonna be able to tell the difference.
00:04:34
And this is, friends, what is really important to recognize.
00:04:37
These technologies will put Hollywood out of
00:04:41
business very soon.
00:04:43
Again, we spoke about artists, that the music industry
00:04:47
will be replaced by AI, that shows will be generated by AI.
00:04:52
And right now, what is happening in London, you can see ABBA,
00:04:56
your favorite musicians from the 80s, performing.
00:04:59
It's all sold out.
00:05:01
Yes, it took a lot of money to create the show, but
00:05:03
now the show generates two million dollars a week, which
00:05:07
is breaking all the records, all the predictions, and
00:05:11
everybody's standing in line to get such a show for their artists.
00:05:15
So their artists can perform across the globe, in multiple
00:05:19
places, at the very same time, people know it's AI and people
00:05:23
still stand in line to give hundreds of dollars per ticket.
00:05:27
So this is a new field and we are proud to say we were ahead
00:05:31
of the rest of the world, and probably few people were
00:05:35
able to predict this, but nobody said it out loud like
00:05:38
Creative Society did in 2021.
00:05:41
Alexey: Also, I've heard a lot of things that
00:05:44
people are raving about AI creating personalized music.
00:05:48
You tell it what kind of music you like, and it
00:05:50
generates music that is really enjoyable to you,
00:05:56
based on your preferences.
00:05:57
And we've been talking about this as well, that
00:06:00
we will have music and also later films created
00:06:05
specifically for each of us.
00:06:07
And this is interesting, exciting and scary at the same
00:06:11
time, because although the AI can seem to be creating
00:06:17
something new, it doesn't.
00:06:19
Humans have to create it first; then, by combining
00:06:24
different elements of what humans created before,
00:06:28
we obtain new results, but nothing dramatically
00:06:32
new or anything like that.
00:06:34
Nevertheless, we still have this trend that people really
00:06:39
enjoy having AI everywhere in their daily lives.
00:06:45
And as you said, when we just started
00:06:48
producing these podcasts, there were not a lot of tools
00:06:52
available to do it, but right now the time to
00:06:56
produce an episode is reduced dramatically using AI tools.
00:07:00
And of course, we could do even less work, because even
00:07:06
Riverside, which we are using to record this,
00:07:09
has a more or less one-click export that reduces silences
00:07:14
and generates the video.
00:07:15
What we are using, for example, I can tell that we use AI
00:07:19
to reduce echoes, to remove noises from the recording.
00:07:23
So we don't have to be in acoustically treated
00:07:26
rooms all the time.
00:07:27
So we can record anywhere without spending time
00:07:31
to sound treat the room.
00:07:33
Then we are using AI to remove repeated words or filler words.
00:07:38
This is already a big change in how we work, because for the
00:07:42
first episodes I really had to cut everything by hand.
00:07:46
Now, AI can highlight for me things that basically are not
00:07:51
needed, as I said, the repeated words and the things that you
00:07:55
wouldn't want in your recording.
00:07:58
And the whole editing process is already like
00:08:00
editing a Word document.
00:08:02
Well, with minor differences that you still have to correct,
00:08:05
because sometimes it does not recognize the end of the
00:08:08
word or something like that.
00:08:09
So you need to drag the boundaries of the word
00:08:12
on the waveform and then make your edit.
00:08:15
But still, it cut our production time
00:08:18
by maybe a factor of three.
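The silence-reduction step in that workflow is the easiest one to sketch in code. This is not the actual tooling the hosts use (Riverside's export is closed); it is just a minimal NumPy illustration of the underlying idea: find long runs of low-amplitude samples and drop them, keeping short pauses so speech still sounds natural.

```python
import numpy as np

def trim_silences(samples: np.ndarray, rate: int,
                  threshold: float = 0.01, min_gap_s: float = 0.5) -> np.ndarray:
    """Drop every stretch of near-silent audio longer than min_gap_s.

    samples: mono waveform scaled to [-1, 1]; rate: samples per second.
    """
    loud = np.abs(samples) >= threshold
    keep = np.ones(len(samples), dtype=bool)
    min_gap = int(min_gap_s * rate)
    start = None  # index where the current quiet run began
    for i, is_loud in enumerate(loud):
        if not is_loud and start is None:
            start = i
        elif is_loud and start is not None:
            if i - start >= min_gap:
                keep[start:i] = False  # quiet run long enough to cut
            start = None
    if start is not None and len(samples) - start >= min_gap:
        keep[start:] = False  # trailing silence
    return samples[keep]
```

Real editors also cross-fade at each cut point to avoid clicks; that detail is omitted here for brevity.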
00:08:20
And what I also wanted to say about coding is that
00:08:26
many people say that, for example, GitHub Copilot is
00:08:29
like a glorified autocomplete.
00:08:31
Still, I think that we have to remember that AI
00:08:37
is constantly learning.
00:08:38
It's constantly improving.
00:08:40
It's not like us humans, we work a little bit.
00:08:43
We need to have some rest.
00:08:45
We are bored or for example, we want to change what
00:08:49
we are doing, and we are not constantly improving.
00:08:52
AI is not sleeping, it is improving all the time.
00:08:54
And this means that the growth is exponential. Of course,
00:09:00
we are not yet seeing, for example,
00:09:05
AI that, just from you explaining what you want, creates a
00:09:11
complete piece of working software.
00:09:13
Not yet, but it has already been announced, if I'm not mistaken,
00:09:18
by OpenAI, that they will soon be launching a new model that
00:09:22
will basically take your voice input: you would tell it what you
00:09:27
want to see in your application, what it has to be able to do,
00:09:31
and the AI will create a working piece of software for you.
00:09:37
And this is crazy because even right now, I've been working
00:09:40
with some graphs recently.
00:09:42
These graphs were taken from a website, and not all the values
00:09:46
for the columns were given, just several of them.
00:09:50
But I wanted to make more detailed analysis, so I fed
00:09:55
the image to ChatGPT and I explained what I wanted it to
00:09:59
do: that I needed to measure the heights of the columns, given
00:10:05
that the highest column in this array is to the right and
00:10:09
its value is, let's say, 600.
00:10:11
And by visual analysis, ChatGPT could extract the
00:10:16
exact values of each column.
00:10:18
Then I told the chat which kind of function
00:10:23
I would like to use for extrapolation.
00:10:25
And it built me a graph continuing with the
00:10:29
data from the image that it was fed previously.
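That fit-then-extrapolate step is straightforward to reproduce. Below is a hedged NumPy sketch of the idea: the column values are invented for illustration (only the "600" for the tallest column comes from the episode), and an exponential is just one possible choice of extrapolation function.

```python
import numpy as np

# Hypothetical column values "extracted from the chart"; only the tallest
# value (600) is from the episode, the rest are invented for illustration.
x = np.arange(6)
y = np.array([20.0, 45.0, 80.0, 160.0, 310.0, 600.0])

# Fit y = a * exp(b * x) by fitting a straight line to log(y).
b, log_a = np.polyfit(x, np.log(y), 1)
a = np.exp(log_a)

def extrapolate(x_new):
    """Continue the fitted curve beyond the measured columns."""
    return a * np.exp(b * np.asarray(x_new, dtype=float))

print(extrapolate([6, 7]))  # the two columns beyond the data
```

A power law or logistic curve could be fitted just as easily; the point of the anecdote is that the model choice was the user's, and the chat did the fitting and plotting.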
00:10:33
And imagine that for us to do this kind of thing, we
00:10:37
would need to take a ruler and measure and note everything
00:10:41
down on a piece of paper, then compare the lengths
00:10:44
and divide, just to obtain a reasonably accurate number for
00:10:49
the height of each column.
00:10:51
ChatGPT is already seeing these things.
00:10:54
What was the most interesting, it shows you the Python code it
00:10:58
generates to analyze the image.
00:11:00
You can see which libraries it uses.
00:11:03
You can see the code and it explains every single
00:11:07
line what it was doing.
00:11:10
So, it determines the dominant color,
00:11:12
which is the background.
00:11:13
Then it determines the color that the columns
00:11:16
are filled with, and then it calculates the
00:11:19
heights of the columns.
00:11:20
It shows you how computer vision, well, not exactly
00:11:24
full-blown computer vision, proceeds with
00:11:27
analyzing the image.
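The pipeline just described (dominant color as background, next color as bar fill, pixel counting per column) can be sketched roughly as follows. This is a reconstruction of the idea, not the code ChatGPT actually generated; for simplicity it works on a 2-D array of color labels rather than real RGB pixels.

```python
import numpy as np

def bar_heights(img: np.ndarray) -> np.ndarray:
    """img: H x W array of color labels. Returns the bar height in pixels
    for each pixel column, following the steps described in the episode."""
    colors, counts = np.unique(img, return_counts=True)
    order = np.argsort(counts)[::-1]
    background = colors[order[0]]   # most common color = chart background
    bar_color = colors[order[1]]    # next most common = bar fill (assumed)
    # Height of each bar = count of bar-colored pixels in that column.
    return (img == bar_color).sum(axis=0)

def to_values(pixel_heights: np.ndarray, tallest_value: float) -> np.ndarray:
    """Scale pixel heights to data values, given the tallest bar's value
    (600 in the episode's example)."""
    return pixel_heights / pixel_heights.max() * tallest_value
```

A real version would quantize RGB pixels before counting colors and would mask off axes and labels; the two assumptions above (background is the dominant color, bars are the second) are exactly the ones the episode attributes to ChatGPT's generated code.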
00:11:28
And then you understand that, yes, we are already there:
00:11:32
when you know how to talk to this AI, and you have to be
00:11:37
really specific if you want to get good results,
00:11:41
it can be a great helper.
00:11:44
On the contrary, what really bothers me is that more and
00:11:49
more news about people getting fired by the thousands from IT
00:11:54
companies is coming in.
00:11:56
And this is what we've been talking about.
00:11:58
It's very concerning that, with all of this, Google is again
00:12:02
increasing prices for its services, yet it is firing
00:12:06
tens of thousands of people.
00:12:08
And I see that it's not because they are losing money.
00:12:13
They again want more profits.
00:12:16
So we come back to this idea that the king in
00:12:20
our society is profit.
00:12:21
No one cares about human lives, as we were promised
00:12:25
before the third industrial revolution, that
00:12:28
robots would do everything, that we would really work less,
00:12:32
fewer days, fewer hours.
00:12:34
This period of promises didn't really last long.
00:12:37
We've been talking about this also for some time and, yeah.
00:12:41
Have you heard anything recently about these things,
00:12:43
maybe from your friends also who work in the field?
00:12:46
Taliy: Well, you know, in the United States, in the media
00:12:49
field, the biggest scandal was around training AI, not just
00:12:53
how the technology develops, but how it's been trained.
00:12:55
With Google, for example, when customers tried creating
00:12:59
images using AI, they would see that historical images
00:13:03
would be incorrect, that people would not be able to
00:13:07
see certain skin colors in the historical pictures, which
00:13:11
really offended a lot of people, because these racial diversity
00:13:15
and inclusion policies created some sort of
00:13:20
backlash from people who don't understand how, for example,
00:13:24
creating AI images of the Second World War with racially diverse
00:13:29
soldiers of the German Reich
00:13:32
would be beneficial.
00:13:34
So it doesn't make sense.
00:13:35
And here's another thing:
00:13:36
how was it trained?
00:13:38
The question is how you train AI, and why certain
00:13:42
things are being used not in the way that we would expect.
00:13:45
On the other hand, this is exactly what we expected:
00:13:48
that these things will be used to generate more profit
00:13:52
for billionaires, for people who are already in power.
00:13:55
And as we predicted in 2021, at "Global Crisis.
00:13:59
This Already Affects Everyone," they're going to
00:14:01
play for a couple of years, allow producers to play with
00:14:04
their artists and so on.
00:14:06
And then they're just gonna take over the whole thing.
00:14:09
Those who own the technologies will replace everyone else. We can
00:14:13
already see it in the company in Silicon Valley that
00:14:17
I'm working in, and in other companies. We know that five
00:14:20
years ago, there were a bunch of different tools that were
00:14:23
specific to certain things.
00:14:25
And right now we're seeing that one company that is heavily
00:14:29
invested in OpenAI is taking over the whole market; everything
00:14:33
becomes a product of this one company, and they generate
00:14:37
exactly the same software, sometimes of even better
00:14:39
quality than any competitor's.
00:14:41
So how can you, how can you compete with that?
00:14:44
And with the prices they offer, they are simply forcing companies to
00:14:48
switch to their suites, which include everything from video
00:14:52
calls to libraries to all sorts of things. But it will
00:14:55
all be within this one company, stored on the servers
00:14:59
of another company, and eventually most of what
00:15:03
the companies in Silicon Valley are using right now is provided
00:15:06
by two different companies.
00:15:08
So this is the threat: that slowly but surely we're
00:15:11
switching to a unified, unipolar world where the dominance, the
00:15:18
financial power, of one megacorporation becomes enormous.
00:15:24
And the second threat we've been talking about at our
00:15:28
conferences is that all of these technologies including
00:15:32
digital currencies, including everything, are becoming
00:15:36
very vulnerable to external influence from space.
00:15:41
Because a digital society is very easy to switch off, very easy
00:15:47
to turn into complete chaos, and we've been talking about
00:15:50
films, like the one that Netflix put out a couple of months ago,
00:15:54
where a scenario was presented in which the whole United States
00:16:00
simply loses, overnight, all connectivity, all service,
00:16:03
and things become chaotic.
00:16:06
And folks, people here in the United States got alarmed when
00:16:09
the whole service of AT&T, one of the major cell phone
00:16:14
providers, was switched off.
00:16:17
It was simply not working.
00:16:18
A lot of theories.
00:16:19
The national security agencies were in a rush to announce
00:16:23
that it was not a cyberattack.
00:16:26
There were also some jokes; some satire articles came out
00:16:30
saying that AT&T customers were unaware of the network outage since
00:16:34
they're used to not having cell signal on their phones anyway.
00:16:37
But jokes aside guys, the true reason was not announced.
00:16:42
Not everyone said, and not everyone even knows, that the
00:16:47
third X-class solar flare, and X-class is the most powerful
00:16:52
class of solar flare, was released within 24 hours that day, and three
00:16:58
such solar flares is abnormal.
00:17:01
The amount of solar radiation that came to
00:17:04
Earth was record breaking.
00:17:07
The problem is that, so far, we have nothing, we
00:17:12
have no means to protect ourselves from these things.
00:17:15
But the most alarming thing, I would say, is that
00:17:18
this was a blank shot.
00:17:19
These solar flares did not release the matter,
00:17:23
the hard solar matter, which could destroy our planet,
00:17:28
our magnetic field, and so on.
00:17:30
So something is abnormal even there; we're seeing that
00:17:32
we got lucky three times already in one day in 2024.
00:17:37
And in addition to that, everybody was predicting
00:17:40
that these solar flares would reach the peak
00:17:43
of their cycle in 2025.
00:17:46
But what we're seeing in reality is that the peak of this cycle
00:17:49
seems to be happening in 2024.
00:17:51
Scientists who are making their prediction models
00:17:55
are really puzzled.
00:17:55
Something doesn't fit in here.
00:17:57
Something caused the Sun to activate much earlier.
00:18:02
And this is where, again, you have to know
00:18:04
the "Global Crisis.
00:18:05
The Responsibility" forum, what information was
00:18:08
presented over there.
00:18:09
The Sun, just like any other body of the solar system, was charging
00:18:15
with this external cosmic energy for the past decades, and now it
00:18:19
is starting to release this energy.
00:18:22
It took time to get this charge.
00:18:24
But now we're seeing the consequences.
00:18:26
We're seeing the consequences of the overheating of the core
00:18:30
of our own planet, where atmospheric rivers have become
00:18:34
an every-week event in California.
00:18:37
California has been bombarded.
00:18:38
California had more rainfall in one month than
00:18:43
in the whole previous year.
00:18:44
And this is something that is not being discussed
00:18:47
widely enough, again.
00:18:48
So people are being distracted by the presidential election,
00:18:52
people are being distracted by scandals with AI tools,
00:18:56
which, again, are politicized.
00:18:58
It's all been politicized.
00:18:59
But people are not being told the truth about what is
00:19:03
happening with our nature, what are the actual devastating
00:19:06
events that will happen.
00:19:08
And only a few are speaking openly about the hundred percent
00:19:11
possibility, a hundred percent possibility, that we will be
00:19:15
hit by a solar storm that will switch off most of our
00:19:20
electronic devices, and our means of communication will be down.
00:19:25
What our life will be like then, and what the consequences will be,
00:19:28
so far, very few can say.
00:19:30
But even as a matter of national security, when
00:19:33
we're constantly being put into this negative state of
00:19:37
mind, that we have enemies all over, that we are on the
00:19:40
brink of nuclear war,
00:19:42
what are the chances of something really terrible
00:19:45
happening in these kinds of circumstances?
00:19:47
I think we need more public control.
00:19:50
We need more demand from the public for
00:19:52
transparency of information.
00:19:54
And this is something we've been advocating for years, and yet
00:19:58
I don't see this critical point where people are gonna
00:20:00
wake up, unfortunately, to all of these facts.
00:20:04
Alexey: Another thing that we also talked about is that
00:20:07
all this computing power may be used to predict
00:20:12
things like solar flares or climatic events, because we
00:20:16
already have enough data.
00:20:17
There's plenty of applications like Earthquake Pro, Space
00:20:21
Weather, and we've been monitoring these things
00:20:24
for quite some time.
00:20:25
And when you see what is happening all around the planet,
00:20:29
we are developing AI to generate beautiful images or virtual
00:20:35
girlfriends, which is apparently the new fashionable thing.
00:20:39
Yeah, it's nice new AI filters for your TikTok or Instagram.
00:20:44
It's really cute, but we have some things on our hands
00:20:48
that need urgent solving, and we haven't yet seen scientists
00:20:54
using these new, powerful tools that can help
00:20:59
in analyzing all sorts of data and making predictions.
00:21:02
We don't see them using these technologies to
00:21:07
explain to people what is happening with the planet.
00:21:09
Everyone is already seeing it.
00:21:11
I mean, there is no possibility to deny that
00:21:14
things are changing.
00:21:16
Maybe for some people it's somewhere on the other side
00:21:19
of the planet, but when your friend's backyard is now
00:21:23
at his neighbor's backyard because of the landslide
00:21:28
in California, in LA, you can't miss these things.
00:21:32
During previous episode, I told you about, my
00:21:34
friend's house being washed away in the Black Sea.
00:21:38
I don't want again to go into these things.
00:21:41
People can go and watch the forums, they're
00:21:43
available online.
00:21:44
When we were preparing this episode, we were talking about
00:21:47
how AI would change the world, how it's already changing the
00:21:51
arena, but we cannot miss these important things that we've been
00:21:56
talking about all along for the past couple of years on this podcast.
00:21:59
It doesn't matter what kind of beautiful images
00:22:01
or films or advertisements AI can create for you.
00:22:05
If you don't have electricity, it doesn't really matter
00:22:09
because the AI will not work without power.
00:22:11
And if your house is no longer there, AI will be
00:22:16
really at the bottom of your list of things that are
00:22:19
necessary for your survival.
00:22:20
It's great that we are kind of progressing somewhere
00:22:24
with the technologies.
00:22:25
But again, these technologies are used to earn more money
00:22:29
for the people who already have all the money of this world.
00:22:32
And remember, during "Global Crisis.
00:22:35
This Already Affects Everyone," we were saying that
00:22:37
the competition would be very fierce between all the
00:22:41
players in the AI field.
00:22:43
And you mentioned this company that's really
00:22:47
heavily involved in OpenAI.
00:22:49
We can say it's Microsoft;
00:22:50
it's not a big secret.
00:22:52
We kind of see that Google, who was the pioneer,
00:22:56
now has AI that is lagging behind.
00:22:59
And this is where we come back to the things that we
00:23:03
said almost three years ago, two and a half years ago:
00:23:07
that the people who own the company that wins the AI race
00:23:11
will really rule this world.
00:23:14
And this is what's happening, because imagine, for example,
00:23:17
you have your, I don't know, Office 365 subscription, and
00:23:21
it comes with email that you can like or not like,
00:23:25
personal preference, but the AI tools that will be available
00:23:30
only if you use these services from Microsoft will make you
00:23:35
decide whether you finally stay with Google, with its AI as crappy
00:23:41
as it is right now, or you switch to Microsoft, because
00:23:44
you need these tools to do your work and to be more
00:23:49
efficient than, for example, your peers in the field.
00:23:52
And we come again to this realization that yes, guys,
00:23:57
AI is king, and whoever will rule this space will basically
00:24:04
rule all aspects of our lives.
00:24:06
Well, we haven't seen Apple making their move yet.
00:24:11
We also talked about this with you on the
00:24:13
phone: I strongly doubt that they have nothing at all.
00:24:19
Apple, the biggest company on this planet and the
00:24:22
most advanced, but the most secretive as well, having
00:24:25
nothing besides the Siri that is already available?
00:24:29
Yeah, it's interesting to see where all this will go,
00:24:34
but, at the same time, we have to remember and keep in
00:24:37
mind that this will probably not last long if all that
00:24:43
we run after is money.
00:24:45
This, I would say, should be the main preoccupation: people
00:24:49
should be putting out a popular demand for scientists
00:24:53
to unite and solve the problems.
00:24:55
Each of these companies is trying to win this race,
00:25:00
but they could achieve much more if they worked together,
00:25:03
while at the same time solving the problems that really
00:25:07
matter to every one of us.
00:25:10
Because there is no planet B.
00:25:12
So if there is no planet Earth, then it doesn't matter
00:25:15
who was winning the AI race.
00:25:18
Taliy: Yeah, guys, and for those of you who think that
00:25:21
everything is going to be all right: you know, I came
00:25:24
across a video on TikTok of a lady criticizing Creative Society.
00:25:28
She was saying, you don't have to do anything.
00:25:30
Why would you listen to these guys from Creative Society
00:25:33
who say that you have to fight for your rights and freedoms?
00:25:36
You just have to sit on your couch, just, she says, like
00:25:39
I do, and technology will simply appear in your hands.
00:25:43
Like this cell phone over here, I didn't have to
00:25:45
do anything, she says.
00:25:47
The technology just appeared, and then you just use it.
00:25:50
She goes, so don't listen to Creative Society.
00:25:52
Just sit there, be a couch potato, and wait until
00:25:56
somebody gonna bring you amazing technologies and
00:25:58
you're just gonna sit on the couch and use it.
00:26:01
Be a couch potato, literally.
00:26:03
Hopefully only a small percentage of people are naive
00:26:05
like that, who think that somebody is gonna develop
00:26:09
these technologies to bring them light, like Prometheus
00:26:12
from the myths, who would give up his liver
00:26:15
and sacrifice himself to bring light to the people.
00:26:18
Unfortunately, the reality is very different.
00:26:20
If we go into the very origin of Google, how
00:26:23
did Google originate?
00:26:24
Of course, there is a romantic story that two great
00:26:28
students from Stanford, Larry and an emigrant from Russia,
00:26:32
Sergey, got together in a garage and decided
00:26:35
to change history.
00:26:36
So they created this search engine, which was
00:26:39
funded by DARPA grants.
00:26:42
They received money from the National Security Agency and the CIA
00:26:46
to develop their first project.
00:26:49
And then, one year into that, they bought the National
00:26:53
Security Agency satellites from the United States to create
00:26:57
Google Maps, which we use.
00:26:59
So, think about it twice.
00:27:00
Who is actually using these technologies, and what are they
00:27:03
used for in the first place?
00:27:05
Is it to make your human life better, or is it for
00:27:08
control and military purposes?
00:27:11
And, you know, I was watching an interview with Governor
00:27:13
Newsom, who is a very intelligent and very well-spoken
00:27:16
gentleman, who definitely knows a lot about technologies.
00:27:20
In 2013, way before he became governor in 2019,
00:27:24
he was at Google's space, speaking to an audience, the employees
00:27:28
of Google, and he was very advanced with technology.
00:27:31
He had written his book about technology and how government can
00:27:35
be changed, and he gave a lot of great examples,
00:27:38
as former mayor of San Francisco, of how it can be used.
00:27:42
He goes, well, you know, we can use these Google Maps,
00:27:45
so where there is a pothole in the pavement in San Francisco,
00:27:49
you can just come, take a picture of it, and send it
00:27:53
using technologies, using the open APIs that we use.
00:27:57
You can send it to our government officials, and
00:28:00
they're going to come and fix this pothole right there.
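The "open APIs" in that pothole story map onto a real standard: Open311 GeoReport v2, a civic-reporting API that San Francisco has supported. Below is a hedged sketch of building such a service request. The "POTHOLE" service_code is a placeholder (real codes are fetched from a city's /services.json), and a client would POST the payload, plus an API key, to an endpoint like <api_root>/requests.json.

```python
def build_pothole_report(lat: float, lon: float,
                         description: str, photo_url: str) -> dict:
    """Build an Open311 GeoReport v2 style service-request payload.

    Field names follow the public Open311 spec; "POTHOLE" is a
    placeholder service_code, since real codes vary per city.
    """
    return {
        "service_code": "POTHOLE",   # placeholder; fetch real codes per city
        "lat": lat,
        "long": lon,                 # the spec uses "long", not "lon"
        "description": description,
        "media_url": photo_url,      # the photo of the pothole
    }

report = build_pothole_report(37.7749, -122.4194,
                              "Pothole in the pavement",
                              "https://example.com/pothole.jpg")
print(report)
```

This only builds the payload; actually submitting it requires a city endpoint and API key, which are deliberately left out here.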
00:28:04
Well, great ideas.
00:28:05
And again, he's a very well-spoken gentleman.
00:28:08
The only thing is that, ten years after that speech,
00:28:11
the potholes are still there in San Francisco.
00:28:14
But in the debate he had with the governor of Florida a couple of
00:28:17
months ago, the governor of Florida presented a different map, which
00:28:21
was compiled using pictures:
00:28:23
a map of human feces all over San Francisco, and you can see
00:28:27
dark brown, light brown, all kinds of brown,
00:28:31
like 50 shades of brown, which shows that the
00:28:35
whole of San Francisco is a huge public restroom and nobody
00:28:39
cares to even clean it up.
00:28:41
And this is a tragedy that we have great people who speak
00:28:44
beautiful words, but the reality is going the opposite way.
00:28:49
And we are seeing our best cities degrading and
00:28:53
becoming a huge disaster.
00:28:55
And then, when I had read the book and listened to that
00:28:58
eloquent and well-spoken gentleman, I decided to
00:29:01
write an email to him to warn him about things that we've
00:29:05
been predicting, things that he might not be aware
00:29:08
of climate-wise, because I believe he cares about people.
00:29:12
I believe he has his vision, which maybe not everyone
00:29:15
understands, but I believe he, like everyone else,
00:29:18
wishes the best for himself.
00:29:20
And he would love to do something good for society in
00:29:23
general, so that's why he became governor in the first place.
00:29:27
But it turns out he switched off his email.
00:29:29
It's impossible to write him an email, because there was
00:29:32
a scandal more than 10 years ago, when journalists in San
00:29:36
Francisco, back when he was mayor of San Francisco,
00:29:39
used transparency-of-information laws to get their hands
00:29:43
on his email exchange with Sergey and Larry from Google.
00:29:48
And they posted those emails.
00:29:50
He didn't like that they'd been posted.
00:29:51
So he simply deleted his email box.
00:29:55
And that is the level of involvement of society
00:29:58
we are seeing right now.
00:29:59
And we're coming from Ronald Reagan, who was talking about
00:30:03
a government that would listen and execute what people want.
00:30:06
We came to a government that deletes its email boxes,
00:30:10
the only way to connect with them.
00:30:12
Now we're talking about connection with people,
00:30:14
but what kind of connection are we talking about if the
00:30:16
only way to connect was email, and now even the email
00:30:20
address is not available?
00:30:21
So if you guys know any ways to contact the governor of California,
00:30:25
please write in the comment section below because I would
00:30:28
love to inform government officials, with all due respect.
00:30:31
It doesn't matter what your political affiliation is.
00:30:34
And again, there is a lot of political hatred and then
00:30:37
people trying to say that this guy is bad and I'm so good.
00:30:41
Guys, please, we all know that there are no good people
00:30:44
getting that far in politics, because politics is set up in such
00:30:48
a way that if there is a nice person truly trying to make a
00:30:53
difference without being part of this system, the system
00:30:57
tries to destroy this person.
00:30:59
The system runs smear campaigns, labels that person
00:31:03
with all possible words, and uses scandals, including sex scandals,
00:31:07
fraud scandals, anything.
00:31:10
It doesn't matter if it's true or false. And the
00:31:12
scandals we are seeing now go even to the next level.
00:31:16
Right now, it has come to the level where lawsuits are being
00:31:20
used as a weapon against people who speak the truth.
00:31:24
And we can see it through history, from the Julian Assange
00:31:27
case: he was first accused over personal relations,
00:31:31
because he had some personal relations, and
00:31:34
he was accused falsely.
00:31:36
And then we're seeing the very same thing happen to
00:31:38
multiple people, multiple journalists or people who
00:31:41
had large media influence.
00:31:44
They were attacked financially.
00:31:46
All their money was taken away from them.
00:31:48
They had to file for bankruptcy and that is
00:31:51
the way to shut them down.
00:31:52
They think if we cannot shut them down by damaging
00:31:55
their reputation, we can shut them down by depriving
00:31:59
them of their wealth.
00:32:01
And if you look through all the
00:32:04
cases, this becomes a trend.
00:32:06
So, what is happening here?
00:32:08
We're seeing that there is a game being played by those
00:32:11
who control the media or try to control the media.
00:32:15
They try to introduce more and more different legal means
00:32:20
to shut down the freedom.
00:32:22
We know that if you have a different point of view on
00:32:24
climate, for example, any alternative version is now
00:32:27
openly called hate speech by the United Nations, and
00:32:32
parliaments and structures of global influence around the
00:32:36
world are introducing legal proposals that target you if you're
00:32:39
not supporting the traditional views of governments on, for
00:32:43
example, green energy and so on.
00:32:45
And just to give you one example, there is a legal
00:32:48
proposal in Canada right now that says if you speak
00:32:51
nicely of the oil industry, you can get jail time.
00:32:55
And this sounded ridiculous.
00:32:57
But this is unfortunately our reality.
00:33:00
We are deprived of our freedoms.
00:33:02
We are being stripped of the things that used to
00:33:06
be an unquestionable part of human freedom: the freedom
00:33:11
to speak up about what you feel.
00:33:12
And on the other side, they say this is a communist country,
00:33:16
this is a communist dictatorship.
00:33:18
Guys, this is simply dictatorship, because under communism,
00:33:20
at least, you were granted certain kinds of benefits.
00:33:24
Here, there are no benefits.
00:33:26
You're just being stripped of your freedoms, and the question
00:33:30
is when the full dictatorship arrives.
00:33:35
I think that is a very reasonable question.
00:33:38
And the only thing that we can say is what we've heard
00:33:42
in one of the episodes of programs with Igor Mikhailovich
00:33:46
Danilov, where he said: "The good thing is that this dictatorship
00:33:49
will fortunately not happen, because the climate will
00:33:54
not allow it to happen." And this is one thing that
00:33:56
many people don't understand.
00:33:59
With a devastating climate progression, this dictatorship
00:34:03
oppression, which rises in the form of deep state
00:34:07
or tech dictatorship, or AI dictatorship, this
00:34:12
will not be allowed to happen by the climate.
00:34:15
The climate will, unfortunately, do one of two things. Either it unifies humanity
00:34:20
against it as a common threat.
00:34:21
So human potential will be used to unify humanity
00:34:24
and build a better future, defeat the climate threat,
00:34:27
and build a really free and beautiful society, which
00:34:31
we call Creative Society.
00:34:33
Or we will be united by one grief, along with those
00:34:37
who are creating the dictatorship, and along with those who
00:34:40
are enslaved today.
00:34:43
More than 40 million people are literally
00:34:45
slaves today in the world.
00:34:46
This is bigger than ever in history.
00:34:48
And unfortunately, most of the media are silent about it.
00:34:51
Most of the population does not care.
00:34:53
Like that lady who says, just be a couch potato,
00:34:57
sit over there, and everything will be great.
00:35:00
There was a Super Bowl just a couple of weeks ago.
00:35:02
The Super Bowl, one of the biggest events, and one of
00:35:04
the commercials showed a huge field with a bunch of sofas,
00:35:10
and people dressed as potatoes sitting on those
00:35:14
sofas in front of their TVs.
00:35:16
And the advertisement was, our free TV subscription will be
00:35:21
so great that it will literally turn you into a couch potato.
00:35:25
It's irony, folks, but this is also our reality.
00:35:28
Please folks, dig a little bit deeper and go
00:35:32
back to rewatching the conference "Global Crisis.
00:35:36
This Already Affects Everyone" because unfortunately this
00:35:40
reality has two different scenarios of development and
00:35:45
they both have been described.
00:35:46
So it's time to think, time to stop being a couch potato.
00:35:50
Alexey: You know, everything is a tool, and AI is a tool.
00:35:53
A couple of interesting things in the usage of AI.
00:35:56
One is very creative.
00:35:58
I've seen this video where guys who run a company, I
00:36:01
don't exactly remember what they are doing, but basically
00:36:04
what they did is that they connected their IVR, their
00:36:08
phone support system, to several GPTs from ChatGPT.
00:36:14
One was transcribing all the conversations that
00:36:17
were held by their support agents with the clients.
00:36:20
It was putting these conversations into a Google
00:36:24
spreadsheet or Google Doc.
00:36:26
The second one was analyzing the text: for example, what kind
00:36:30
of language they were using.
00:36:33
Did they follow the script that the company had developed for
00:36:37
customer support? It would basically grade the performance of
00:36:41
the support agents, how polite they were, how helpful they were,
00:36:46
and give advice on how they could improve their support.
00:36:50
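As a rough sketch, the pipeline described here (transcribe each call, store it, then have a second model grade it against the company script) could look like the following. This is a minimal illustration, not the actual system from the video: the rubric wording, the JSON field names, and the CSV file standing in for the Google spreadsheet are all assumptions, and a real deployment would send `messages` to a transcription and chat-completion API.

```python
# Hypothetical sketch of the two-step support-call grading pipeline.
# Step 1 (not shown) transcribes the call; step 2 grades the transcript.
import csv
import json

# Assumed rubric: instructs the grading model to reply with strict JSON.
RUBRIC = (
    "You grade customer-support transcripts. Reply with JSON containing "
    "politeness (1-5), followed_script (true/false), and advice (string)."
)

def build_grading_messages(transcript: str, script: str) -> list[dict]:
    # Prompt for the grading model: the company script plus the transcript.
    return [
        {"role": "system", "content": RUBRIC},
        {"role": "user", "content": f"Script:\n{script}\n\nTranscript:\n{transcript}"},
    ]

def parse_grade(reply: str) -> dict:
    # The model is asked for JSON; validate the fields we rely on.
    grade = json.loads(reply)
    for key in ("politeness", "followed_script", "advice"):
        if key not in grade:
            raise ValueError(f"missing grade field: {key}")
    return grade

def append_grade(path: str, agent: str, grade: dict) -> None:
    # One row per graded call, standing in for the Google spreadsheet.
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [agent, grade["politeness"], grade["followed_script"], grade["advice"]]
        )
```

The design point is that the grading model never free-writes into the log; its reply is parsed and validated before anything is recorded.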
So this is very interesting.
00:36:52
And I would say that maybe Gavin Newsom could be using
00:36:56
this kind of technology, since he's getting a lot of emails,
00:37:00
or was getting, or could be getting if he reinstated his
00:37:05
mailbox. He could analyze, with the help of AI, what really
00:37:11
is bothering his citizens the most, and address those issues.
00:37:17
This could be a really interesting way of using
00:37:20
AI in helping people.
00:37:22
For example.
00:37:24
Another thing that I wanted to mention, there was this incident
00:37:27
recently with Air Canada.
00:37:29
They connected, again, an LLM, a large language model
00:37:33
like ChatGPT, that they trained specifically to chat with
00:37:37
clients on their website.
00:37:39
And this chatbot actually invented a new policy that
00:37:43
it "thought", in quotes, would be useful.
00:37:47
And here's how things went.
00:37:49
So a client who had a bereavement situation, a death
00:37:53
in his family, needed to get somewhere quickly.
00:37:57
Airlines quite often have a policy that if you
00:38:00
have this kind of situation, you can buy whatever ticket is
00:38:03
available, maybe at a very high price, but then you
00:38:06
will get a reimbursement for it, or something like that.
00:38:09
On the Air Canada website, he was talking with the chatbot.
00:38:12
The problem is that large language models
00:38:15
tend to sound like humans.
00:38:18
Especially when you are chatting with them, you don't hear
00:38:21
the robotic voice or anything like that, because they are
00:38:24
trained on what real people were saying and writing.
00:38:27
He was chatting and he was asking, look, if I buy this
00:38:31
ticket right now, will I get a refund and everything?
00:38:33
And the chatbot made it up: yes, we do have a policy that allows
00:38:37
you to book this expensive ticket and get a refund later.
00:38:40
So the guy buys the ticket, he goes about his business
00:38:43
and everything, and then he applies for a refund, and the Air
00:38:46
Canada guys tell him: but we have a corporate
00:38:50
policy, and it's nothing like what you are telling us.
00:38:53
He said, yeah, but I was talking with your support agent.
00:38:56
And, the guy is saying, well, it's not a support agent.
00:38:59
It's a chatbot.
00:39:00
They did not want to pay him.
00:39:01
So he took Air Canada to court over this, and they were
00:39:06
trying to push the idea that they were not liable for the
00:39:11
actions of the chatbot because it was acting on its own.
00:39:14
And the judge said, no, no guys.
00:39:16
This will not go well for you.
00:39:18
He ordered Air Canada to pay all the attorney fees
00:39:22
and reimburse this gentleman for this ticket as their
00:39:25
chatbot promised to him.
00:39:27
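One commonly suggested safeguard against this failure mode, and this is a generic sketch, not anything Air Canada is known to have used, is to let the bot state a policy only when the statement comes verbatim from an actual policy document, with a human fallback otherwise. The topic keys and policy wording below are invented for illustration:

```python
# Hypothetical sketch: ground the support bot in stored policy text instead
# of letting a language model improvise ("hallucinate") a refund rule.
POLICY_DOCS = {
    "bereavement": "Bereavement fares must be requested before travel; "
                   "refunds cannot be claimed after the flight.",
}

def answer_policy_question(topic: str) -> str:
    # Only repeat text that exists in the policy store; anything else
    # is escalated to a human rather than invented on the spot.
    doc = POLICY_DOCS.get(topic.lower())
    if doc is None:
        return "I can't confirm that policy. Please contact a human agent."
    return doc
```

The design choice is to make improvisation impossible by construction: the model can route the question, but the answer text itself always comes from the document store.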
Why I'm bringing this up is that we have this new
00:39:32
toy that is helping. It is helping, but who does it help?
00:39:37
Is it really helping normal people?
00:39:40
Well, maybe in some tasks: it helps kids do their homework,
00:39:44
but is it really helping?
00:39:46
Isn't the whole idea of doing the homework by
00:39:50
yourself to train your own brain, your consciousness?
00:39:55
But then we see that, by helping the corporations,
00:39:59
it eliminates jobs, because before, a real person
00:40:04
would be replying in the chat box on the website.
00:40:08
Now it's a chatbot.
00:40:10
Yeah, there are plenty of things that could be
00:40:12
said about the implementation of AI in our society.
00:40:16
But what is most important, as with any tool, I would
00:40:20
say, is the intention.
00:40:23
If your intention is to improve the lives of everyone on
00:40:27
this planet, then obviously this tool will help you.
00:40:31
The opposite is also true.
00:40:33
If your intention is to help yourself at the expense of
00:40:38
everyone else on this planet, this tool can help you too.
00:40:42
Because it's just a tool; a hammer is just a hammer.
00:40:45
We have to understand that intention is really everything.
00:40:49
And we've been talking about the intention of helping everyone
00:40:53
realize that things are changing for every one of us living on
00:40:57
this planet and that we need to urgently unite the scientific
00:41:00
potential, and also the AI potential. Because when we
00:41:04
think, you know, about how things look today, we
00:41:09
have all these companies pushing in different directions.
00:41:13
If they were all going in the same direction, we would,
00:41:16
as a society, have the best possible AI, with all the NVIDIA
00:41:22
chips running it locally, for example, if they wanted,
00:41:25
because I think this is what they are trying to do right now.
00:41:28
We would have all the best advancements made by Google,
00:41:32
by Microsoft, OpenAI, everyone else in the field.
00:41:36
Imagine we would have the best of everything, but for everyone,
00:41:40
for this, we need to unite.
00:41:42
And this brings us back to what we've been talking about
00:41:44
on this podcast for a long time: that unity wins.
00:41:47
We will not be able to progress far if we are pushing
00:41:53
into different directions.
00:41:54
Most probably we will even be staying in the same spot.
00:41:58
But I'm curious to see where all these things will go
00:42:02
and how this will affect our lives. But also, I really
00:42:07
hope that people start waking up and see what we are
00:42:13
being allowed to play with by OpenAI and other companies.
00:42:17
Imagine what is cooking in their labs.
00:42:20
If what they are giving us to play with already creates
00:42:25
super realistic videos, don't you think that they could be
00:42:29
creating much more realistic videos, indistinguishable from
00:42:35
reality, already in their labs?
00:42:37
We've been talking about this: we will arrive at a point
00:42:42
where we will not be able to trust anything that we see
00:42:48
or hear unless we can touch the person in front of us.
00:42:52
Things are already going sideways for many people who fell victim
00:42:57
to fraud and scams using videos and audio generated by AI.
00:43:02
I don't remember if I told you about this.
00:43:05
There was this Hong Kong corporation.
00:43:08
A lady, she was invited to a corporate call, a zoom call.
00:43:13
And there were about like 20 people or something like that.
00:43:16
There was her superior, the chief financial officer of the
00:43:19
company and everyone like this.
00:43:21
And she was instructed during a Zoom video call to wire 200
00:43:27
million Hong Kong dollars, it's about 20 million dollars,
00:43:31
to 10 different accounts.
00:43:33
It turned out that among these 20 people on the
00:43:37
call, there were only two persons who were real humans,
00:43:42
herself and her colleague.
00:43:44
All the rest were AI-generated personas. The fraudsters
00:43:49
grabbed the videos from the website of the company,
00:43:52
presentational videos.
00:43:54
They created AI avatars of everyone in the company.
00:43:58
And then they created this Zoom call, and basically the
00:44:02
AI avatar of her superior told her to wire the money
00:44:09
to different accounts.
00:44:11
So this is already happening, guys.
00:44:13
And it's not the only mishap like this that happened.
00:44:17
So we can see that this is a tool.
00:44:19
And it can be used for good and bad.
00:44:21
And unfortunately, from what we are seeing today,
00:44:25
I think we get more harm than good from these technologies,
00:44:28
for the moment at least, in how we are using them today.
00:44:32
Taliy: Exactly.
00:44:33
But, without these technologies, we will not be able to overcome
00:44:37
the challenges of upcoming climatic disasters as well.
00:44:41
We have to understand that.
00:44:43
And let's take even simple examples that people
00:44:45
do not think about enough.
00:44:47
Let's go back a century, when Los Angeles was bombarded
00:44:52
by large-scale floods.
00:44:55
Floods used to happen not as often, not every week
00:44:59
like it is happening today.
00:45:00
It was more like once every other decade,
00:45:04
and the whole of Los Angeles was flooded. Then in 1938 the
00:45:08
government started creating this concrete channel, channelizing
00:45:12
the LA River, and now it looks like what you've seen in The
00:45:16
Terminator; it's been used for Hollywood movies. You can drive
00:45:20
through it, and usually there is no water, so you can simply
00:45:22
drive through it in a truck or on a motorcycle, like Schwarzenegger
00:45:26
did in the Terminator movie. And it looked like a really cool
00:45:30
place for filming music videos and
00:45:34
films; an endless number of films were filmed there.
00:45:37
What is happening right now is that the river is full
00:45:40
to the very top.
00:45:41
You can see so many videos of it simply overflowing,
00:45:45
and this is insane, because the amount of precipitation,
00:45:50
14.3 inches, fell by the middle of February, while the usual
00:45:53
annual amount is 14.2
00:45:55
inches of water.
00:45:57
This is crazy.
00:45:58
In six weeks, more water fell than in a whole year.
00:46:02
And this is what we are talking about.
00:46:04
This is an example where technology saves human lives.
00:46:08
Because there is no such river in San Diego, for
00:46:12
example, and you're seeing the devastating consequences of that.
00:46:15
You're seeing people having to climb on the
00:46:18
roofs of their own houses.
00:46:20
Simply to survive. You're seeing people being taken
00:46:23
away, washed off by the water; cars are being pushed along.
00:46:26
Imagine the amount of power needed to push
00:46:28
the car like that.
00:46:29
This water has tremendous power and this water is going
00:46:34
to be one of our enemies.
00:46:36
We're going to have to battle it.
00:46:37
And it's crazy to see, when
00:46:40
you go into the Los Angeles Times and other newspapers,
00:46:44
people mad in the comment section that, hey, we're not
00:46:48
capturing this water and preserving it, and I'm just
00:46:51
mind-blown: how could you complain about that?
00:46:54
We're not capturing the water? You are still living in
00:46:56
the reality of 10 years ago, when we had small
00:47:00
amounts of precipitation and would think about how we could
00:47:02
capture that water and use it.
00:47:04
You cannot capture this amount of water.
00:47:06
This is simply going to destroy your cities.
00:47:09
This kind of water is also not clean water.
00:47:11
It's simply rainfall over the cities, infused
00:47:14
with bad chemicals that you would have to purify.
00:47:17
You cannot simply use that water.
00:47:19
You cannot store this water for the summertime.
00:47:21
This is really a huge amount of polluted water that
00:47:25
simply flows, and it's a good thing it goes into the ocean.
00:47:27
But people complain for the wrong cause; people
00:47:29
don't understand the reality.
00:47:32
They've been told certain things for years and years, and
00:47:35
they think it is still those times, so they complain
00:47:39
that we're not capturing water. That is
00:47:42
how much people are lacking understanding of
00:47:44
the reality of nowadays.
00:47:47
We need something different. That LA River channel took 30
00:47:52
years to create.
00:47:54
And now it saves millions of human lives there, and
00:47:56
millions and billions of dollars in property
00:47:59
are being preserved.
00:48:01
If it wasn't for that river, the whole city would be
00:48:03
flooded, all your cars would be washed away, all your houses,
00:48:06
everything would be destroyed.
00:48:08
Like in many places in San Diego, where the water
00:48:12
level was so high that it damaged everything that
00:48:15
was on the ground level: basements, cars,
00:48:18
bedrooms, floors; everything is damaged.
00:48:23
It's a hundred percent total loss for the houses
00:48:25
that have been flooded.
00:48:26
Of course you cannot live in those houses anymore.
00:48:28
There will be mold; they will not be suitable for living in at all.
00:48:32
And nobody seems to care about these things.
00:48:34
That is what is the most alarming.
00:48:36
Unless you are directly affected by these things,
00:48:39
if you live just a hundred miles away in Los Angeles,
00:48:42
you're gonna be complaining and bickering about the
00:48:45
quality of something else.
00:48:47
You're gonna be complaining about the completely wrong thing:
00:48:49
that we're not capturing water.
00:48:51
This is simply insane. And you know, people like our
00:48:54
friends in Los Angeles, who had houses on
00:48:58
the hills, had their backyards simply washed away.
00:49:01
They simply experienced a landslide where
00:49:04
the whole land just went into the neighbor's
00:49:08
backyard, and the neighbor's house was below their level.
00:49:11
Or like that house in Dana Point.
00:49:13
Come on, guys, let's take a look at the footage from Dana Point.
00:49:17
On Google Maps you can still find that house. It has a huge
00:49:20
backyard in front of it, so it wasn't built
00:49:24
on the cliff; it was built quite far from the cliff.
00:49:27
And right now you can see that the house miraculously survived
00:49:31
the landslide. You can see a concrete wall, you can see
00:49:35
its foundation basically sticking out of the cliff, and
00:49:38
it's on the edge of collapse. That house used to be
00:49:41
worth 13 million dollars.
00:49:42
Now, it's worth nothing.
00:49:44
You're not going to sell it.
00:49:44
Nobody wants to live in a house that's going to collapse. We're
00:49:48
talking about the millionaire-or-billionaire beach in
00:49:51
Malibu: the very same thing.
00:49:52
The whole thing is going to be gone. We're
00:49:54
talking about places up north: the very same thing.
00:49:58
The fanciest places in Santa Barbara are being flooded,
00:50:02
and people do not seem to want to recognize the challenges
00:50:07
of nowadays and how critical it is to build new
00:50:10
infrastructure: the very same vertical farms, the very
00:50:14
same generators of energy.
00:50:16
And when we're talking about energy, we still live in a
00:50:19
decade where people complain about something that is
00:50:23
absolutely irrelevant right now.
00:50:25
At the very same time that people, still unaware,
00:50:28
are complaining about solar panels and wind turbines,
00:50:31
around the world a whole new infrastructure
00:50:36
is being built silently.
00:50:38
It's happening quietly.
00:50:39
You have to specifically search for fuel-free
00:50:43
energy generators to find this infrastructure
00:50:46
to find these devices.
00:50:48
But this is what is happening.
00:50:50
It appears silently without the big coverage in the media.
00:50:54
But it is happening, just like OpenAI was developing
00:50:58
their AI technologies for, what, almost a decade
00:51:03
without a presence in the media.
00:51:05
This is what is happening in Kurdistan right now, the
00:51:08
small region in the north of Iraq, just to give you one
00:51:11
example: a 400-megawatt station is being created over
00:51:16
there by a German corporation, one that does not use fossil
00:51:19
fuels but a different kind of principle, new
00:51:23
principles of generating energy.
00:51:26
And this is what is happening simultaneously
00:51:29
in the background.
00:51:30
But guys, shouldn't this be something that
00:51:32
has to be discussed?
00:51:33
Because we have fuel-free energy devices
00:51:36
already available.
00:51:37
A bunch of them were presented in Switzerland last year, and it
00:51:40
was covered by Creative Society volunteers who went there,
00:51:44
did the full coverage, and did interviews with people.
00:51:46
There are devices in Florida whose inventors
00:51:50
have also been interviewed.
00:51:51
Those are real devices.
00:51:53
Why is it not being spoken about?
00:51:55
Guys, why not,
00:51:56
if this is the most crucial topic?
00:51:58
Using these technologies, we could be securing our food
00:52:02
production, using vertical farming, freeing huge amounts
00:52:05
of land right now, and so on.
00:52:08
This is not being done.
00:52:09
We could build, not desalination plants (again, Governor
00:52:12
Newsom said we're building 20 desalination plants right now),
00:52:17
but something for the real future.
00:52:19
Guys, our future is atmospheric water generators.
00:52:22
We should be demanding those.
00:52:24
There is no public demand, and therefore there is no response
00:52:27
from those who govern, and the problem is that we are not
00:52:31
articulating our demands.
00:52:32
Therefore, there is no response.
00:52:34
And again, just instead of being a couch potato,
00:52:38
you have to wake up.
00:52:39
You have to wake up, look around, and stop
00:52:42
waiting for something to miraculously change.
00:52:45
Each voice today can be amplified.
00:52:48
Each voice can be heard.
00:52:49
Just like over six months ago, when Lahaina was on fire
00:52:53
and we saw billionaires and multimillionaires
00:52:57
begging peasants for money.
00:52:59
That created a huge backlash.
00:53:01
One lady created this TikTok which went viral and has
00:53:04
been seen by millions of people, and you know what?
00:53:07
That multimillionaire, with over 500 million dollars in
00:53:12
wealth, came to Las Vegas and was booed by the public.
00:53:16
The public was booing him because people have a good memory.
00:53:19
Fortunately, people do remember who these so-called heroes
00:53:24
glorified by Hollywood are, and how little humanity is left in
00:53:28
the people we tried to reach.
00:53:31
Again, we've been inviting them to speak up, to make
00:53:34
a real difference, not to beg people who are barely
00:53:37
making ends meet for money.
00:53:39
No, Creative Society is not interested in
00:53:43
collecting donations or in any sort of financial gain.
00:53:47
And that is what's different about the thing we're
00:53:50
talking about over here.
00:53:50
We genuinely care about people, because we are people.
00:53:55
We are people just like you, and we want to live, we want
00:53:58
to survive in this climate apocalypse that is coming
00:54:03
up, that has already started.
00:54:05
The main climatologist of NASA, Gavin Schmidt, openly says we
00:54:09
did not take it into account.
00:54:11
We're seeing the models that include El Nino, unprecedented
00:54:15
warming, and still there is something that has not been
00:54:18
taken into account, and this something, we can tell openly,
00:54:22
this is the heating up of our planet from the inside.
00:54:25
This is exactly what is causing the ocean to overheat
00:54:28
and bombard California with unprecedented precipitation
00:54:32
one wave after another.
00:54:34
This is exactly what is causing the tension inside our planet
00:54:37
to build up and create the rise in cataclysms across the globe.
00:54:42
And we are seeing it: in Kazakhstan, the TV stations were
00:54:47
saying there is nothing to worry about, there will be no major
00:54:50
earthquakes, and one week after, Kazakhstan's biggest city, Almaty,
00:54:54
is hit by devastating earthquakes, and people simply
00:54:59
have to walk out, shocked, and freeze in the middle of the
00:55:03
night, afraid to even return to their houses. And they start
00:55:06
asking why the alarms were not sounded. Just as when Lahaina was
00:55:10
burning: the very same thing, no alarms were sounded, so people
00:55:13
wouldn't know that something bad was about to happen.
00:55:16
Yet in Kazakhstan, they registered that earthquake with the
00:55:19
epicenter hundreds of miles away.
00:55:22
They could have switched the alarms on and urged people to leave
00:55:25
their houses, but they didn't. Yeah, they
00:55:28
totally messed up with this.
00:55:30
And they could have clearly said: we messed up, we have to fix it.
00:55:34
No, they said it was done on purpose because we
00:55:36
didn't want to raise panic.
00:55:38
Well, right now, guys,
00:55:40
not to raise panic,
00:55:41
you're being kept in silence.
00:55:43
The disaster is just around the corner.
00:55:45
Just like that earthquake in Almaty.
00:55:48
And they are not raising the alarm not to raise panic.
00:55:53
They are being silent and their silence will cost you your life.
00:55:59
If we want to avoid this reality, we have to stop being
00:56:02
couch potatoes, wake up and use this miraculous technology
00:56:06
of cell phones we're having in our hands to inform people
00:56:09
about what is happening.
00:56:10
Because the government is too busy not sounding the
00:56:13
alarm, not to raise panic.
00:56:15
And this is the reality of it.
00:56:17
We have people who are sounding the alarm, but we don't have
00:56:21
means to amplify our call.
00:56:23
We're trying to use all possible means and there are heroes
00:56:26
across the globe who unify, who spread this information,
00:56:30
but this is too little.
00:56:32
Too little yet to wake people up.
00:56:34
And unfortunately, if it continues like that,
00:56:36
we're going to see tremendous devastation in the number of human
00:56:41
lives taken by the climate.
00:56:43
And it's already taking more human lives.
00:56:45
And when I'm watching the presidential debates
00:56:48
right now, we're seeing candidates over there
00:56:50
talking about October 7.
00:56:52
Still, nobody talks about the earthquakes in Afghanistan.
00:56:56
They only speak about other events, and it's a shame that,
00:57:00
in addition to the climate, which is doing a much better job
00:57:03
of taking human lives, as we're seeing,
00:57:06
we are caught up, military-wise, wasting time on
00:57:10
military operations. And we have these huge scientific centers, these
00:57:16
institutions worldwide, think tanks from Washington,
00:57:22
D.C.
00:57:22
to various places around the globe, that are creating
00:57:26
their geostrategic planning.
00:57:28
Do we have a single institution that is forecasting the unification
00:57:32
of the people and giving strategic means to unify
00:57:37
and build a free society where artificial intelligence will
00:57:40
serve all people and create a safe, beautiful, prospering
00:57:45
world, where our currency will not decline in its purchasing
00:57:49
power, where inflation will be impossible because AI can
00:57:52
create an economy that will not see crises, that will not
00:57:56
collapse, that will not be based on debt, on the endless creation
00:58:00
of debt that we are seeing right now, with 40 trillion
00:58:05
dollars that we owe to somebody, and we don't even know who?
00:58:09
But we know that eventually it's going to collapse.
00:58:12
We just hope we're not going to be the ones under the
00:58:15
rubble of this fallen economy.
00:58:17
But this is inevitable and that is something we don't
00:58:20
want to recognize just like impending climatic disaster.
00:58:24
We have to be a little bit wiser.
00:58:26
We have to look at least one step ahead.
00:58:28
You know, I came across this interesting discussion:
00:58:30
was Bitcoin created by the CIA?
00:58:33
And the person who really looked into the creation of
00:58:36
the code itself said that it was most probably an insider in the
00:58:41
National Security Agency, because back before 2008, when it was
00:58:45
created, there were multiple certified ways to write the code
00:58:49
that generates the random numbers.
00:58:54
Those standard ways to create the
00:58:59
software were certified by all international agencies.
00:59:03
They would say that those are safe.
00:59:05
So the person who created Bitcoin did not use any of the
00:59:08
certified methods, and it later turned out that the NSA had a
00:59:14
backdoor into each one of them.
00:59:16
So it was an insider inside the NSA, and he knew
00:59:20
that he could not use the certified methods, because all
00:59:22
of them were being controlled.
00:59:24
The National Security Agency, the CIA, everybody else could
00:59:27
get in there, create endless amounts of digital currency,
00:59:30
and track everything down. But he did not use those methods.
00:59:33
He was smart enough to foresee what was happening.
00:59:36
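To illustrate the underlying point with a generic sketch (this is not Bitcoin's actual code, and the seed value here is arbitrary): if a key is generated from a random number generator whose internal state an attacker knows, which is effectively what a backdoor provides, the attacker can regenerate the exact same "private" key.

```python
# Generic illustration: a key drawn from a predictable (backdoored) RNG
# can be reproduced by anyone who knows the RNG's internal state.
# Note: Python's `random` is not cryptographic; that is the point here.
import random

def generate_private_key(rng: random.Random) -> str:
    # A 256-bit "private key" drawn from the supplied RNG, as hex.
    return hex(rng.getrandbits(256))

# The victim's RNG state (seed 1337) is known to the attacker,
# which is what a backdoored generator effectively gives away.
victim_key = generate_private_key(random.Random(1337))
attacker_key = generate_private_key(random.Random(1337))
print(victim_key == attacker_key)  # prints True: the "secret" key is not secret
```

This is why avoiding a compromised random number generator matters: whoever can predict the generator's output can reconstruct every key, and with it every coin, derived from it.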
Same thing with VPNs and all the other technologies: they were
00:59:40
created by very smart people,
00:59:42
originally, again, to create color revolutions around
00:59:45
the world. And you know how these technologies have been used to
00:59:48
overthrow governments. When there was a revolution
00:59:52
arising in Egypt, for example, there was a famous moment when
00:59:56
top government officials called the headquarters of
01:00:00
Twitter and Facebook, and they said, do not do your scheduled
01:00:05
maintenance this night, because there is a revolution
01:00:08
happening, and we don't want folks in the Middle East to be left
01:00:11
without means to connect.
01:00:12
So they could have their Facebook revolution.
01:00:15
Don't do the scheduled maintenance tonight,
01:00:18
and the scheduled maintenance was delayed.
01:00:21
We're seeing many ways the technology eventually gets used
01:00:24
against the people, because the CIA and everybody
01:00:27
else fighting for world dominance have plans that are
01:00:30
outdated, and this is something we don't realize either.
01:00:33
We have these huge machines working on plans
01:00:37
from past centuries.
01:00:39
They are not looking for the future.
01:00:40
They're not thinking of a future of unified humanity
01:00:43
with no dictatorship.
01:00:44
But eventually, it's exactly what each one of them wants.
01:00:47
Each one of the people involved in those think tanks would,
01:00:51
for themselves, want this beautiful world, but they're
01:00:54
just caught up in this game they keep playing,
01:00:57
which simply makes no sense.
01:00:59
So guys, let's stop being couch potatoes and stop waiting
01:01:03
for something beautiful to miraculously happen by itself.
01:01:06
We have to spend a little bit of time speaking
01:01:08
up to actually make our world a better place.
01:01:11
That is the most important thing today: to start
01:01:14
thinking for yourself.
01:01:16
Alexey: Yes.
01:01:16
I just want to add that the people who you think will do miracles
01:01:20
for you don't have your best interests in mind.
01:01:23
Definitely not.
01:01:25
And if we want to see miracles in our life, we have
01:01:29
to create them ourselves.
01:01:31
And this is only possible when we unite our potential.
01:01:33
We know for sure, because history shows it, that
01:01:38
only when people unite can they achieve great things.
01:01:42
This is what we have to do right now at this turning
01:01:45
point of our civilization.
01:01:48
We invite everyone again to join this wave of speaking up
01:01:53
for your rights and your life as well.
01:01:56
At the end of the day, it's your life and the lives of your loved
01:02:00
ones that are at stake today.
01:02:03
And I don't think that anyone will blame you for telling the
01:02:09
truth about what's happening with the planet and wanting
01:02:12
to save it for yourself and for others as well.
01:02:16
So guys, let's continue this conversation and spread the
01:02:21
information about the true causes of what is happening
01:02:25
to the planet, and about the way out, which has also been
01:02:28
voiced many times over.
01:02:30
So stay tuned. Until next time, be well.