GlobalTech.TV — Episode 13: Announcements from AWS re:Invent 2024


Dec 06 2024 | 00:19:48


Hosted By

Ariel Munafo, Eyal Estrin, Raz Kotler

Show Notes

A podcast about cloud adoption and cybersecurity.

Website: https://www.globaltech.tv/

 

Social networks: https://linktr.ee/globaltechtv


Episode Transcript

[00:00:02] Speaker A: Hello everyone and welcome to another episode of the GlobalTech.TV podcast. Today we are trying something new: we are recording a special episode at the end of AWS re:Invent 2025, sorry, 2024, and looking ahead to the next year. But Eyal, you always follow this event, you are a big cloud geek, so I will let you start.

[00:00:31] Speaker B: Thank you. As this is a special event, I even wore the T-shirt I got at one of the previous re:Invents, maybe it was 2019, the one for AWS certified people. Anyway, before I talk about the announcements, some in general availability and some in preview, the ones I believe were the most, let's call it, cool or amazing, I would like to share something personal. As you mentioned, I have been watching the re:Invent keynotes for the past several years, and I attended two of them in person a few years ago. My personal recommendation for anyone who is a developer, or who enjoys learning about new technology and the deep insights behind it, is to listen to or watch the recording of Dr. Werner Vogels, the Amazon CTO. His keynotes always share a lot of insight into how Amazon builds its systems, things that are not usually shared openly. Last year he talked about the Frugal Architect, mainly about how to make cost part of the requirements when we develop a new system. This year he talked about a term called simplexity: on one side simple, on the other side the pretty complex systems we have been developing over the years. In his lecture he shared lessons about complexity: what the warning signs are when a system has become way too complex, how to detect it, how to break it down, and how to build what he calls evolvable systems, with concepts such as modeling around business concepts, hiding internal details, and fine-grained interfaces. There are many concepts involved in building evolvable systems, and this is the future of how to build modern applications. So I highly recommend you look for the video on YouTube. That is my ten cents before we start.

[00:03:00] Speaker A: I could talk a lot about cloud, but I just want to say that I am trying to build a kind of managed service that I call Simplex, because the cloud has a lot of benefits, but with the richness, the number of services and the number of features, it sometimes becomes really difficult to manage and really difficult to decide. I will try to find it, because I was talking about this before the CTO of Amazon did. I believe we need to think about doing things more simply rather than just doing the cool stuff, because in the end we need to manage these applications and this infrastructure.

[00:03:51] Speaker B: And by the way, the bottom line is not just deploying or developing a system, it is also how to do so at high scale, where things become very complex. Maybe that is not relevant for all customers or all organizations, but many organizations are developing highly scalable systems. Just consider Netflix, for example, and the complexity of their systems, or Facebook, or whatever else you can think of; these are very complex systems. So there are a lot of insights there, and I highly recommend you watch this lecture. Moving on: there were many, many announcements this week at re:Invent 2024, and I have specifically selected three of them from three completely different domains.

The first one is the next generation of Amazon SageMaker. AWS has announced the next generation of Amazon SageMaker, a unified platform for data analytics and AI. The new version integrates various AWS machine learning and analytics capabilities, offering a collaborative environment where teams can build faster using familiar AWS tools. Some of the key features in this announcement are integrations of services. There is Amazon SageMaker Unified Studio, which has just been released in preview: a single development environment that combines functionality from various AWS services such as Amazon EMR, AWS Glue, Amazon Redshift, Amazon Bedrock and the existing machine learning capabilities of SageMaker Studio. Another feature is Amazon SageMaker Lakehouse, which, as far as I recall, is already generally available. It is an open data architecture that reduces data silos and unifies data across Amazon S3 data lakes, Redshift data warehouses and third-party sources, and it supports Apache Iceberg-compatible tools and engines. A further feature of the next generation of Amazon SageMaker is SageMaker Data and AI Governance, which includes the Amazon SageMaker Catalog built on Amazon DataZone, enabling secure discovery, governance and collaboration on data and AI workflows. In the show notes of this episode we will also add links so you can read more about this announcement.

[00:06:24] Speaker A: And it is interesting, because SageMaker was launched in 2017, so this comes after seven years.

[00:06:31] Speaker B: Right, so now it looks like a different way of using the tool, let's say: not having separate silos of machine learning and AI tools and data analytics, but combining everything into one unified service with all its sub-families. I believe they are also calling part of it Amazon SageMaker AI; I saw that somewhere this week as well.

[00:07:00] Speaker A: So this news was all about AI?

[00:07:03] Speaker B: Yeah. Just kidding, no, no, it was announced as part of both AI and data analytics. The next announcement I would like to share is Amazon S3 Metadata, a new capability of Amazon S3 which is currently in preview. S3 Metadata is a new feature designed to enhance data management and discovery in Amazon S3. It aims to simplify metadata management, improve data discovery and enhance the overall usefulness of S3 data for various applications, including business analytics and real-time inference. S3 Metadata automatically captures and updates metadata from objects as they are uploaded into a bucket, creating a queryable, read-only table that reflects changes within minutes. The metadata tables are stored in S3 Tables, a new storage offering optimized for tabular data which, by the way, was also announced as generally available at re:Invent. S3 Metadata also integrates with Amazon Bedrock, allowing AI-generated content to be annotated with metadata that specifies its origin, creation timestamp and the model used. For this announcement I will also share some additional reference material in the show notes.

And the last service, or rather evolution of an existing service, is called Amazon Aurora DSQL. I saw people debating how to pronounce it, D-SQL or "dee-sequel", so let's just call it DSQL. AWS has announced the preview of Amazon Aurora DSQL, a new serverless distributed SQL database. By the way, it is PostgreSQL-compatible and designed for high availability and scalability.
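
Because Aurora DSQL is PostgreSQL-compatible, existing Postgres drivers and tools should largely work against it. The snippet below is only a minimal sketch of that idea, using the standard psycopg2 driver; the endpoint, user and credential handling are placeholders rather than the exact service API (DSQL uses IAM-based authentication tokens rather than a static password).

# Minimal sketch of talking to a PostgreSQL-compatible endpoint such as
# Aurora DSQL with a standard Postgres driver. The endpoint, user and the
# way the credential is obtained are placeholders, not the real service API.
import psycopg2

conn = psycopg2.connect(
    host="my-cluster.dsql.us-east-1.on.aws",  # hypothetical cluster endpoint
    port=5432,
    dbname="postgres",
    user="admin",
    password="<IAM-based auth token would go here>",
    sslmode="require",
)

with conn, conn.cursor() as cur:
    # Ordinary SQL, because the engine is Postgres-compatible.
    cur.execute("CREATE TABLE IF NOT EXISTS orders (id int PRIMARY KEY, total numeric)")
    cur.execute("INSERT INTO orders (id, total) VALUES (%s, %s)", (1, 99.90))
    cur.execute("SELECT count(*) FROM orders")
    print("rows:", cur.fetchone()[0])

conn.close()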

Among the features of Aurora DSQL are virtually unlimited horizontal scaling, meaning adding more compute nodes, and independent scaling of reads, writes, compute and storage. It promises the fastest distributed SQL reads and writes, faster than regular Aurora. It has no single point of failure, an active-active distributed architecture and, the killer feature, strong consistency and durability for all reads and writes to any regional endpoint. Until now you always had a limitation: you could create a multi-region database or use global tables, but you always had an issue when working over long distances between two regions, so data was usually written with eventual consistency. What that means is, say a customer updates a record in the database and another customer wants to read that information; it takes a few seconds, depending on the amount of data and the bandwidth between the regions, for the change to be replicated among all copies of the data, across the storage and everything else. But what Amazon announced here is something I have not seen before: strong consistency, meaning the data is synchronized automatically, I am guessing in less than a second, between the different nodes. They managed to do it using a time synchronization mechanism that is basically built on top of satellites; since last year every EC2 instance has a clock that is synchronized automatically across all regions worldwide, in all the data centers, and that is what enables the strong consistency. Lastly, Aurora DSQL aims to simplify the development of always-available applications with high scalability and reliability, offering effortless scaling and resiliency for database management. For those of you already using Amazon Aurora, the previous generation let's call it, the main difference between Aurora and Aurora DSQL is the enhanced distributed architecture, which allows greater scalability and availability than the original Aurora. While Aurora is already a high-performance option, Aurora DSQL takes this further by offering virtually unlimited horizontal scaling and a higher availability guarantee, making it more suitable for extremely demanding, mission-critical applications that require maximum scalability and uptime. A very long sentence that basically says: now we have strong consistency, with data synchronized between different regions in under a second.

[00:12:22] Speaker A: As a former DBA, it is amazing how many possibilities you have in databases today. Ten or fifteen years ago we had Oracle and SQL Server, which together were 90 or 95 percent of the market, and now Amazon alone has fifteen or twenty databases, and every year something new arrives. So my advice is to be smart about choosing the right database and to look at its price, because people like the newest and best, but, coming back to the Werner Vogels keynote about architecture, it is really smart to plan ahead and look at the money, the price.

[00:13:12] Speaker B: Definitely look at the vendor documentation, and by the way this is relevant for all cloud providers, and also look at the price list, and only then try to understand whether this is the right solution for your workload.
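
To make the difference between eventual and strong consistency concrete, here is a purely illustrative toy sketch in Python. It is not AWS code; it only shows why a read from a second region can return stale data under eventual consistency, while under strong consistency the write is acknowledged only after every replica has applied it.

# Toy illustration of eventual vs. strong consistency across two regional
# replicas. This is a teaching sketch, not AWS code.

class Replica:
    def __init__(self):
        self.data = {}

class EventuallyConsistentDB:
    """Writes are acknowledged by one region; the other catches up later."""
    def __init__(self):
        self.primary, self.secondary = Replica(), Replica()
        self.pending = []

    def write(self, key, value):
        self.primary.data[key] = value      # acknowledged right away
        self.pending.append((key, value))   # replication happens "later"

    def replicate(self):
        for key, value in self.pending:
            self.secondary.data[key] = value
        self.pending.clear()

class StronglyConsistentDB:
    """A write is acknowledged only after every replica has applied it."""
    def __init__(self):
        self.replicas = [Replica(), Replica()]

    def write(self, key, value):
        for replica in self.replicas:       # every region, before the ack
            replica.data[key] = value

ec = EventuallyConsistentDB()
ec.write("order-1", "paid")
print(ec.secondary.data.get("order-1"))     # None: the other region is still stale
ec.replicate()
print(ec.secondary.data.get("order-1"))     # "paid" only after replication catches up

sc = StronglyConsistentDB()
sc.write("order-1", "paid")
print(sc.replicas[1].data.get("order-1"))   # "paid" immediately from any region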

[00:13:26] Speaker A: Eyal always explains it better than me. Angie, what about you? Any news, any updates?

[00:13:33] Speaker C: Absolutely. I would love to keep my updates positive as usual: how we can better bring in security and secure all our systems. Most of my announcements are related to security. One of them concerns Kubernetes, and I think all the experts in the industry should be a bit scared of this announcement, because we might be jobless after what Amazon has done this year. I would also love to touch on VPC, and on Amazon Security Lake, which I have been really interested in for the past two years. So without further ado, let's dive into Kubernetes. Amazon has one of its best services there, Amazon EKS, and this year Amazon just made Kubernetes a whole lot easier and safer with EKS Auto Mode. What is this feature and what is this announcement about? Basically, it automates cluster management, which not only saves time but also minimizes the risk of those pesky misconfigurations that can often lead to security issues. I think this is perfect, especially for teams that do not have Kubernetes experts, because it gives them a way to deploy apps confidently and securely. For everyone looking to simplify operations while staying secure, this is definitely a big win.

On the other side we have VPC Lattice with PrivateLink enhancements; Amazon has brought yet another surprise at this year's re:Invent. This update is really big for security and for every professional working with multi-account setups. Basically, VPC Lattice now supports TCP connections over PrivateLink. This is new and innovative, and it is going to benefit a lot of security professionals by making it easier to securely share their resources across accounts and VPCs. If I can put it in perspective for people who are not very technical, I would compare it to having a fortified highway for your data: secure, streamlined and built for modern cloud architectures. For professionals and enthusiasts in cloud, and especially in security, who are trying to manage complex networks, this feature is definitely a breath of fresh air in terms of both flexibility and security.

Last but not least among the big security announcements of this year's re:Invent, we have Security Lake. Amazon Security Lake is ready with its new specialization, and for everyone who is tired of juggling multiple security tools and dashboards in their daily workflow, I think Amazon Security Lake is our new best friend. You would like to ask: why is that? How come, Angie?

[00:16:49] Speaker A: Why is that?

[00:16:53] Speaker C: A little question?

[00:16:56] Speaker A: No, just one comment.

[00:16:57] Speaker B: As far as I can recall, Amazon Security Lake was announced last year. As far as I can recall.

[00:17:04] Speaker C: I was also there when it was announced live; I was with AWS Israel, watching it live as the professionals explained it, and it was really exciting. But this year it came with another update, let's say, a new feature that I think is really going to bring all the security data into one place. This is very efficient and it is going to make threat detection faster. Another comparison that I think suits this announcement is that it is like having a central hub for all your defenses, ensuring that you can react to threats before they even escalate. That is really important, especially for professionals on the defensive side who are trying to contain the mess before it escalates. So for anyone who is serious about proactive security, this feature is definitely a must-have, and Security Lake is a service we should really take advantage of.
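
For context on what "all the security data in one place" looks like in practice: Security Lake normalizes logs into the OCSF format and lands them in S3, where they can be queried with standard tools such as Amazon Athena. The sketch below only illustrates that general pattern; the Glue database, table, column names and results bucket are hypothetical, not the exact schema the service creates.

# Sketch: querying normalized security data that Security Lake stores in S3,
# using Athena via boto3. Database, table, columns and bucket are hypothetical.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

QUERY = """
SELECT "time", severity, activity_name, src_endpoint.ip AS source_ip
FROM cloudtrail_mgmt_events
WHERE severity = 'High'
ORDER BY "time" DESC
LIMIT 20
"""

run = athena.start_query_execution(
    QueryString=QUERY,
    QueryExecutionContext={"Database": "amazon_security_lake_db"},      # hypothetical
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},  # hypothetical
)
query_id = run["QueryExecutionId"]

# Poll until the query finishes, then print each result row.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])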

[00:18:09] Speaker A: Right?

[00:18:10] Speaker B: Amazing.

[00:18:11] Speaker A: I have finally come to accept, and I am sad to say it, that I am not a tech person anymore. What I did at this re:Invent was look at the big picture. There was a post on LinkedIn from Matt Garman, the AWS CEO, I hope I am saying his name right, and they made a small mini-site with posts around the Vogels predictions for the next year, 2025. We will add that as well; there are five points there, if I remember right. He is a smart guy, he has a lot of clients and they look to the future, so I think it is really interesting to see and follow what he writes. I hope that next year at this time we will all be in Vegas, but if not, we will do the recap again next year; I can try to promise that. So thank you everyone, thank you Angie, thank you Eyal, and as always everybody is welcome to follow us in all the places we are, on every social network, GlobalTech.TV. Feel free to write, to ask, and let's talk soon. Thank you everybody, and until the next one, bye.
