Evaluate the text summarization capabilities of LLMs for enhanced decision-making on AWS
AWS » Machine Learning
by Dinesh Subramani
31s ago
Organizations across industries are using automatic text summarization to more efficiently handle vast amounts of information and make better decisions. In the financial sector, investment banks condense earnings reports down to key takeaways to rapidly analyze quarterly performance. Media companies use summarization to monitor news and social media so journalists can quickly write stories on developing issues. Government agencies summarize lengthy policy documents and reports to help policymakers strategize and prioritize goals. By creating condensed versions of long, complex documents, summarization…
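As a rough sketch of the summarize-and-score loop such an evaluation involves (not code from the post), the snippet below asks a Bedrock-hosted model for a summary and compares it against a reference with ROUGE. The model ID, the Anthropic-style request body, the placeholder texts, and the third-party rouge_score package are all assumptions on our part.

    import json

    import boto3
    from rouge_score import rouge_scorer

    bedrock = boto3.client("bedrock-runtime")

    report_text = "Revenue grew 12% year over year while operating margin slipped to 8%."  # placeholder input
    reference_summary = "Revenue up 12% YoY; operating margin down to 8%."                  # placeholder reference

    # Ask a Bedrock-hosted model for a short summary (assumed model ID; the request
    # body follows the Anthropic messages format)
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 300,
        "messages": [{
            "role": "user",
            "content": f"Summarize this earnings report in three bullet points:\n\n{report_text}",
        }],
    }
    resp = bedrock.invoke_model(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",
        body=json.dumps(body),
    )
    candidate = json.loads(resp["body"].read())["content"][0]["text"]

    # Score the generated summary against a human-written reference with ROUGE
    scorer = rouge_scorer.RougeScorer(["rouge1", "rougeL"], use_stemmer=True)
    print(scorer.score(reference_summary, candidate))

In practice you would run this over a labeled evaluation set and track the aggregate scores per model and prompt, rather than a single pair as shown here.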
Enhance conversational AI with advanced routing techniques with Amazon Bedrock
AWS » Machine Learning
by Ameer Hakme
20h ago
Conversational artificial intelligence (AI) assistants are engineered to provide precise, real-time responses through intelligent routing of queries to the most suitable AI functions. With AWS generative AI services like Amazon Bedrock, developers can create systems that expertly manage and respond to user requests. Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Stability AI, and Amazon through a single API, along with a broad set of capabilities you need to build generative AI applications…
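One simple form of such routing is to let a small, fast model classify each incoming query and dispatch it to the matching handler. The sketch below is not the post's architecture; the route names and the Bedrock model ID are assumptions used only for illustration.

    import json

    import boto3

    bedrock = boto3.client("bedrock-runtime")

    ROUTES = {"billing", "technical_support", "general"}  # hypothetical downstream handlers

    def route(query: str) -> str:
        """Ask a small foundation model to pick the best handler for a query."""
        prompt = (
            "Classify the user query into exactly one of these categories: "
            f"{', '.join(sorted(ROUTES))}.\nQuery: {query}\nAnswer with the category name only."
        )
        body = {
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 10,
            "messages": [{"role": "user", "content": prompt}],
        }
        resp = bedrock.invoke_model(
            modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed; any Bedrock chat model works
            body=json.dumps(body),
        )
        answer = json.loads(resp["body"].read())["content"][0]["text"].strip().lower()
        return answer if answer in ROUTES else "general"  # fall back on unexpected output

    print(route("My last invoice looks wrong"))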
Improve LLM performance with human and AI feedback on Amazon SageMaker for Amazon Engineering
AWS » Machine Learning
by Yunfei Bai
20h ago
The Amazon EU Design and Construction (Amazon D&C) team is the engineering team designing and constructing Amazon warehouses. The team navigates a large volume of documents and locates the right information to make sure the warehouse design meets the highest standards. In the post "A generative AI-powered solution on Amazon SageMaker to help Amazon EU Design and Construction," we presented a question answering bot solution using a Retrieval Augmented Generation (RAG) pipeline with a fine-tuned large language model (LLM) for Amazon D&C to efficiently retrieve accurate information from a large…
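For readers unfamiliar with the pattern, a minimal RAG round trip looks roughly like the sketch below: embed the question, rank a handful of documents by cosine similarity, and put the best match into the prompt. The Titan embedding model ID is an assumption, and the two sample "documents" are placeholders, not Amazon D&C content.

    import json

    import boto3
    import numpy as np

    bedrock = boto3.client("bedrock-runtime")

    def embed(text: str) -> np.ndarray:
        # Titan text embeddings; model ID and payload shape are assumptions based on public docs
        resp = bedrock.invoke_model(
            modelId="amazon.titan-embed-text-v1",
            body=json.dumps({"inputText": text}),
        )
        return np.array(json.loads(resp["body"].read())["embedding"])

    docs = [
        "Fire doors must carry a 90-minute rating in storage areas.",   # placeholder content
        "Dock levelers require a 25,000 lb capacity.",                  # placeholder content
    ]
    doc_vecs = np.stack([embed(d) for d in docs])

    def retrieve(question: str, k: int = 1) -> list[str]:
        """Return the k documents most similar to the question by cosine similarity."""
        q = embed(question)
        scores = doc_vecs @ q / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q))
        return [docs[i] for i in np.argsort(scores)[::-1][:k]]

    question = "What rating do fire doors need?"
    context = "\n".join(retrieve(question))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"

The post goes further by feeding human and AI feedback back into the fine-tuned generator; the retrieval skeleton above stays the same.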
Improve accuracy of Amazon Rekognition Face Search with user vectors
AWS » Machine Learning
by Arik Porat
1d ago
In various industries, such as financial services, telecommunications, and healthcare, customers use a digital identity process, which usually involves several steps to verify end-users during online onboarding or step-up authentication. One such step is face search, which can help determine whether a new end-user’s face matches those associated with an existing account. Building an accurate face search system involves several steps. The system must be able to detect human faces in images, extract the faces into vector representations, store face vectors in a database…
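Those steps map fairly directly onto the Rekognition face and user APIs. The sketch below is a hedged illustration, with a placeholder collection name, bucket, image keys, and user ID; it assumes the collection already exists.

    import boto3

    rek = boto3.client("rekognition")
    COLLECTION = "onboarding-faces"  # assumed, pre-created collection

    # Index a few enrollment images for one end-user, then group them under a user vector
    face_ids = []
    for key in ["enroll-1.jpg", "enroll-2.jpg"]:  # placeholder S3 keys
        resp = rek.index_faces(
            CollectionId=COLLECTION,
            Image={"S3Object": {"Bucket": "my-id-bucket", "Name": key}},
            MaxFaces=1,
        )
        face_ids.append(resp["FaceRecords"][0]["Face"]["FaceId"])

    rek.create_user(CollectionId=COLLECTION, UserId="user-1234")
    rek.associate_faces(CollectionId=COLLECTION, UserId="user-1234", FaceIds=face_ids)

    # At authentication time, search against user vectors instead of individual face vectors
    matches = rek.search_users_by_image(
        CollectionId=COLLECTION,
        Image={"S3Object": {"Bucket": "my-id-bucket", "Name": "login-attempt.jpg"}},
        UserMatchThreshold=90,
    )["UserMatches"]
    print(matches[0]["User"]["UserId"] if matches else "no match")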
Accelerate ML workflows with Amazon SageMaker Studio Local Mode and Docker support
AWS » Machine Learning
by Shweta Singh
2d ago
We are excited to announce two new capabilities in Amazon SageMaker Studio that will accelerate iterative development for machine learning (ML) practitioners: Local Mode and Docker support. ML model development often involves slow iteration cycles as developers switch between coding, training, and deployment. Each step requires waiting for remote compute resources to start up, which delays validating implementations and getting feedback on changes. With Local Mode, developers can now train and test models, debug code, and validate end-to-end pipelines directly on their SageMaker Studio notebook…
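In the SageMaker Python SDK, switching to Local Mode is typically just a matter of passing instance_type="local" to an estimator, as in the sketch below; the entry point script, role ARN, data path, and framework versions are placeholders.

    from sagemaker.pytorch import PyTorch

    # instance_type="local" runs the training container on the Studio instance itself,
    # so there is no wait for remote compute to start up
    estimator = PyTorch(
        entry_point="train.py",                                 # assumed training script
        role="arn:aws:iam::111122223333:role/SageMakerRole",    # placeholder role ARN
        framework_version="2.1",                                # assumed framework/Python versions
        py_version="py310",
        instance_count=1,
        instance_type="local",   # switch to e.g. "ml.g5.xlarge" for a remote training job
    )
    estimator.fit({"train": "file://./data"})  # local data channel, no S3 round trip

Once the script behaves as expected locally, changing instance_type to a managed instance type reuses the same code path for a full remote training job.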
Significant new capabilities make it easier to use Amazon Bedrock to build and scale generative AI applications – and achieve impressive results
AWS » Machine Learning
by Swami Sivasubramanian
2d ago
We introduced Amazon Bedrock to the world a little over a year ago, delivering an entirely new way to build generative artificial intelligence (AI) applications. With the broadest selection of first- and third-party foundation models (FMs) as well as user-friendly capabilities, Amazon Bedrock is the fastest and easiest way to build and scale secure generative AI applications. Now tens of thousands of customers are using Amazon Bedrock to build and scale impressive applications. They are innovating quickly, easily, and securely to advance their AI strategies. And we’re supporting their efforts…
Building scalable, secure, and reliable RAG applications using Knowledge Bases for Amazon Bedrock
AWS » Machine Learning
by Mani Khanuja
2d ago
Generative artificial intelligence (AI) has gained significant momentum with organizations actively exploring its potential applications. As successful proof-of-concepts transition into production, organizations are increasingly in need of enterprise scalable solutions. However, to unlock the long-term success and viability of these AI-powered solutions, it is crucial to align them with well-established architectural principles. The AWS Well-Architected Framework provides best practices and guidelines for designing and operating reliable, secure, efficient, and cost-effective systems in the cloud…
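With Knowledge Bases for Amazon Bedrock, a managed RAG query can be a single RetrieveAndGenerate call, sketched below with a placeholder knowledge base ID, model ARN, and question.

    import boto3

    agent_rt = boto3.client("bedrock-agent-runtime")

    # Knowledge base ID and model ARN are placeholders
    response = agent_rt.retrieve_and_generate(
        input={"text": "What are our data retention requirements?"},
        retrieveAndGenerateConfiguration={
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": "EXAMPLEKBID",
                "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0",
            },
        },
    )
    print(response["output"]["text"])
    for citation in response.get("citations", []):
        for ref in citation.get("retrievedReferences", []):
            print("source:", ref.get("location"))  # surface the retrieved sources for traceability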
Integrate HyperPod clusters with Active Directory for seamless multi-user login
AWS » Machine Learning
by Tomonori Shimomura
3d ago
Amazon SageMaker HyperPod is purpose-built to accelerate foundation model (FM) training, removing the undifferentiated heavy lifting involved in managing and optimizing a large training compute cluster. With SageMaker HyperPod, you can train FMs for weeks and months without disruption. Typically, HyperPod clusters are used by multiple users: machine learning (ML) researchers, software engineers, data scientists, and cluster administrators. They edit their own files, run their own jobs, and want to avoid impacting each other’s work. To achieve this multi-user environment, you can take advantage…
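The integration itself happens at the operating-system level rather than in application code, but as a rough illustration of what a multi-user Active Directory login check amounts to, the sketch below performs a simple bind over LDAPS using the third-party ldap3 package; the directory endpoint, user DN, and password are placeholders.

    from ldap3 import ALL, SIMPLE, Connection, Server

    # Placeholder LDAPS endpoint; a real cluster would point at the AWS Managed
    # Microsoft AD (or self-managed AD) domain controllers
    server = Server("ldaps://corp.example.com", get_info=ALL)

    def can_log_in(user_dn: str, password: str) -> bool:
        """Simple-bind check: the directory either accepts the credentials or it does not."""
        conn = Connection(server, user=user_dn, password=password, authentication=SIMPLE)
        ok = conn.bind()
        conn.unbind()
        return ok

    print(can_log_in("CN=Jane Doe,OU=Researchers,DC=corp,DC=example,DC=com", "REPLACE_ME"))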
The executive’s guide to generative AI for sustainability
AWS » Machine Learning
by Wafae Bakkali
3d ago
Organizations are facing ever-increasing requirements for sustainability goals alongside environmental, social, and governance (ESG) practices. A Gartner, Inc. survey revealed that 87 percent of business leaders expect to increase their organization’s investment in sustainability over the coming years. This post serves as a starting point for any executive seeking to navigate the intersection of generative artificial intelligence (generative AI) and sustainability. It provides examples of use cases and best practices for harnessing generative AI’s potential to accelerate sustainability and ESG initiatives…
Use Kubernetes Operators for new inference capabilities in Amazon SageMaker that reduce LLM deployment costs by 50% on average
AWS » Machine Learning
by Rajesh Ramchander
6d ago
We are excited to announce a new version of the Amazon SageMaker Operators for Kubernetes using the AWS Controllers for Kubernetes (ACK). ACK is a framework for building Kubernetes custom controllers, where each controller communicates with an AWS service API. These controllers allow Kubernetes users to provision AWS resources like buckets, databases, or message queues simply by using the Kubernetes API. Release v1.2.9 of the SageMaker ACK Operators adds support for inference components, which until now were only available through the SageMaker API and the AWS Software Development Kits (SDKs)…
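For context, an inference component created through the new operator corresponds to the SageMaker API call sketched below; every name and sizing value here is a placeholder, and the exact parameter shapes should be checked against the API reference.

    import boto3

    sm = boto3.client("sagemaker")

    # The ACK operator exposes this as a Kubernetes custom resource; the direct SDK
    # equivalent looks roughly like the call below, with placeholder names and sizes
    sm.create_inference_component(
        InferenceComponentName="llama2-7b-component",
        EndpointName="shared-llm-endpoint",
        VariantName="AllTraffic",
        Specification={
            "ModelName": "llama2-7b-model",
            "ComputeResourceRequirements": {
                "NumberOfAcceleratorDevicesRequired": 1,
                "MinMemoryRequiredInMb": 16384,
            },
        },
        RuntimeConfig={"CopyCount": 1},  # copies scale independently of the endpoint
    )

Packing multiple such components onto one endpoint, each with its own resource reservation and copy count, is what drives the cost reduction the title refers to.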
