Summary of AWS blogs for the week of Monday, July 24, 2023

In the week of Monday, July 24, 2023, AWS published 105 blog posts. Here is an overview of what happened.

Topics Covered

Desktop and Application Streaming

What is Desktop and Application Streaming?

Desktop and Application Streaming is the process of hosting virtual desktops or applications in the cloud for remote access. This gives users the flexibility to access their applications and desktops from any location with an internet connection. Additionally, with cloud-based solutions, organizations can avoid the need for costly hardware upgrades and maintenance, as well as minimize their CAPEX costs.

Build a Customizable Virtual Desktop Infrastructure Portal with NICE DCV

Virtual Desktop Infrastructure (VDI) solutions are becoming increasingly popular due to their ability to provide efficient and flexible access to remote desktop environments from anywhere. Amazon Web Services (AWS) offers fully managed End User Computing (EUC) solutions such as Amazon WorkSpaces and Amazon AppStream 2.0, which can be used to host and manage virtual desktops and applications. For customers who want more control, NICE DCV, AWS's high-performance remote display technology, enables them to build their own customized VDI portal quickly and easily.

NICE DCV provides a comprehensive suite of features, including single sign-on authentication, connection brokering, and user management. Additionally, NICE DCV allows customers to customize the user experience with their own branding and styling. This enables customers to quickly and easily create an intuitive and secure environment for their end-users.

Provide Hands-On Technical Experience at Parkway Schools with AppStream 2.0

K-12 organizations have been implementing alternative methods to help students gain experience with career and technical education (CTE) and STEM curriculum. Amazon Web Services (AWS) offers a solution for this purpose, called AppStream 2.0. This is a fully managed application streaming solution that can be used to stream and manage applications from the cloud.

AppStream 2.0 enables K-12 organizations to provide their students with hands-on technical experience. It makes it easy to set up a secure, reliable, and scalable application streaming environment for students in any location. With AppStream 2.0, students can access their applications from any device, including Chromebooks, iPads, laptops, and desktop PCs. Furthermore, AppStream 2.0 is easy to deploy, and can be used to stream any existing Windows application.

Using Amazon CloudWatch to Analyze User Access to Amazon WorkSpaces

Amazon CloudWatch is a service provided by Amazon Web Services (AWS) that enables customers to monitor and analyze their AWS resources, such as Amazon WorkSpaces. Customers using Amazon WorkSpaces can use Amazon CloudWatch to gain additional insight into how their users are connecting to Amazon WorkSpaces. Amazon CloudWatch can be used to track and report on IP addresses, platforms, and client versions used to access Amazon WorkSpaces.
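
As a hedged sketch of what this monitoring can look like, connection metrics for a single WorkSpace can be pulled from the AWS/WorkSpaces CloudWatch namespace. The WorkSpace ID below is a hypothetical placeholder:

```python
from datetime import datetime, timedelta

WS_NAMESPACE = "AWS/WorkSpaces"

def connection_metric_params(workspace_id: str, days: int = 7) -> dict:
    """Build a GetMetricStatistics request counting successful connections
    to one WorkSpace over the last `days` days."""
    now = datetime.utcnow()
    return {
        "Namespace": WS_NAMESPACE,
        "MetricName": "ConnectionSuccess",
        "Dimensions": [{"Name": "WorkspaceId", "Value": workspace_id}],
        "StartTime": now - timedelta(days=days),
        "EndTime": now,
        "Period": 86400,          # one datapoint per day
        "Statistics": ["Sum"],
    }

params = connection_metric_params("ws-1234567890")  # hypothetical WorkSpace ID
# With credentials configured, the request would be sent with:
#   boto3.client("cloudwatch").get_metric_statistics(**params)
```

The same namespace also exposes metrics such as ConnectionAttempt and ConnectionFailure, which can be charted side by side to spot problem clients.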

At the AWS EUC New York Summit on July 26th, AWS will be hosting a builders session (EUC201) on how to use Amazon CloudWatch to analyze user access to Amazon WorkSpaces. This session will provide customers with the resources and guidance needed to set up and monitor their Amazon WorkSpaces environment.

How KeyCore Can Help

KeyCore provides professional and managed services related to Desktop and Application Streaming. We have a team of experienced AWS professionals who can help your organization leverage the latest AWS EUC offerings, such as Amazon Workspaces and Amazon AppStream 2.0. We can help you build and manage a secure, reliable, and scalable application streaming environment. In addition, our experts can also help you set up and monitor your Amazon WorkSpaces environment using Amazon CloudWatch. Contact us today to learn more about how we can help.

Read the full blog posts from AWS

Official Machine Learning Blog of Amazon Web Services

Unlock the Power of Generative AI with Stable Diffusion XL and Amazon SageMaker JumpStart

Introducing Stable Diffusion XL 1.0

Today, Amazon SageMaker JumpStart makes Stable Diffusion XL 1.0 (SDXL 1.0) available to customers. SDXL 1.0 is the latest image generation model from Stability AI, and it is designed for professional use. It has been calibrated for high-resolution, 1024-pixel image generation at a variety of aspect ratios.
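
As an illustrative sketch (the exact request schema for a given JumpStart endpoint is defined in its example notebook, so the field names here are assumptions following the Stability AI convention), a text-to-image request for a deployed SDXL endpoint might be serialized like this:

```python
import json

def sdxl_payload(prompt: str, width: int = 1024, height: int = 1024) -> bytes:
    """Serialize a text-to-image request body; schema is an assumption."""
    body = {
        "text_prompts": [{"text": prompt}],
        "width": width,
        "height": height,
        "seed": 42,
    }
    return json.dumps(body).encode("utf-8")

payload = sdxl_payload("a watercolor painting of a lighthouse at dawn")
# With a deployed endpoint, the payload would be sent via:
#   boto3.client("sagemaker-runtime").invoke_endpoint(
#       EndpointName="sdxl-1-0",  # hypothetical endpoint name
#       ContentType="application/json", Body=payload)
```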

Flag Harmful Language in Conversations with Amazon Transcribe Toxicity Detection

Social activities such as online gaming or social networking have seen an increase in hostile or aggressive behavior. This includes hate speech, cyberbullying, and harassment. Online gaming communities often use voice chat to facilitate communication among their users, but this can lead to unwanted conversations. Amazon Transcribe Toxicity Detection flags these conversations, so users can take action.
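
A minimal sketch of enabling toxicity detection when starting a transcription job; the job name and S3 URI are hypothetical placeholders:

```python
def toxicity_job_params(job_name: str, media_uri: str) -> dict:
    """Build a StartTranscriptionJob request with toxicity detection enabled."""
    return {
        "TranscriptionJobName": job_name,
        "Media": {"MediaFileUri": media_uri},
        "MediaFormat": "wav",
        "LanguageCode": "en-US",  # toxicity detection launched for US English
        "ToxicityDetection": [{"ToxicityCategories": ["ALL"]}],
    }

params = toxicity_job_params("voice-chat-scan-001",
                             "s3://example-bucket/voice-chat.wav")
# boto3.client("transcribe").start_transcription_job(**params)
```

The resulting transcript annotates flagged segments with per-category toxicity scores, which downstream moderation tooling can threshold on.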

Maximize Stable Diffusion Performance and Lower Inference Costs with AWS Inferentia2

Generative AI models are becoming increasingly popular due to their capabilities in creating realistic text, images, code, and audio. Stable Diffusion models stand out for their ability to create high-quality images based on text prompts, ranging from landscapes and portraits to abstract art. AWS Inferentia2 offers a cost-effective way to run inference for these models.

Planning an AI Strategy with New Guides from AWS

Recent developments in AI and ML have made it difficult for consumers and organizations to assess what technology is available and how to leverage it. To help with this, AWS now offers new guides on AI, ML, and generative AI. This provides users with the necessary information they need to create an AI strategy for their organization.

Develop Your Generative AI Knowledge with a Free Course

To expand on this knowledge, AWS is now offering a free course on Generative AI Foundations on AWS. This provides users with the conceptual fundamentals, practical advice, and hands-on guidance to pre-train, fine-tune, and deploy state-of-the-art models on AWS and beyond.

AWS Reaffirms its Commitment to Responsible Generative AI

AWS is committed to developing and deploying generative AI responsibly. A dedicated responsible AI team works to ensure that generative AI is used to benefit society, and AWS remains committed to the responsible use of this technology.

Generative AI Foundation Models with No Internet Connectivity Using Amazon SageMaker JumpStart

Generative AI is becoming popular in many industries for solving specific business problems. This type of AI creates content and ideas such as conversations, stories, music, images, and videos. Amazon SageMaker JumpStart can now deploy generative AI foundation models in VPC mode, with no internet connectivity required.

How Patsnap Used GPT-2 Inference on Amazon SageMaker

You are likely familiar with the autocomplete suggestion feature when you search for something on Google or Amazon. To provide this feature, Patsnap used GPT-2 inference on Amazon SageMaker. This allowed them to serve suggestions with low latency and at low cost.

Optimize AWS Inferentia Utilization with FastAPI and PyTorch Models on Amazon EC2 Inf1 & Inf2 Instances

When deploying Deep Learning models at scale, it is important to make use of the underlying hardware to maximize performance and cost benefits. To do this, it is important to consider the Amazon Elastic Compute Cloud (EC2) instance, model serving stack, and deployment architecture. FastAPI and PyTorch models on EC2 Inf1 & Inf2 instances provide an efficient way to optimize AWS Inferentia utilization.

Unlock the Power of Generative AI with KeyCore

Organizations can use the above tools and services from AWS to unlock the power of Generative AI. KeyCore is the leading Danish AWS consultancy. We provide professional services and managed services to help organizations understand, adopt, and drive innovation with AWS. Our team of experts will work with you to build a successful Generative AI strategy, and develop a cost-effective and efficient deployment architecture. Contact us today to find out how we can help you unlock the power of Generative AI.

Read the full blog posts from AWS

Announcements, Updates, and Launches

AWS Publishes a Series of Updates and Launches

Charge for Public IPv4 Addresses Effective February 1, 2024

AWS is introducing a charge for public IPv4 addresses starting February 1, 2024: $0.005 per IP per hour for all public IPv4 addresses, whether or not they are attached to a service. (AWS already charges for unattached public IPv4 addresses allocated in accounts.) The new charge applies to all AWS customers, and KeyCore can help customers analyze their usage patterns and adjust accordingly.
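
A quick back-of-envelope estimate of the new charge, assuming AWS's standard 730-hour billing month:

```python
RATE_PER_IP_HOUR = 0.005          # USD, from the announcement
HOURS_PER_MONTH = 730             # AWS's standard monthly-hour convention

def monthly_ipv4_cost(num_ips: int) -> float:
    """Estimated monthly charge for a number of public IPv4 addresses."""
    return round(num_ips * RATE_PER_IP_HOUR * HOURS_PER_MONTH, 2)

cost_one = monthly_ipv4_cost(1)    # 3.65 USD per address per month
cost_fleet = monthly_ipv4_cost(50) # 182.5 USD for a fleet of 50 addresses
```

At roughly $3.65 per address per month, auditing and releasing unused Elastic IPs quickly pays for itself in larger accounts.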

New Amazon EC2 Instances Powered by AWS Graviton3 Processor

Amazon EC2 C7g, M7g, and R7g instances powered by AWS Graviton3 processors are now available. These instances deliver up to 25 percent higher performance, up to two times higher floating-point performance, and up to two times faster cryptographic workload performance compared to AWS Graviton2 processors. KeyCore can help customers assess their workloads and determine which instances are the best fit for their needs.

New AWS Local Zone in Phoenix, Arizona

AWS has opened a new Local Zone in Phoenix, Arizona, with more instance types, storage classes, and services than ever before. Local Zones provide customers with the benefit of AWS services that are physically close to their users, enabling them to reduce latency with applications that require high speed networking. KeyCore can assist customers in determining if a Local Zone is the best fit for their applications.

Amazon EC2 P5 Instances Powered by NVIDIA H100 Tensor Core GPUs

AWS and NVIDIA have collaborated to launch Amazon EC2 P5 instances powered by NVIDIA H100 Tensor Core GPUs. These instances are optimized for training large language models and developing generative AI applications. KeyCore can help customers deploy and manage these instances, as well as assess their applications and use cases to determine if P5 instances are the best fit.

AWS Entity Resolution

AWS Entity Resolution helps organizations match and link related records from multiple applications and data stores. As organizations grow, the records that contain information about customers, businesses, or products, tend to be increasingly fragmented and siloed. KeyCore can assist customers in leveraging AWS Entity Resolution, as well as build and manage the applications that integrate with it.

Enable Foundation Models to Complete Tasks with Agents for Amazon Bedrock

Amazon Bedrock is a fully managed service that makes foundation models from Amazon and leading AI startups available for building with generative AI on AWS. KeyCore can help customers evaluate, deploy, and manage their machine learning deployments with Amazon Bedrock.

Top Announcements of the AWS Summit in New York, 2023

Generative AI and machine learning were the stars of the show at the 2023 AWS Summit in New York. Other announcements included new services, products, and technologies, all of which can be leveraged with the help of KeyCore.

AWS Week in Review

The AWS launch machine is running at full speed, and the week of July 24, 2023, saw the release of several new services and products. These included Redshift + Forecast, CodeCatalyst + GitHub, Lex Analytics, Llama 2, and much more. KeyCore can assist customers to evaluate and deploy these new services and products.

Read the full blog posts from AWS


Containers

Securing Containers for Your Cloud-Native Applications

Container solutions allow developers to deploy cloud-native applications quickly and effectively. However, containers can be vulnerable to security threats if not managed properly. In this post, we’ll look at how to shift your focus left to secure your container supply chain and ensure your applications are securely running in the cloud.

Application First Delivery on Kubernetes with Open Application Model

Kubernetes is the most popular container orchestration technology on the market today. But in order to get the most out of Kubernetes, users must understand application deployment configurations, like Deployments, Services, and Ingress. The Open Application Model (OAM) is a specification that makes it easier to define and deploy cloud-native applications on Kubernetes.

Building Better Container Images

Container images are the building blocks of microservice architectures. They are responsible for packaging applications for deployment and running in the cloud. Therefore, it is important to ensure that container images are built with security in mind. This post looks at how to build better container images for secure cloud deployments.

Accelerate Amazon ECS-Based Workloads with Blueprints

The Amazon Elastic Container Service (ECS) is a powerful platform for running and managing containers. With the introduction of ECS Blueprints for AWS Cloud Development Kit (AWS CDK), it’s easier and faster than ever to build container workloads for the Amazon ECS. This post explains how ECS Blueprints makes it easier to build and manage containers on ECS.

Implementing Application Load Balancing of Amazon ECS Anywhere Workloads Using Traefik Proxy

Amazon ECS Anywhere provides a platform for running and managing containers on customer-managed infrastructure. But how do you ensure that applications running on Amazon ECS Anywhere are optimized for performance and availability? This post looks at how to use Traefik Proxy to implement application load balancing for Amazon ECS Anywhere workloads.

At KeyCore, we are experts in cloud-native technologies like Kubernetes, Docker, and Amazon ECS. Our team of experienced cloud architects and engineers can help you get the most out of your cloud-native applications. Contact us today to learn more about how we can help you secure your container supply chain and accelerate your container-based workloads.

Read the full blog posts from AWS

AWS Quantum Technologies Blog

Graph Coloring with Physics-Inspired Graph Neural Networks

In this post, we provide an overview of how physics-inspired graph neural networks can be used to solve the graph-coloring problem, which is notoriously hard but underlies a huge number of resource-allocation problems. Graph coloring can be applied to familiar problems like sports scheduling and rental car management.

What is Graph Coloring?

Graph coloring is a combinatorial optimization problem. The goal is to assign colors to certain elements of a graph such that no two adjacent elements are of the same color.

For example, consider a graph in which each node represents a college and each edge a connection between two colleges. The goal of the graph-coloring algorithm is to assign a color to each college so that no two connected colleges share the same color, ideally using as few colors as possible.
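
The coloring constraint can be illustrated with a simple greedy heuristic. This is a didactic sketch of the problem itself, not the GNN method the post describes:

```python
def greedy_coloring(adjacency: dict[str, set[str]]) -> dict[str, int]:
    """Assign each node the smallest color index not used by an
    already-colored neighbor. Greedy coloring is not optimal in general,
    but it enforces the same constraint the GNN approach optimizes for."""
    colors: dict[str, int] = {}
    for node in sorted(adjacency):            # deterministic order
        taken = {colors[n] for n in adjacency[node] if n in colors}
        color = 0
        while color in taken:
            color += 1
        colors[node] = color
    return colors

# A triangle of three mutually connected "colleges" needs three colors.
triangle = {"A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B"}}
assignment = greedy_coloring(triangle)
```

For large graphs this greedy pass can use far more colors than necessary, which is exactly where learned approaches such as physics-inspired GNNs aim to do better.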

Physics-Inspired Graph Neural Networks

Physics-inspired graph neural networks (GNNs) are a type of artificial neural network that is capable of understanding the structure of a graph. GNNs can be used to solve graph-coloring problems by finding the optimal color assignment for the nodes of the graph. GNNs are particularly useful for solving large-scale graph-coloring problems, because they are capable of scaling to large datasets.

Applications of Graph Coloring

The graph-coloring problem can be used to solve a variety of familiar resource-allocation problems. For example, it can be used to solve sports scheduling problems, where certain teams must play each other on certain days. It can also be used to manage rental car fleets, where each car is assigned a unique color to represent its identity. The graph-coloring problem can also be used to solve problems in the telecommunications and computer networking industries.

KeyCore Can Help

At KeyCore, we can help you understand how physics-inspired graph neural networks can be used to solve the graph-coloring problem. Our team of experienced AWS consultants can provide you with the expertise you need to develop and implement powerful machine learning solutions. We can also provide support and guidance on how best to develop and deploy GNNs in AWS, using CloudFormation YAML, AWS API Calls and AWS SDK for JavaScript v3. Contact us today to learn more about how we can help you get the most out of your machine learning projects.

Read the full blog posts from AWS

AWS Smart Business Blog

The manufacturing sector accounts for a significant portion of the global economy, and small and medium businesses must find sustainable ways to produce their products in order to remain competitive. Additionally, the recent pandemic has increased the demand for telehealth services, which has brought new challenges to the industry. In this blog post, we explore how small and medium businesses can use the cloud to overcome the challenges of sustainable manufacturing and how cloud IT services have helped a small telemedicine company meet increased patient demand.

Sustainable Manufacturing in the Cloud

The cloud offers businesses an opportunity to become more efficient and reduce their environmental footprint. By utilizing cloud services, companies are able to leverage the latest technologies such as software and devices that can monitor their equipment in order to gain valuable insights. Additionally, the cloud enables businesses to optimize their processes, minimize waste, and ultimately reduce their overall energy consumption.

Cloud IT Services for Telemedicine

When the COVID-19 pandemic hit, demand for easy-to-access telehealth services skyrocketed. Fortunately, Argentina-based Llamando al Doctor had already integrated Amazon Web Services (AWS) into their platform, allowing them to quickly scale to meet this increased demand. With AWS, the company was able to store and process large amounts of patient data, as well as provide a secure and reliable network for their customers.

KeyCore Can Help

At KeyCore, we are experts in cloud computing and can help you get the most out of your cloud investments. We provide professional and managed services to help you transition to the cloud and maximize its potential. Our team of professionals has extensive experience in deploying services such as AWS, and we can help you develop a cloud strategy that meets the needs of your business. With our help, you can create a more sustainable and efficient manufacturing process for your business.

Read the full blog posts from AWS

Official Database Blog of Amazon Web Services

Migrating from IBM DB2 z/OS to Amazon RDS for MySQL and Amazon Aurora MySQL

Migrating a database from IBM DB2 z/OS to Amazon Relational Database Service (Amazon RDS) for MySQL or Amazon Aurora MySQL-Compatible Edition is a complicated process that includes numerous steps, such as assessment, schema conversion, data migration, functional testing, and performance tuning. To help streamline this process, the AWS Schema Conversion Tool (AWS SCT) can automate the assessment and database schema conversion steps.

Introducing Amazon Managed Blockchain Access and Query

Amazon Managed Blockchain (AMB) now offers two new services, Amazon Managed Blockchain (AMB) Access and Query, to help developers quickly and securely build scalable applications. AMB Access provides a serverless offering for non-mining, full blockchain nodes. With AMB Access, developers can use their favorite blockchain tools, and it supports Hyperledger Fabric and Ethereum. AMB Query allows developers to asynchronously query block data with just a few lines of code.

The Role of Vector Datastores in Generative AI Applications

Generative AI has revolutionized various industries with its ability to answer questions, write stories, create art, and generate code. AWS customers may want to use generative AI in their own businesses and have a wealth of domain-specific data (financial records, health records, etc.) to back it up. Vector Datastores provide efficient data representation for large-scale machine learning workloads, allowing developers to store and process vast amounts of data and machine learning models with minimal effort. Vector Datastores can also enable developers to quickly process complex datasets and serve inference requests without the need for specialized hardware.

Building Centralized Audit Data Collection for Amazon RDS for PostgreSQL with Amazon S3 and Amazon Athena

Organizations are often required to capture, store, and retain audit data to meet their compliance requirements and information security regulations. To do so, this post shows how to capture and store audit data from Amazon RDS for PostgreSQL using Amazon S3 and Amazon Athena. Amazon S3 is used as the data lake, and Amazon Athena is used to query and analyze the audit data in S3. This helps organizations to gain insights into their audit data and meet their compliance requirements.
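
As a sketch of the analysis step, an Athena query over the cataloged audit logs might look like the following. The database, table, and column names are assumptions about how the logs were cataloged in AWS Glue:

```python
def audit_query_params(database: str, s3_output: str, days: int = 1) -> dict:
    """Build a StartQueryExecution request that counts audit events per
    database user over a recent window."""
    sql = f"""
        SELECT user_name, COUNT(*) AS events
        FROM rds_audit_logs
        WHERE event_time > current_timestamp - interval '{days}' day
        GROUP BY user_name
        ORDER BY events DESC
    """
    return {
        "QueryString": sql,
        "QueryExecutionContext": {"Database": database},
        "ResultConfiguration": {"OutputLocation": s3_output},
    }

params = audit_query_params("audit_db", "s3://example-athena-results/")
# boto3.client("athena").start_query_execution(**params)
```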

Using Oracle Database Gateway to Connect Amazon RDS Custom for Oracle to SQL Server

Amazon Relational Database Service (Amazon RDS) Custom for Oracle allows users to run Oracle databases on AWS, and customize the underlying server and operating system configurations to fit their needs. To connect Amazon RDS Custom for Oracle to SQL Server, developers can use Oracle Database Gateway. This lets them access data in a SQL Server database from an Oracle database, and also allows for data replication, distributed transactions, and remote procedure calls between both databases.

At KeyCore, we help organizations migrate their databases from on-premises solutions to the cloud. Our experienced team of AWS certified professionals can provide you with the necessary guidance and expertise to make the migration process easier and more efficient. We can also assist you in setting up and configuring Amazon RDS Custom for Oracle and Oracle Database Gateway for secure and seamless connectivity to SQL Server. Contact us today to learn more about how we can help you leverage AWS to meet your needs.

Read the full blog posts from AWS

AWS for Games Blog

How Oculus Studios Division Is Reducing Game Development Time and Costs with Amazon GameLift

Meta’s Oculus Studios is advancing virtual reality (VR) storytelling through the publishing of popular multiplayer games, such as “Population One,” “Onward VR,” and “Beat Saber.” To support its VR storytelling teams, Oculus Studios has established a reliable and scalable game development and hosting backend for them to tap into: Amazon GameLift.

Amazon GameLift Supports Oculus Studios

Amazon GameLift offers a range of services, from game hosting that allows developers to reduce their operational costs, to game matchmaking which can be used to create a great player experience. Oculus Studios has used the game hosting service to deploy their games across multiple regions, enabling their players to experience low-latency gaming without any setup or maintenance.

By using Amazon GameLift, Oculus Studios has seen a reduction in the time it takes to deploy games. Before Amazon GameLift was used, Oculus Studios had to manually set up and manage game servers around the world. This process was both time consuming and expensive.

By letting Amazon GameLift handle the hosting, Oculus Studios can now launch games quickly and cost-effectively. This has allowed the team to focus on developing their games, resulting in a better experience for their players.

KeyCore Can Help

At KeyCore, our team of AWS experts can help you reduce game development time and costs by leveraging Amazon GameLift. Our team can provide you with the necessary support, from technical advice to managed services, to ensure that your games can be deployed quickly and with minimal effort.

Our team of experts can also provide you with guidance on other AWS services that can help you develop games faster and at a lower cost. With our team’s help, you can take advantage of the features and services that Amazon offers to ensure that your game is successful.

Read the full blog posts from AWS

AWS Training and Certification Blog

Grow Your Cloud Skills with AWS Training and Certification in July 2023

This July, AWS Training and Certification released 52 digital training products on AWS Skill Builder, designed to help individuals in developer, cloud architect, cloud engineer, and cloud administrator roles build their cloud skills. Highlights include AWS Industry Quest: Healthcare, AWS Cloud Quest: Networking, 19 AWS Builder Labs, and new exam prep for AWS Certified Cloud Practitioner and AWS Certified DevOps Engineer – Professional. Additionally, two new trainings exclusive to AWS Partners in sales roles were also made available.

19 New AWS Builder Labs

These 19 new AWS Builder Labs are designed to reinforce cloud learning across various topics including Containers, Developing, Serverless, Networking, Security and AWS Well-Architected Framework. Each lab includes a helpful video overview, a detailed lab guide, and an assessment to help users measure their progress and understanding.

Building Serverless Applications on AWS

For developers and software engineers looking to gain the foundational knowledge and skills to develop in the AWS Cloud and understand the AWS services, there is the new three-course series “Building Serverless Applications on AWS”. This series is available on both Coursera and edX, and is a great way to get started building in the cloud.

5 Steps to Learn AWS Best Practices and Become Well-Architected

Whether you’re just getting started with AWS or looking to sharpen your current skills, the AWS Well-Architected Framework is a powerful tool to help you build and manage your cloud-based infrastructure. Here are five tips and associated resources to help you get started and learn best practices:

  • Understand the value proposition of the Well-Architected Framework.
  • Learn the phases of Well-Architected Framework Review.
  • Get familiar with the AWS Well-Architected Tool.
  • Gain knowledge of the five pillars.
  • Take the new training course from AWS Training and Certification to build your practical knowledge.

After completing the assessment, users can earn a digital badge.

How KeyCore Can Help

At KeyCore, our team of AWS-certified consultants can help you start learning and using the AWS Well-Architected Framework. From educational sessions to hands-on workshops, we can help you review your existing architecture and design principles, and assist with best practices for building and managing your cloud infrastructure. Contact us today to learn more.

Read the full blog posts from AWS

Microsoft Workloads on AWS

Modernizing a Microsoft Workload to AWS: Challenges and Lessons Learned

For companies that wish to modernize their Windows Communication Foundation (WCF) service to CoreWCF, there are certain challenges and lessons to be learned. CoreWCF is a port of the server side of WCF to the .NET Core framework and is supported by Microsoft. Contributing to this technology is relatively straightforward.

Rotate Active Directory Credentials

To keep passwords and other credentials stored in Microsoft Active Directory synchronized with AWS Secrets Manager, it is possible to use AWS Systems Manager (SSM) Automation. This feature encrypts the secrets with an AWS Key Management Service (KMS) customer managed key (CMK) for added security.
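
A minimal sketch of the secret-update step in such a rotation flow, assuming the secret stores a JSON document with username and password fields (the secret ID and layout are assumptions):

```python
import json, secrets, string

def rotated_secret_params(secret_id: str, username: str) -> dict:
    """Build a PutSecretValue request with a freshly generated password.
    In the SSM Automation flow, the same password is also pushed to
    Active Directory so the two stay in sync."""
    alphabet = string.ascii_letters + string.digits + "!@#$%"
    new_password = "".join(secrets.choice(alphabet) for _ in range(24))
    return {
        "SecretId": secret_id,
        "SecretString": json.dumps({"username": username,
                                    "password": new_password}),
    }

params = rotated_secret_params("ad/service-account", "svc-backup")
# boto3.client("secretsmanager").put_secret_value(**params)
```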

Synchronizing Active Directory Users to AWS IAM

For customers who already have an established Active Directory Federation Service (ADFS) implementation, it is possible to leverage it for federated access to AWS. To achieve this, PowerShell can be used to synchronize changes to users and groups in Microsoft Active Directory, using Simple Cloud Identity Management (SCIM).

At KeyCore, we are experts in AWS, and can help with modernizing Microsoft workloads by providing professional services and managed services. Our advanced team of AWS consultants can provide the expertise needed to ensure your workloads are transitioned smoothly and securely.

Read the full blog posts from AWS

Official Big Data Blog of Amazon Web Services

A Side-by-Side Comparison of Apache Spark and Apache Flink for Common Streaming Use Cases

Apache Spark and Apache Flink are both open-source, distributed data processing frameworks used for big data processing and analytics. Spark is well-known for its ease of use, high-level APIs, and the ability to process large amounts of data. Flink offers superior performance when processing data streams in real-time and with low-latency stateful applications.

Extend Your Data Mesh with Amazon Athena and Federated Views

Amazon Athena is an interactive analytics service built on the Trino, PrestoDB, and Apache Spark open-source frameworks. It allows users to run SQL queries on petabytes of data stored in Amazon Simple Storage Service (S3) in popular formats such as Parquet and open-table formats like Apache Iceberg, Apache Hudi, and Delta.

Simplify External Object Access in Amazon Redshift Using Automatic Mounting of the AWS Glue Data Catalog

Amazon Redshift is an enterprise-grade cloud data warehouse service that enables customers to easily analyze their data using standard SQL and existing business intelligence (BI) tools. Amazon Redshift now makes it easier for customers to run queries against the AWS Glue Data Catalog by automatically mounting the Data Catalog to their Amazon Redshift cluster. This eliminates the need to manually configure external tables and enables customers to optimize query performance and get insights faster.
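
A sketch of querying an auto-mounted Glue Data Catalog table through the Redshift Data API; the cluster, database, and table names are hypothetical placeholders:

```python
def glue_catalog_query(database: str, table: str) -> dict:
    """Build an ExecuteStatement request (Redshift Data API) that reads a
    Glue Data Catalog table via the auto-mounted awsdatacatalog schema."""
    return {
        "ClusterIdentifier": "analytics-cluster",   # hypothetical cluster
        "Database": "dev",
        "Sql": f"SELECT COUNT(*) FROM awsdatacatalog.{database}.{table};",
    }

params = glue_catalog_query("sales_db", "orders")
# boto3.client("redshift-data").execute_statement(**params)
```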

Five Actionable Steps to GDPR Compliance (Right to be Forgotten) with Amazon Redshift

GDPR (General Data Protection Regulation) grants individuals the right to request the deletion of their personal data held by organizations. This means companies must erase that data from their own systems and from any third parties with whom they have shared it. Amazon Redshift provides customers with the tools to achieve GDPR compliance (right to be forgotten) through five actionable steps: identify the data subject, process the request, delete the personal data, audit the logs, and revoke access.
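
The deletion step can be sketched as a simple statement generator. The table and column names are illustrative, and a production pipeline should use parameterized statements and record each deletion in an audit log:

```python
def forget_user_sql(table: str, id_column: str, subject_id: str) -> str:
    """Generate the deletion statement for one data subject.
    subject_id would normally be bound as a query parameter rather than
    interpolated into the string."""
    return f"DELETE FROM {table} WHERE {id_column} = '{subject_id}';"

stmt = forget_user_sql("customer_profiles", "customer_id", "C-10042")
# The statement could then be run via the Redshift Data API, followed by
# VACUUM to reclaim the deleted rows' storage.
```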

Use AWS Glue DataBrew Recipes in Your AWS Glue Studio Visual ETL Jobs

AWS Glue Studio now integrates with AWS Glue DataBrew. AWS Glue Studio is a graphical interface that makes it easy to create, run, and monitor extract, transform, and load (ETL) jobs in AWS Glue. DataBrew makes it possible to clean and normalize data without writing any code. Customers can now run DataBrew recipes within AWS Glue Studio visual ETL jobs, including support for preview, preview batch, and execute jobs. This makes it easier to develop and test ETL jobs faster.

Find the Best Amazon Redshift Configuration for Your Workload Using Redshift Test Drive

Amazon Redshift provides customers with various deployment options (instance types, cluster sizes, etc.) that allow them to find the best configuration for their workloads. With the launch of Amazon Redshift Test Drive, customers can now quickly and cost-effectively experiment with different configurations and determine which one works best for their use case. Test Drive lets customers replay their own workloads against candidate configurations and compare the results, so they can choose the best setup for their needs without a lengthy manual evaluation.

Near-Real-Time Analytics Using Amazon Redshift Streaming Ingestion with Amazon Kinesis Data Streams and Amazon DynamoDB

Amazon Redshift enables customers to run and scale analytics in seconds. It provides customers with near-real-time insights by automatically provisioning and scaling data warehouse compute capacity. Customers can use Amazon Kinesis Data Streams and Amazon DynamoDB to ingest data into Amazon Redshift and quickly get insights.
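
A sketch of the streaming-ingestion setup, following the documented pattern of an external schema plus an auto-refreshing materialized view; the IAM role ARN and stream name are hypothetical:

```python
# Statements are held as Python strings so they can be submitted through
# the Redshift Data API; they follow the documented streaming-ingestion
# pattern, but the ARN and stream name below are placeholders.
create_schema = """
CREATE EXTERNAL SCHEMA kds
FROM KINESIS
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-streaming-role';
"""

create_view = """
CREATE MATERIALIZED VIEW clickstream_mv AUTO REFRESH YES AS
SELECT approximate_arrival_timestamp,
       JSON_PARSE(kinesis_data) AS payload
FROM kds."clickstream";
"""
# Each statement could be run with:
#   boto3.client("redshift-data").execute_statement(
#       ClusterIdentifier="analytics-cluster", Database="dev",
#       Sql=create_schema)
```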

Improved Scalability and Resiliency for Amazon EMR on EC2 Clusters

Amazon EMR is a cloud big data solution for petabyte-scale data processing, interactive analytics, and machine learning. AWS recently introduced several features that improve the scalability and resiliency of Amazon EMR on EC2 clusters, such as support for Amazon EC2 Spot Instances, cluster scaling, and automated cluster recovery. These features make it easier for customers to optimize their Amazon EMR clusters and improve cost-efficiency.

End-to-End Development Lifecycle for Data Engineers to Build a Data Integration Pipeline Using AWS Glue

Data is a key enabler for businesses, and AWS Glue is a serverless data integration service that makes it simple to integrate data from multiple sources for analysis, machine learning (ML), and application development. To maximize the power of data, it's recommended to design an end-to-end development lifecycle that covers creating a development environment, authoring Glue ETL jobs, testing and deploying, and monitoring the data pipeline.

Build Data Integration Jobs with AI Companion on AWS Glue Studio Notebook Powered by Amazon CodeWhisperer

AWS Glue Studio now integrates with Amazon CodeWhisperer, an AI coding companion that generates real-time code suggestions. The resulting AI Companion helps customers build, test, and deploy data integration jobs faster by generating code from natural-language comments and the surrounding context. AWS Glue Studio Notebook with the AI Companion makes it easier to develop and debug data pipelines with far less hand-written code.

Introducing the Vector Engine for Amazon OpenSearch Serverless, Now in Preview

Amazon OpenSearch Serverless now offers a vector engine for similarity searches that makes it possible to build modern machine learning (ML) augmented search experiences and generative artificial intelligence (AI) applications without managing and scaling clusters. The vector engine provides a simple, scalable, and high-performing way to search for similar items in a dataset.

Alcion Supports Their Multi-Tenant Platform with Amazon OpenSearch Serverless

Alcion, a security-first, AI-driven backup-as-a-service (BaaS) platform, helps Microsoft 365 administrators protect data from cyber threats and accidental data loss. In the event of data loss, Alcion customers need to search metadata for the backed-up items (files, emails, contacts, events, etc.). They leverage Amazon OpenSearch Serverless to quickly search for the relevant items and get their customers back up and running.

Configure Monitoring, Limits, and Alarms in Amazon Redshift Serverless to Keep Costs Predictable

Amazon Redshift Serverless makes it possible to run and scale analytics in seconds, automatically provisioning and scaling data warehouse compute capacity. To keep costs predictable, customers can configure monitoring, usage limits, and alarms in Amazon Redshift Serverless. This makes it possible to understand usage patterns and take proactive steps to optimize costs.
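As a minimal sketch, a usage limit on serverless compute can be expressed as parameters to the Redshift Serverless CreateUsageLimit API. The workgroup ARN and limit values below are hypothetical:

```python
# Sketch: cap Redshift Serverless compute usage so costs stay predictable.
# The dict mirrors the parameters of the redshift-serverless
# CreateUsageLimit API; all values here are example placeholders.

def usage_limit_params(workgroup_arn: str, max_rpu_hours: int) -> dict:
    return {
        "resourceArn": workgroup_arn,
        "usageType": "serverless-compute",   # limit measured in RPU-hours
        "amount": max_rpu_hours,             # e.g. 500 RPU-hours
        "period": "monthly",                 # daily | weekly | monthly
        "breachAction": "log",               # or emit-metric / deactivate
    }


params = usage_limit_params(
    "arn:aws:redshift-serverless:eu-west-1:123456789012:workgroup/demo", 500
)
# With credentials configured, the limit would be applied with something like:
# boto3.client("redshift-serverless").create_usage_limit(**params)
```

Pairing such a limit with a CloudWatch alarm on RPU consumption gives early warning before the breach action triggers.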

Enable Data Analytics with Talend and Amazon Redshift Serverless

Talend Cloud and Talend Stitch now integrate with Amazon Redshift Serverless to enable customers to accelerate and scale data analytics without needing to manage infrastructure. This makes it possible for customers to minimize computing needs for different workloads, including spikes and ad hoc analytics, and optimize query performance.

KeyCore and Our Offerings

KeyCore is the leading Danish AWS consultancy, providing professional services and managed services to our customers. Our team of experts has deep experience and a proven track record of helping customers make the most of their data and optimize their AWS environment. We can help your organization develop and implement an end-to-end data strategy that leverages serverless technologies like Amazon Athena, Amazon Redshift Serverless, and AWS Glue Studio. Our team of AWS Certified Solutions Architects, Data Engineers, and DevOps Engineers can help you build data pipelines, develop ETL jobs, and configure monitoring and alarms to keep costs predictable. Contact us today to learn how we can help you get the most value out of your data.

Read the full blog posts from AWS

Networking & Content Delivery

Tracking Pixel Driven Web Analytics with AWS Edge Services

Understanding web traffic and user behavior is essential in order to understand the results of new features, content updates, or current product iterations for websites and applications. This helps in getting insights into not only who visits the website but also where they come from and what content is viewed. A popular technique to do this is called ‘web beacons’.

A web beacon is a transparent image embedded in a web page that can be used to detect when a user visits a page, the IP address of the user, and other information. This can be used to track user behavior or to see if an email is opened, all of which are important for web analytics. However, this technique has some drawbacks when used at scale, such as increased latency and lower performance for end users and higher costs for the website.

AWS Edge Services can be used to solve these issues. By leveraging edge services like AWS Lambda@Edge, Amazon CloudFront, and Amazon CloudWatch, websites can perform web analytics at scale without the drawbacks associated with web beacons. This blog post will explain how to set up tracking pixel driven web analytics with AWS Edge Services.

First, you will need to set up an Amazon CloudFront distribution for your website. You can configure CloudFront to forward requests to your origin server when the requested content is not in the CloudFront cache, ensuring that your users always receive the latest content.

Next, you will need to create an AWS Lambda@Edge function. This function will be triggered when a user visits your website and will be responsible for running the analytics. It should collect the user’s IP address, the page they visited, and other relevant information and then forward it to your web analytics tool of choice.
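A minimal sketch of such a function follows. It assumes a viewer-request trigger, answers the pixel request directly with a 1x1 transparent GIF, and simply prints the visit record; a real deployment would forward that record to an analytics pipeline instead:

```python
import base64

# Sketch of a Lambda@Edge handler for a tracking pixel. It logs the visit
# details (IP, page, query string) and returns a 1x1 transparent GIF so the
# request never reaches the origin. The event field names follow the
# CloudFront event structure; the analytics destination is omitted.

PIXEL_GIF = base64.b64encode(
    b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\x00\x00\x00!"
    b"\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01\x00"
    b"\x00\x02\x02D\x01\x00;"
).decode()


def handler(event, context):
    request = event["Records"][0]["cf"]["request"]
    visit = {
        "ip": request["clientIp"],
        "page": request["uri"],
        "query": request.get("querystring", ""),
    }
    print(visit)  # in practice, forward to your analytics tool of choice
    return {
        "status": "200",
        "statusDescription": "OK",
        "headers": {
            "content-type": [{"key": "Content-Type", "value": "image/gif"}],
            "cache-control": [{"key": "Cache-Control", "value": "no-store"}],
        },
        "body": PIXEL_GIF,
        "bodyEncoding": "base64",
    }
```

Returning the pixel from the edge keeps latency low for end users, which addresses the performance drawback of classic web beacons mentioned earlier.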

Finally, you will need to set up Amazon CloudWatch to monitor the performance of your analytics. With CloudWatch, you can keep track of how long it takes for the analytics to run and how many users are actually visiting your website. This will help you identify when you need to scale your system or make improvements to ensure that your analytics are running efficiently.

By using AWS Edge Services, you can easily set up tracking pixel driven web analytics and monitor how well they are performing. This will help you better understand the impact of new features, content updates, or current product iterations for your website or application, allowing you to make more informed decisions.

At KeyCore, we offer a variety of professional and managed services to help our customers with their AWS needs. Whether it’s setting up and optimizing analytics pipelines or managing CloudFront distributions, we have the expertise to help you get the most out of AWS Edge Services. Contact us today to learn more.

Read the full blog posts from AWS

AWS Compute Blog

Introducing the AWS Lambda Python 3.11 Runtime and Migrating from Go1.x Runtime to the Custom Runtime on Amazon Linux 2

Introducing Python 3.11 Runtime in AWS Lambda

Amazon Web Services (AWS) has recently added the Python 3.11 runtime to Lambda, allowing customers to create and deploy functions using Python 3.11. Functions can be built and deployed through a range of methods, including the AWS Management Console, AWS CLI, AWS SDKs, and Infrastructure as Code (IaC) tools such as AWS SAM and AWS CDK. Additionally, the Python 3.11 container base image can be used if customers prefer to build and deploy their functions as container images.

Migrating from Go1.x Runtime to the Custom Runtime on Amazon Linux 2

AWS Lambda is deprecating the go1.x runtime in line with Amazon Linux 1 end-of-life, scheduled for December 31st, 2023. As such, customers using Go with Lambda should migrate their functions to the provided.al2 runtime. The primary benefits of this migration are support for AWS Graviton2 processors with better price-performance and a streamlined invoke path with faster performance.
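The target configuration for a migrated Go function can be sketched as follows. The function name is hypothetical; with provided.al2 the deployment package must contain an executable named `bootstrap`, which also becomes the handler value:

```python
# Sketch of the configuration change for moving a Go function from the
# deprecated go1.x runtime to provided.al2. The function name and the
# choice of architecture are example placeholders.

def migration_config(function_name: str, use_graviton: bool = True) -> dict:
    return {
        "FunctionName": function_name,
        "Runtime": "provided.al2",   # custom runtime on Amazon Linux 2
        "Handler": "bootstrap",      # name of the compiled Go binary
        # Graviton2 (arm64) is available with the custom runtime, not go1.x:
        "Architectures": ["arm64"] if use_graviton else ["x86_64"],
    }


cfg = migration_config("orders-api")
# Runtime and handler would be applied with the Lambda
# UpdateFunctionConfiguration API; the architecture is set when updating
# the function code or in your IaC template, depending on the tool chain.
```

Remember to rebuild the Go binary as `bootstrap` (and for arm64, with `GOARCH=arm64`) before switching the runtime.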

KeyCore: Helping Customers Manage and Optimize their AWS Lambda Functions

At KeyCore, we understand how important it is for customers to ensure their AWS Lambda functions are up-to-date and running at peak performance. As such, our team of certified AWS consultants provide professional services and managed services to help customers migrate to the latest AWS Lambda runtime and optimize their functions.

If you’d like to learn more about how KeyCore can help you manage and optimize your AWS Lambda functions, please reach out to our team today.

Read the full blog posts from AWS

AWS for M&E Blog

How Formula 1 Leveraged AWS to Accelerate the Future of Motorsports

Five years ago, Formula 1 embarked on a cloud transformation journey with AWS to accelerate their sports operations. Chris Roberts, the organization's Director of IT, spearheaded this project, and recently spoke to Neil Ralph, AWS Principal Sports Partnership Manager, about the incredible progress Formula 1 has made. From live race coverage, such as the AWS Gran Premio de España, to a plethora of advanced analytics, Formula 1 has found a reliable partner in AWS.

Schedule Live Linear Channel Playout with AWS Elemental MediaTailor

OTT platforms are becoming increasingly popular, and a great way to monetize content is by scheduling live linear TV playouts. AWS Elemental MediaTailor is a powerful tool for this kind of media delivery, with a variety of features that allow users to create an immersive viewing experience. From the “Before-TV” experience, to ensuring viewers are able to watch their favorite content straight away, AWS Elemental MediaTailor provides an ideal solution for media customers.

Real-Time Analytics for Amazon IVS Live Streaming with Datazoom on AWS

Observability of real user experiences is paramount for streaming service operators, and Datazoom on AWS can help capture events from applications and video players. With this data, streaming service operators can gain insights into user experiences, measure quality-of-service metrics, and spot performance issues that affect engagement. Datazoom on AWS also lets customers extract content consumption and engagement metrics in real time.

A Joint Venture: FOX and AWS for AI-Powered Highlights

To keep up with the ever-changing demands of sports fans, FOX and AWS have teamed up to deliver a groundbreaking “Recap Feature”. This Catch Up With Highlights feature is powered by Amazon Media Replay Engine (MRE), which uses AI to generate instant highlights from live streams and create a truly immersive viewing experience. Through this partnership, FOX and AWS were able to provide fans with instant highlights regardless of when they tuned in.

At KeyCore, we help companies navigate and implement complex, cloud-native solutions. Our team of experts can help you save time and money with our cloud services, from advanced analytics solutions to streaming live linear channels. Contact us today to learn more.

Read the full blog posts from AWS

AWS Developer Tools Blog

Configure Endpoint URLs for API Requests with the AWS SDKs and Tools

The AWS SDKs and Tools team is excited to announce improvements that give users increased flexibility when configuring the endpoint URLs used for API service requests. Prior to this announcement, users could only specify the endpoint URL by setting the --endpoint-url command line parameter or by modifying the SDK configuration file. Now, users can also set environment variables to provide the endpoint URL for all their requests without per-command configuration.

Configuring with the --endpoint-url Command Line Parameter

The --endpoint-url command line parameter is a great option for users who want to quickly test a feature or run an experiment against an alternate API endpoint. It is also useful for scenarios where the default endpoint URL is configured in the SDK configuration file but an alternate endpoint URL is needed for a particular request.

Configuring with SDK Configuration File

The SDK configuration file allows users to set the default endpoint URL that will be used for all requests. The configuration file is useful for users who need to make multiple requests to the same API endpoint. Users can set the configuration file once and not worry about manually configuring the endpoint URL for each request.

Configuring with Environment Variables

The latest improvements provide users with the ability to set environment variables to provide the endpoint URL for all their requests. This feature is particularly useful for users who have multiple API endpoints and would like to easily switch between them. Environment variables are also useful for scenarios where the default endpoint URL is configured in the SDK configuration file but an alternate endpoint URL is needed for a particular request.
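The resolution order across these mechanisms can be sketched as follows. Treat the exact precedence as an assumption to verify against the AWS SDKs and Tools Reference Guide for your tool; the variable names (`AWS_ENDPOINT_URL`, `AWS_ENDPOINT_URL_<SERVICE>`) follow that guide, while the helper itself is illustrative:

```python
import os
from typing import Optional

# Sketch of the endpoint-resolution order the post describes, from most to
# least specific: an explicit parameter, a service-specific environment
# variable, the global AWS_ENDPOINT_URL variable, then the SDK config file.

def resolve_endpoint(service: str,
                     explicit: Optional[str] = None,
                     config_file_value: Optional[str] = None) -> Optional[str]:
    service_env = os.environ.get(f"AWS_ENDPOINT_URL_{service.upper()}")
    global_env = os.environ.get("AWS_ENDPOINT_URL")
    return explicit or service_env or global_env or config_file_value


# Example: point everything at a local emulator, but S3 at a different one.
os.environ["AWS_ENDPOINT_URL"] = "http://localhost:4566"
os.environ["AWS_ENDPOINT_URL_S3"] = "http://localhost:9000"
```

With this setup, S3 requests would resolve to port 9000 while every other service falls back to the global endpoint, which is exactly the multi-endpoint switching scenario described above.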

How KeyCore Can Help

At KeyCore, we understand that choosing the right tools for your API requests is critical to the success of your applications. Our experienced team of AWS experts can help you determine the best option for your specific needs, and ensure that your APIs are properly configured with the AWS SDKs and Tools. Contact us today to learn more about how KeyCore can help you get the most out of your AWS SDKs and Tools.

Read the full blog posts from AWS

AWS Architecture Blog

Unstructured Data with Multilingual Semantic Search Explained

Organizations that need to search through vast amounts of unstructured data across multiple languages often face a unique set of challenges. In this two-part blog series, we explore the fundamental AWS architecture needed to build a content repository with dynamic access control and multilingual semantic search capabilities.

Part 1: Setting up the Foundation

The key components of this architecture in Part 1 were a dynamic access control-based logic with a web UI to upload documents. This access control-based logic was implemented with Amazon Cognito to securely authenticate users and provide access control for the content. In addition to this, Amazon CloudFront was used to provide a secure connection from the web UI to the content repository.

Part 2: Building the Multilingual Semantic Search

In Part 2, we built on the foundation created in Part 1 to add the multilingual semantic search capabilities. The search capability was implemented using a combination of Amazon Translate to generate language-specific versions of search terms, and Amazon OpenSearch Service to serve the search results.

Amazon Translate was used to convert the search term into specific languages, with each language getting its own search index in Amazon OpenSearch Service. This allowed the repository to search through documents in multiple languages and still surface relevant results.

Key Takeaways

This two-part series of blog posts shows how organizations can use AWS to build a content repository with secure access control and multilingual semantic search capabilities. With Amazon Cognito providing access control, and Amazon Translate and Amazon OpenSearch Service providing search capabilities, organizations can securely search through vast amounts of unstructured data that spans multiple languages.

At KeyCore, we offer both professional and managed services to help organizations build their own content repositories with multilingual semantic search capabilities. Our team of experienced AWS Certified Solutions Architects and DevOps engineers can help you build a secure and reliable architecture that fits your business needs. Contact us today to learn more!

Read the full blog posts from AWS

AWS Partner Network (APN) Blog

Streamline Your Software Development Process With Data-Driven Insights

Software development processes can be complex and often require a single dashboard to provide insights into business, process and people-centric metrics. Apexon’s COMPASS solution uses machine learning to capture and integrate data across the software development lifecycle, providing visibility into resource performance and its impact on KPIs. The solution offers a unified view of the development process, allowing teams to track their progress, identify areas of improvement, and adjust strategies in real-time.

Extend Your AWS Native Cloud Capabilities With TD SYNNEX and CoreStack

Cloud technology is essential for modern businesses, offering scalability, agility, and cost-efficiency. TD SYNNEX and CoreStack have teamed up to deliver an innovative solution combining the AWS Well-Architected Framework and CoreStack’s AI-powered NextGen cloud governance and FinOps capabilities. This solution enables customers to optimize cloud costs and improve cloud operations.

Get Started With AWS Backup for VMware Cloud on AWS

Automatic infrastructure provisioning and cloud-based remote backup and recovery solutions have made it easier for organizations to set up advanced data protection strategies tailored for cloud-native workloads. Before adopting AWS Backup using private interface VPC endpoints (powered by AWS PrivateLink), customers should consider the architectural design and considerations.

Cloud FinOps for the Enterprise with Ness Digital Engineering

Ness Digital Engineering helps organizations ensure their AWS infrastructure supports strategic initiatives. They have an approach for enterprises to align cloud costs with their business value, which involves various change management challenges. A robust FinOps function can address chronic issues related to cloud cost management at large enterprises.

AWS Partner Experience Transformation Roadmap

AWS is transforming the way partners work together, providing new opportunities at every stage of the journey. In the coming months, AWS will be making key enhancements to deepen partner expertise and broaden the value across business models. This includes providing a single, unified view of opportunities, enabling sellers to work together, and offering new ways to transact through AWS Marketplace.

Optimize Your Contact Center Experience With Amazon Connect Ready Partners

The contact center and customer experience industry is evolving rapidly, and businesses are investing in solutions such as Amazon Connect to streamline their systems. To support customer engagement across multiple channels, Amazon Connect Ready specialization has been introduced. This specialization includes enhanced data collection, analytics, insights, and optimization dashboards.

Accelerate Your Digital Transformation With Amazon ECS Delivery Partners

Digital transformation can be accelerated by running applications in containers. To help customers streamline containerized workload implementations, the Amazon ECS Delivery specialization has been introduced. Amazon ECS Delivery Partners provide expertise in assessing, scoping, designing, integrating, deploying, operating, and optimizing container environments using either Amazon Elastic Container Service (Amazon ECS) or AWS Fargate.

Making Data-Driven Decisions With IBM

IBM watsonx.data is an open data lakehouse on AWS that enables customers to scale analytics and AI with a fit-for-purpose data store. It provides querying, governance, and open data formats for easy data access and sharing, and customers can connect to their data within minutes. With watsonx.data, customers can gain trusted insights quickly and reduce data warehouse costs.

Prevent API Breaches Using Salt Security With AWS WAF and Amazon API Gateway

Salt Security’s platform allows for the analysis of API traffic that exposes complex attacks, including those identified in the OWASP API Security Top 10 list. Salt Security uses big data and AI/ML to detect and stop complex API attacks, allowing organizations to protect their APIs from malicious attacks.

Fully Managed Data Access Governance in Amazon Aurora With Privacera

Data access governance is essential for ensuring data is consistent and trustworthy, and Privacera enables data access governance on Amazon Aurora. It includes tag-based access control policies and masking/filtering columns or rows. This allows organizations to define, monitor, and manage who has access to specific pieces of data according to their policies and external regulations.

Drive Electric Vehicle Battery Supply Chain Transparency on AWS With Amplo Global

The increasing demand for electric vehicle (EV) batteries has put the spotlight on the environmental and social impact of battery production and procurement. Amplo Global developed an AI-led platform on AWS to provide an end-to-end traceability of EV battery production, and it enables businesses to create a transparent and resilient supply chain less prone to disruption.

Leverage Confluent and AWS To Solve IoT Device and Data Management Challenges

IoT often involves an enormous number of connected devices and data. Confluent and AWS can be leveraged to solve device management and data management challenges encountered in IoT use cases. These solutions allow for microcomputing capabilities in nearly every type of device, enabling a fully-connected world.

How KeyCore Can Help With Your Software Development and Cloud Transformation

KeyCore, the leading Danish AWS consultancy, provides professional and managed services to help companies streamline their software development processes and accelerate their cloud transformation. KeyCore can help customers design and implement the best solution that fits their needs, using the latest AWS technologies such as AWS WAF, Amazon API Gateway, Amazon Connect, Amazon ECS, AWS Backup, Confluent, IBM watsonx.data, Privacera, Salt Security, TD SYNNEX, and CoreStack. Our team of experts and certified AWS professionals have the knowledge and experience to help you leverage the best of the cloud.

Read the full blog posts from AWS

AWS Cloud Enterprise Strategy Blog

The Potential of Generative AI and Collaboration to Transform Businesses

Thomas Edison’s invention of the first practical incandescent light bulb in his New Jersey laboratory lit up Christie Street on New Year’s Eve 1879 and paved the way for future technological advancement. While it may seem like magic today, it is only through the power of advanced technology that we can unlock the potential of enterprise transformation. This is particularly true when it comes to generative AI and collaboration.

Unlocking the Potential of Generative AI

Generative AI holds immense potential for transforming businesses. With its ability to automate labor-intensive and time-consuming tasks, AI can free up resources and accelerate the rate of innovation. It’s no wonder that many businesses are turning to AI-driven solutions to optimize their operations.

One of the key areas in which AI can be employed is in the analysis and optimization of data. By using AI to analyze large datasets, businesses can identify patterns and trends that can help them uncover new opportunities and gain a greater understanding of their customers and markets. Additionally, AI-driven solutions can be used to streamline processes and automate repetitive tasks, helping businesses reduce costs and improve efficiency.

The Role of Collaboration in Unlocking this Potential

In order to unlock the potential of generative AI, businesses must also consider how collaboration plays a role in the equation. With the rise of cloud-based technologies, it is now easier than ever for teams to collaborate in near real-time. This enables businesses to quickly share ideas, receive feedback, and make data-driven decisions.

Moreover, AI can be used to facilitate collaboration between teams and departments by providing insights that can help them better understand their customers and markets. This can lead to more informed decisions and greater success in the long run.

How KeyCore Can Help

At KeyCore, we specialize in providing AWS-based services and solutions to help businesses unlock the potential of generative AI and collaboration. Our team of AWS experts can help you get up and running with the latest technologies so that you can take full advantage of AI and collaboration. From setting up your environment to optimizing your operations, we can help you every step of the way. Contact us today to learn more.

Read the full blog posts from AWS

Using Digital Technologies to Enhance the Circular Economy

The circular economy is an important concept for businesses looking to reduce their environmental impact and become more sustainable. Digital technologies can help companies make the most of their resources by increasing efficiency, reducing waste, and allowing them to stay up-to-date with the latest trends. In this post, we’ll explore the benefits of digital technology for the circular economy and provide guidance on how businesses can use digital technology to their advantage.

Advantages of Digital Technologies for the Circular Economy

Digital technologies can help businesses to achieve a more sustainable approach to their operations. By leveraging digital technologies, businesses can access more data about their processes and use this data to inform their decisions. This helps to ensure that resources are used efficiently and that businesses have access to the most up-to-date information about their industry. Additionally, digital technologies can help to reduce waste by enabling businesses to automate processes and reduce manual labor.

Implementing Digital Technologies for the Circular Economy

When implementing digital technologies for the circular economy, businesses should prioritize solutions that are both efficient and cost-effective. AWS can provide businesses with the tools they need to get the most out of their resources, including access to data and analytics resources, automation services, and managed services.

AWS's data and analytics services, such as Amazon QuickSight, can help businesses gain insights into their operations and make more informed decisions. Compute services such as Amazon EC2 and AWS Lambda can help businesses automate their processes and reduce manual labor. Finally, managed container services such as Amazon ECS and AWS Fargate can help businesses run their workloads with less operational overhead.

How KeyCore Can Help

At KeyCore, we provide professional and managed services that can help businesses transition to the circular economy. Our team of experienced AWS consultants can provide businesses with the guidance they need to identify the most cost-effective solutions for their operations. Additionally, our team of managed services specialists can help businesses get the most out of their AWS resources. Contact us today to learn more about how we can help your business transition to the circular economy.

Read the full blog posts from AWS

AWS Cloud Operations & Migrations Blog

Automate Migration and Maximize Cloud Adoption Benefits with AWS

Automate Servers to Join an Active Directory Domain with AWS Application Migration Service and AWS Systems Manager

AWS Application Migration Service (MGN) helps you quickly migrate source servers from physical, virtual, or cloud infrastructure to run natively on AWS. The post-launch actions feature in MGN allows you to control and automate actions after your servers have been launched in AWS; for example, you can use predefined commands or custom scripts to join servers to an Active Directory domain and set up your IT environment. With MGN, you can migrate servers quickly and securely, automate the migration process, and ensure a seamless transition to the cloud.
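A post-launch action of this kind typically invokes Systems Manager Run Command. A minimal sketch follows; `AWS-JoinDirectoryServiceDomain` is an AWS-managed SSM document, while the directory values and instance ID are hypothetical:

```python
# Sketch of an SSM Run Command request that an MGN post-launch action could
# use to join a newly launched instance to an Active Directory domain.
# All identifiers below are example placeholders.

def join_domain_command(instance_id: str, directory_id: str,
                        directory_name: str, dns_ips: list) -> dict:
    return {
        "InstanceIds": [instance_id],
        "DocumentName": "AWS-JoinDirectoryServiceDomain",
        "Parameters": {
            "directoryId": [directory_id],
            "directoryName": [directory_name],
            "dnsIpAddresses": dns_ips,
        },
    }


cmd = join_domain_command(
    "i-0abc123def4567890", "d-1234567890", "corp.example.com",
    ["10.0.0.10", "10.0.1.10"],
)
# Sent with: boto3.client("ssm").send_command(**cmd)
```

The instance needs the SSM agent and an instance profile permitting Directory Service access for the join to succeed.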

Maximize Cloud Adoption Benefits with a Well-Architected Organizational Culture

Organizational culture is a powerful determinant of transformation success, and this is especially true with cloud transformation. The cloud’s extraordinary capabilities are limited only by the organization’s ability to embrace change and develop the skills needed to access and use the cloud’s capabilities. Understanding the impact of organizational culture on cloud transformation and ensuring that your organization has the right culture for success can help you maximize the benefits of cloud adoption.

Migrate to Amazon Managed Service for Prometheus with the Prometheus Operator

The Prometheus Operator simplifies the process of deploying and managing Prometheus clusters in Kubernetes. This blog post will show you how to use the Prometheus Operator to deploy Prometheus and migrate your monitoring workloads to Amazon Managed Service for Prometheus, making the most of its advanced automation features.

Identify Potential Issues with the Fault Tolerance Analyzer Tool

The resilience of workloads is a shared responsibility between AWS and its customers; while AWS provides the resilient underlying cloud infrastructure, customers maintain the resilience of their applications. The Fault Tolerance Analyzer Tool helps you to identify potential issues with your applications before they arise, allowing you to take corrective action and maintain the resilience of your workloads.

Achieve Testing Success with the AWS Application Migration Service for Painless Cutovers

Using the AWS Application Migration Service can simplify and expedite your migration to AWS. However, testing is essential for a successful migration, and part of that is identifying any dependencies on shared services such as Microsoft Active Directory, integrations with other services, and backup servers. This blog post will focus on four key areas that should be tested before and after a successful migration.

Enhance Observability for Network Load Balancers with Amazon CloudWatch Internet Monitor

Amazon CloudWatch Internet Monitor now provides internet performance and availability measurements for user traffic accessing specific Network Load Balancers. This blog post will show you how to take advantage of this new capability to enhance observability and ensure that your NLBs are responding to user requests in a timely and reliable manner.
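As a sketch, a monitor tracking a specific NLB can be expressed as parameters to the Internet Monitor CreateMonitor API. The monitor name, NLB ARN, and sampling percentage below are hypothetical:

```python
# Sketch of a CloudWatch Internet Monitor configuration that tracks user
# traffic reaching a specific Network Load Balancer. All values are
# example placeholders.

def nlb_monitor_params(name: str, nlb_arn: str) -> dict:
    return {
        "MonitorName": name,
        "Resources": [nlb_arn],             # the NLB to observe
        "TrafficPercentageToMonitor": 100,  # sample all user traffic
    }


params = nlb_monitor_params(
    "nlb-latency-monitor",
    "arn:aws:elasticloadbalancing:eu-west-1:123456789012:"
    "loadbalancer/net/my-nlb/abc123",
)
# Created with: boto3.client("internetmonitor").create_monitor(**params)
```

Once active, the monitor surfaces availability and performance scores for the traffic hitting that NLB, broken down by client location and network.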

Use AWS Systems Manager for VMware Cloud on AWS (VMC) Operations Management

Operating a hybrid cloud environment that spans VMware Cloud on AWS (VMC) and native AWS services can be a challenge. AWS Systems Manager provides a single pane of glass for visualizing and acting on operational data and for deploying automation and controls across your infrastructure. This blog post explains how you can use AWS Systems Manager to simplify and streamline the management of your hybrid cloud environment.

Best Practices for the Custom Lens Lifecycle: Measure and Improve

The AWS Well-Architected Custom Lens allows you to assess the architecture of your applications and workloads against your own best practices. This blog post outlines best practices and resources to help you build, validate, and improve an AWS Well-Architected Custom Lens, and roll it out across your organization.

At KeyCore, our experts are experienced in helping customers create and implement their own AWS Well-Architected Custom Lens. We provide the resources and guidance needed to ensure you have the best practices in place for your specific needs. Contact us today to learn more about how we can help you with your custom lens lifecycle.

Read the full blog posts from AWS

AWS for Industries

Working Backwards to Design a New User Experience for Virtual Engineering Workbenches on AWS

The Amazon Working Backwards mechanism and User Experience (UX) design can help accelerate the adoption of a cloud-based development environment called the Virtual Engineering Workbench (VEW), which is designed to help software developers, integrators, and testers work more efficiently. Working Backwards starts with the user experience and designs a product that meets users' needs, through a process of analyzing user feedback, designing the experience, and creating a prototype. The UX design process then tests that prototype with users to assess usability and gather feedback. Because the process is iterative, it allows rapid changes and improvements, so the development environment can be tailored to users' needs and preferences.

Rethinking Energy with Generative AI

Cloud computing and machine learning (ML) technologies have revolutionized the energy industry. Digitalization has enabled new levels of operational efficiency, productivity, and safety, while also reducing environmental impact. For example, AI-based optimization tools can help to optimize energy usage and energy storage. Additionally, AI can be used to build predictive models for energy production and demand, and to detect anomalies in energy markets. ML systems can be used to automate decision making and streamline operations. By leveraging the power of AI, energy companies can become more efficient and cost-effective.

CGL Facility Management’s Technicians Use an App from AWS Partner QModo AI to Fix Equipment Fast

Companies that manage essential buildings such as healthcare facilities, manufacturing plants, and criminal justice centers have complex management and maintenance needs. They must adhere to strict regulations, compliance, and safety requirements. In order to meet these requirements, CGL Facility Management turned to AWS Partner QModo AI to develop an app to help technicians quickly and accurately diagnose and fix equipment. The app leverages the power of AI to detect anomalies and offer predictive maintenance recommendations. Additionally, the app can be used to manage complex parts inventories and to get real-time updates on the status of repairs.

Introducing AWS HealthScribe – Automatically Generate Clinical Notes from Patient-Clinician Conversations

AWS HealthScribe is a HIPAA-eligible service that can help healthcare software vendors build clinical applications that generate preliminary clinical notes by analyzing patient-clinician conversations. Using AWS HealthScribe, it is possible to create applications that integrate with patient and clinician conversations, understanding the context and extracting relevant information. This can help improve the patient-clinician relationship and reduce administrative burden. Additionally, it can improve the accuracy of care decisions and reduce misunderstandings.
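HealthScribe jobs are started through the Amazon Transcribe API. The sketch below builds the parameters for a `StartMedicalScribeJob` call with boto3; the job name, S3 URIs, and role ARN are hypothetical, and the exact `Settings` shape should be verified against the Transcribe API reference.

```python
def build_scribe_request(job_name: str, media_uri: str,
                         output_bucket: str, role_arn: str) -> dict:
    """Build the parameters for transcribe.start_medical_scribe_job()."""
    return {
        "MedicalScribeJobName": job_name,
        "Media": {"MediaFileUri": media_uri},  # recorded patient-clinician audio
        "OutputBucketName": output_bucket,     # where transcript and notes land
        "DataAccessRoleArn": role_arn,
        # Speaker partitioning so the service can tell clinician from patient;
        # channel identification is an alternative for two-channel recordings.
        "Settings": {"ShowSpeakerLabels": True, "MaxSpeakerLabels": 2},
    }

def start_scribe_job(job_name: str, media_uri: str,
                     output_bucket: str, role_arn: str) -> None:
    """Submit the job; requires AWS credentials, so it is not run here."""
    import boto3
    transcribe = boto3.client("transcribe")
    transcribe.start_medical_scribe_job(
        **build_scribe_request(job_name, media_uri, output_bucket, role_arn)
    )
```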

Integration of On-Premises Medical Imaging Data with AWS HealthImaging

Medical imaging data plays an important role in diagnosing and treating medical conditions. As the demand for diagnostic imaging services increases, healthcare organizations need to be able to access and manage medical imaging data quickly and accurately. AWS HealthImaging is a purpose-built service for medical imaging that can help healthcare organizations manage large volumes of data. It ingests data in the DICOM P10 format and provides APIs for low-latency retrieval. Additionally, HealthImaging can be used to share medical imaging data between healthcare organizations and to securely store medical imaging data in the cloud.
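Ingesting on-premises data boils down to staging DICOM P10 files in S3 and starting an import job. The snippet below sketches that call with boto3; the datastore ID, role ARN, and S3 URIs are hypothetical placeholders.

```python
def build_import_job_request(datastore_id: str, role_arn: str,
                             input_uri: str, output_uri: str) -> dict:
    """Build the parameters for medical-imaging start_dicom_import_job()."""
    return {
        "jobName": "dicom-p10-import",
        "datastoreId": datastore_id,
        "dataAccessRoleArn": role_arn,   # role HealthImaging assumes to read S3
        "inputS3Uri": input_uri,         # S3 prefix holding DICOM P10 files
        "outputS3Uri": output_uri,       # where the job writes its manifest
    }

def start_import(datastore_id: str, role_arn: str,
                 input_uri: str, output_uri: str) -> str:
    """Submit the import job; requires AWS credentials, so it is not run here."""
    import boto3
    mi = boto3.client("medical-imaging")
    resp = mi.start_dicom_import_job(
        **build_import_job_request(datastore_id, role_arn, input_uri, output_uri)
    )
    return resp["jobId"]  # poll get_dicom_import_job() with this ID
```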

Powering Data Management on OSDU Data Platform

Data is an essential asset for energy companies and the OSDU Data Platform is designed to help manage data efficiently and securely. The platform is open-source-driven and provides a comprehensive view of trusted data that follows consistent business rules. Additionally, the platform allows for rapid data exploration, creating a foundation for data-driven decision making. By centralizing data on the OSDU Data Platform, energy companies can save time and resources and make more informed decisions.

Introducing AWS HealthImaging – Purpose-Built for Medical Imaging at Scale

AWS HealthImaging is a purpose-built service that helps builders develop cloud-native applications for storing, analyzing, and sharing medical imaging data at petabyte-scale. The service ingests data in the DICOM P10 format and provides APIs for low-latency retrieval and secure storage. Additionally, the service can be used to share medical imaging data between healthcare organizations. AWS HealthImaging can help healthcare organizations save time and resources by making it easier to manage medical imaging data and to securely store petabytes of data in the cloud.

At KeyCore, our expert team of AWS consultants can help healthcare organizations plan and implement AWS HealthImaging. Whether you’re looking to optimize your existing applications or to use AWS HealthImaging to create new ones, our team can help you get started. Contact us today to learn more about how we can help you make the most of AWS HealthImaging.

Read the full blog posts from AWS

AWS Marketplace

Best Practices for Receiving, Accepting, and Distributing Private Offers in AWS Marketplace

AWS Marketplace is a digital catalog of software products that help customers find, buy, and deploy software solutions on the AWS platform. It offers an extensive portfolio of software products that can be easily subscribed to and used on the AWS platform. Private offers enable companies to customize and manage the purchase and license deployment of products available in AWS Marketplace.

Receiving and Accepting Private Offers in AWS Marketplace

AWS Marketplace allows customers to find, purchase, and use software products on the AWS platform. Customers can search for products available in the marketplace and receive private offers from software vendors. To receive a private offer, customers must log in to AWS Marketplace and click “private offers” in the “all products” tab.

Once the customer receives the offer, they can review it and accept it if it meets their requirements. After accepting the offer, the customer will be able to view and manage their subscription from the AWS Marketplace dashboard.

Scaling Private Offers with Managed Entitlements

Managed entitlements enable customers to scale their private offers with the AWS Marketplace. Customers can use this feature to manage the license entitlements for their private offers. This allows customers to easily manage and track the licenses for their private offers without having to manually manage each individual license.
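Under the hood, managed entitlements surface as licenses in AWS License Manager. The sketch below lists licenses granted to the buyer account with boto3; the `ProductSKU` filter name is an assumption to verify against the License Manager API reference.

```python
def build_license_filter(product_sku: str) -> list[dict]:
    """Filter for a single product; the filter name is an assumption."""
    return [{"Name": "ProductSKU", "Values": [product_sku]}]

def list_received_licenses(product_sku: str) -> list[dict]:
    """Query License Manager; requires AWS credentials, so it is not run here."""
    import boto3
    lm = boto3.client("license-manager")
    resp = lm.list_received_licenses(Filters=build_license_filter(product_sku))
    # Each entry describes one granted license, including its entitlements,
    # which can then be distributed to member accounts.
    return resp.get("Licenses", [])
```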

Enhancing Security with Delegated Administrator Accounts

Delegated administrator accounts provide an additional layer of security for customers using private offers in AWS Marketplace. This feature allows customers to create a separate account for their license entitlements, which can be used to manage and distribute licenses associated with their private offers. By creating a separate account for license entitlements, customers can ensure that their licenses are secure and only available to authorized users.

How KeyCore Can Help

KeyCore is the leading Danish AWS consultancy and can help customers with their software needs in AWS Marketplace. Our team of experienced experts can help customers find the best products for their needs, set up and manage private offers, and ensure that their license entitlements remain secure. KeyCore can also help customers write CloudFormation YAML code for automating and managing their license entitlements, as well as use the AWS SDK for JavaScript v3 for making API calls. Customers can find more information about KeyCore and our offerings on our website.

Read the full blog posts from AWS

Business Productivity

Amazon Chime SDK Enhances Media Pipelines for Streamlined Workflows

On July 24, 2023, the Amazon Chime SDK announced major enhancements to the composition capabilities of media pipelines. These changes allow users to combine multiple video streams and screen sharing into a unified virtual presentation. With the new layout elements, users can overlay videos and create dynamic transitions to further enhance the visual experience.

Advanced Composition Capabilities

The new capabilities allow for advanced composition, including the ability to:

  • Combine multiple video streams
  • Overlay videos
  • Create dynamic transitions between video streams
  • Use ten new layout elements
  • Include annotations in video streams
  • Use a variety of video formats

These enhancements enable users to create unified virtual presentations with greater precision, clarity, and detail. This makes it easier for users to collaborate in real-time.
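A composited recording of this kind is set up by creating a media capture pipeline. The sketch below builds the parameters for a `CreateMediaCapturePipeline` call with boto3; the ARNs are hypothetical, and the artifacts configuration shown (grid layout, full-HD resolution) is one possible shape to verify against the Chime SDK media pipelines API reference.

```python
def build_capture_pipeline_request(meeting_arn: str, sink_bucket_arn: str) -> dict:
    """Build parameters for chime-sdk-media-pipelines
    create_media_capture_pipeline() with composited video."""
    return {
        "SourceType": "ChimeSdkMeeting",
        "SourceArn": meeting_arn,
        "SinkType": "S3Bucket",
        "SinkArn": sink_bucket_arn,      # recording artifacts land here
        "ChimeSdkMeetingConfiguration": {
            "ArtifactsConfiguration": {
                # Mux audio into the single composited video stream instead of
                # writing separate per-attendee artifacts.
                "Audio": {"MuxType": "AudioWithCompositedVideo"},
                "Video": {"State": "Disabled"},
                "Content": {"State": "Disabled"},
                "CompositedVideo": {
                    "Layout": "GridView",
                    "Resolution": "FHD",
                    "GridViewConfiguration": {
                        "ContentShareLayout": "PresenterOnly",
                    },
                },
            }
        },
    }

def start_capture(meeting_arn: str, sink_bucket_arn: str) -> None:
    """Create the pipeline; requires AWS credentials, so it is not run here."""
    import boto3
    pipelines = boto3.client("chime-sdk-media-pipelines")
    pipelines.create_media_capture_pipeline(
        **build_capture_pipeline_request(meeting_arn, sink_bucket_arn)
    )
```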

How KeyCore Can Help

At KeyCore, we help organizations maximize the power of AWS to improve business productivity. Our AWS architects and developers can assist with the setup and integration of the Amazon Chime SDK and provide guidance on how to best leverage the new composition capabilities. Our DevOps teams can set up automated solutions to ensure your media pipelines are configured to meet your specific business needs.

Read the full blog posts from AWS

Front-End Web & Mobile

Using Amplify Studio to Build a Social Network App with Form Builder and Storage

Building an app that is heavily image dependent, like a social media app, requires managing file upload and access effectively. Doing this without the right tools can be difficult and time-consuming. Fortunately, Amazon Web Services (AWS) offers Amplify Studio – a suite of tools that can simplify the process and save developers time.

How Amplify Studio Can Help

Amplify Studio provides a suite of tools that makes it easy to integrate with AWS S3 storage. The Form Builder is part of this package, and it provides an interface for quickly and easily managing files like images. For example, the Form Builder can render an upload button for each file type, allowing users to easily upload images. It can also resize the images to fit the app’s needs, as well as store them in the right format and location.

Benefits of Amplify Studio

Using Amplify Studio to build a social network app with Form Builder and Storage provides a number of advantages. First, it provides an intuitive interface for users, allowing them to easily upload and manage images. It also simplifies the process of integrating with AWS S3 storage, as developers are no longer required to manually set up and configure the services.

Additionally, the Form Builder allows users to store and access images in the right format. This ensures that the app looks and performs the way it is intended to, as well as allows for faster loading speeds.

KeyCore Can Help

At KeyCore, our team of AWS experts can help you build a social network app with Amplify Studio. We provide both professional services and managed services, and our team of experts is highly skilled in AWS. We can provide you with the advice and guidance you need to make sure that your app meets your needs. Contact us today to learn more.

Read the full blog posts from AWS

Innovating in the Public Sector

Innovating in the Public Sector with AWS

Cloud technology has revolutionized the public sector worldwide. AWS has been at the epicenter of this revolution, enabling public sector organizations to reduce costs, innovate, and scale with ease. In this article, we explore how public agencies are leveraging the power of AWS to create new solutions for projects ranging from language translation to food insecurity.

Swindon Borough Council Leverages AI for Translation

Swindon Borough Council serves a multicultural community of 230,000 citizens in the south of England. To ensure that all citizens can access the services they need, the council is constantly researching and experimenting with new technologies. Recently, the council leveraged the power of AI and cloud services for a translation project. The project was a great success, reducing costs by 99.6% compared to traditional translation methods.

Public-Private Partnerships for Food Insecurity

Reducing food insecurity and providing access to healthy foods are top priorities for many governments. In order to accelerate progress, governments, private sector entities, and more are joining forces to rethink how people can make healthy eating both cost-effective and simple. AWS, Amazon Access, and state and local agencies are teaming up in this mission.

Automating Clinical Lab Workflow with the Cloud

BioMark is a digital diagnostics and therapeutics company that provides cutting-edge solutions to over 7,000 clinics and 63 hospitals in Southeast Asia. BioMark utilizes the AWS platform to enable medical professionals to quickly and accurately access lab results. This allows for faster diagnosis and treatment of patients in the countries it serves, including Malaysia, Singapore, the Philippines, Thailand, Brunei, Vietnam, and Indonesia.

How KeyCore Can Help

At KeyCore, we understand the immense value that cloud technology can bring to the public sector. We specialize in AWS consulting, offering both managed and professional services for organizations of all sizes. Our team of experienced AWS consultants can help you develop and implement innovative solutions that leverage the power of the cloud. With our help, you can optimize your public sector operations and reduce costs, all while ensuring the highest levels of security and scalability. Contact us today to get started.

Read the full blog posts from AWS

The Internet of Things on AWS – Official Blog

Introducing the New Disconnected Duration Metric in AWS IoT Device Defender

AWS IoT Device Defender is the AWS service that helps customers protect their IoT devices through improved visibility into, and continuous auditing of, their device fleets. Customers can now take advantage of its new disconnected duration metric to monitor device connectivity status and the duration of disconnections.

What is the Disconnected Duration Metric?

The disconnected duration metric in AWS IoT Device Defender provides customers the ability to monitor the connection status and duration of disconnection of their IoT devices. It is a useful tool for monitoring the performance of their IoT applications and troubleshooting any connectivity issues. The metric is available for customers using AWS IoT Device Defender Detect and customers can create alarms when devices become disconnected.
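One way to alarm on the metric is a Device Defender Detect security profile with a behavior on `aws:disconnect-duration`. The sketch below builds such a behavior with boto3; the profile and behavior names are hypothetical, and the exact criteria shape (seconds expressed via `count`) is an assumption to check against the AWS IoT API reference.

```python
def build_disconnect_behavior(max_seconds: int = 300) -> dict:
    """Behavior that fires when a device stays disconnected too long."""
    return {
        "name": "DeviceDisconnectedTooLong",
        "metric": "aws:disconnect-duration",
        "criteria": {
            "comparisonOperator": "greater-than",
            # Assumed to be the disconnect-duration threshold in seconds.
            "value": {"count": max_seconds},
        },
    }

def create_profile(profile_name: str) -> None:
    """Create the security profile; requires AWS credentials, not run here."""
    import boto3
    iot = boto3.client("iot")
    iot.create_security_profile(
        securityProfileName=profile_name,
        behaviors=[build_disconnect_behavior()],
    )
    # Attaching the profile to a thing group (attach_security_profile) puts the
    # fleet under monitoring; violations can then trigger SNS alarms.
```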

The Advantages of the Disconnected Duration Metric

The new disconnected duration metric offers IoT customers several advantages. It provides improved visibility into the operational performance of their IoT applications. Additionally, customers can use it to troubleshoot any connectivity issues they may experience with their IoT devices, and also to create alarms when devices become disconnected. Furthermore, the metric is available out-of-the-box, eliminating the need for customers to create custom self-managed solutions.

How KeyCore Can Help Customers Utilize the Disconnected Duration Metric

At KeyCore, our AWS Certified team has the expertise to help customers take advantage of the new disconnected duration metric. Our team can help customers set up the metric to monitor their IoT device connectivity status and duration of disconnection. We can also help them to create alarms when devices become disconnected to ensure that any operational issues with their devices are quickly addressed. Furthermore, our team can help customers to optimize their AWS IoT Device Defender Detect setup to ensure that their devices are always running as expected and that any connectivity issues are quickly identified and resolved.

If you are looking for help with the new disconnected duration metric or need assistance setting up AWS IoT Device Defender Detect, our team is here to assist. Contact us today to take advantage of our AWS expertise and to ensure that your devices are always running as expected.

Read the full blog posts from AWS

AWS Open Source Blog

Configure Keycloak On Amazon Elastic Kubernetes Service (EKS) Using Terraform


This blog post explains how to use Terraform to configure open source Keycloak on Amazon EKS, a managed service that makes it easy to run Kubernetes on AWS. We will discuss why Keycloak is a great tool for managing open source applications in the cloud, and how to use Terraform to deploy and configure the software. Lastly, we will discuss how KeyCore can help you manage open source applications in AWS.

What Is Keycloak?

Keycloak is an open source identity and access management solution that allows users to securely authenticate and manage access to applications and services. It enables administrators to control who can access what, when, and from where. Keycloak also has a comprehensive authentication system that supports SAML, OAuth, OpenID Connect, LDAP, and other protocols.

Why Use Keycloak On Amazon EKS?

Running applications on Amazon EKS is an efficient and secure way to manage open source applications in the cloud. Keycloak is an ideal solution for controlling access to open source applications on EKS. Keycloak integrates seamlessly with EKS, allowing users to easily manage authentication and authorization for their applications.

How To Configure Keycloak On Amazon EKS Using Terraform

Terraform is one of the most popular Infrastructure-as-Code (IaC) tools and can be used to deploy and configure Keycloak on EKS. With Terraform, users can quickly configure Keycloak and ensure that all required resources are created and configured correctly.

First, users will need to create an Amazon EKS cluster. This can be done through the Amazon EKS console or via the Terraform AWS provider. Once the cluster has been created, users will need to configure Keycloak to use the cluster. This can be done using the Terraform Kubernetes provider. The provider allows users to deploy and configure Keycloak on EKS, ensuring that all required resources are created correctly.
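As a hedged configuration sketch of the second step, the fragment below installs Keycloak onto an existing EKS cluster via the Terraform Helm provider. The Bitnami chart, namespace, and values are illustrative assumptions, not taken from the blog post.

```hcl
# Hypothetical sketch: Keycloak on an existing EKS cluster via the Helm provider.

variable "cluster_name" {
  type = string
}

data "aws_eks_cluster" "this" {
  name = var.cluster_name
}

data "aws_eks_cluster_auth" "this" {
  name = var.cluster_name
}

provider "helm" {
  kubernetes {
    host                   = data.aws_eks_cluster.this.endpoint
    cluster_ca_certificate = base64decode(data.aws_eks_cluster.this.certificate_authority[0].data)
    token                  = data.aws_eks_cluster_auth.this.token
  }
}

resource "helm_release" "keycloak" {
  name             = "keycloak"
  repository       = "https://charts.bitnami.com/bitnami"
  chart            = "keycloak"
  namespace        = "keycloak"
  create_namespace = true

  set {
    name  = "proxy"
    value = "edge" # Keycloak sits behind a load balancer that terminates TLS
  }
}
```

After `terraform apply`, realms, clients, and authentication rules can be managed in the Keycloak admin console or with the community Keycloak Terraform provider.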

Once Keycloak is configured, users can use it to manage authentication and authorization for their applications on EKS. For example, users can create authentication rules to control who can access what applications and services.

KeyCore: Your Partner for Managing Open Source Applications in AWS

At KeyCore, we provide both professional services and managed services to help you manage open source applications in AWS. Our team of AWS-certified experts can help you deploy and configure Keycloak on EKS, as well as provide assistance with other open source applications.

We are committed to helping you get the most out of your open source applications. Whether you need help configuring Keycloak on EKS or managing other open source applications in AWS, KeyCore is here to help. Contact us today to discuss how we can help you manage open source applications in AWS.

Read the full blog posts from AWS
