Summary of AWS Blogs for the Week of Monday, May 8, 2023

In the week of Monday, May 8, 2023, AWS published 97 blog posts. Here is an overview of what happened.

Topics Covered

Desktop and Application Streaming

What’s New in Desktop and Application Streaming?

Leostream Announces Support for Amazon WorkSpaces Core

Today, the Leostream Platform announced support for Amazon WorkSpaces Core. With this new solution, organizations can provision, deploy, and manage Virtual Desktop Infrastructure (VDI) powered by WorkSpaces Core directly from the Leostream Remote Desktop Access Platform. This new solution combines Leostream’s platform with the security, global reliability, and cost efficiency of the AWS infrastructure.

Additionally, users benefit from improved flexibility since no two organizations are required to use the same combination of infrastructure, application delivery, and security policies. This new capability offers organizations the ability to quickly deploy and manage VDI, while also ensuring that users have secure access to the applications they need.

Gartner Digital Workplace Summit 2023 (EMEA)

It’s an exciting time in the End User Computing (EUC) industry. Organizations are quickly adopting digital workplace technology to meet the needs of their hybrid and remote employees. Not only do they need the right tools to do their job, they also need to ensure they’re getting the most out of their investments.

The Gartner 2023 EMEA Digital Workplace Summit is the perfect event for these organizations. At the summit, attendees will get an up-close look at the latest digital workplace trends, as well as best practices for deploying and managing digital workplace solutions. The event will also feature workshops, panel discussions, and keynotes from industry experts.

Using Serverless AWS Services as an External Authenticator for NICE DCV

NICE DCV is a high-performance remote display protocol that provides additional authentication flexibility through DCV external authentication. In order for a user to gain access to a secure DCV session stream, they must be authenticated against the display protocol. Typically, DCV uses system authentication, which delegates authentication to the underlying operating system.

With the use of external authentication, customers can use serverless AWS services such as Amazon Cognito and Amazon API Gateway to authenticate users against an authentication service of their choice. This provides organizations with the flexibility to authenticate against custom systems such as LDAP or Active Directory, while still taking advantage of the security and scalability of the AWS infrastructure.
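As a sketch of how such an external authenticator might look, the following AWS Lambda handler (fronted by Amazon API Gateway) validates a DCV authentication token and returns the XML verdict the DCV server expects. The token store, names, and the exact request/response shape are illustrative assumptions; consult the NICE DCV documentation for the precise external-authentication contract.

```python
import base64
import urllib.parse

# Hypothetical token store. A real deployment would validate the token
# against Amazon Cognito, LDAP, Active Directory, or another provider.
VALID_TOKENS = {"token-123": "alice"}

def handler(event, context):
    """API Gateway proxy handler acting as a DCV external authenticator.

    Assumes DCV POSTs `sessionId` and `authenticationToken` as form data
    and expects an XML body indicating whether access is granted.
    """
    body = event.get("body") or ""
    if event.get("isBase64Encoded"):
        body = base64.b64decode(body).decode("utf-8")
    params = urllib.parse.parse_qs(body)
    token = params.get("authenticationToken", [""])[0]

    username = VALID_TOKENS.get(token)
    if username:
        xml = f'<auth result="yes"><username>{username}</username></auth>'
    else:
        xml = '<auth result="no"/>'
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "text/xml"},
        "body": xml,
    }
```

Because the handler is stateless, it scales automatically with login traffic and incurs cost only when users authenticate.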

At KeyCore, our team of AWS experts are highly experienced in helping organizations maximize the performance and cost savings of their digital workplace solutions. We can help you set up secure authentication for NICE DCV and any other cloud-native applications you may be running on the AWS platform. Contact us today to learn more.

Read the full blog posts from AWS

AWS DevOps Blog

S3 URI Parsing is Now Available for Java 2.x

The AWS SDK for Java team is pleased to announce the general availability of Amazon Simple Storage Service (Amazon S3) URI parsing in the AWS SDK for Java 2.x. With the new parseUri() API and S3Uri class, developers can now easily retrieve the bucket, key, region, style, and query parameters from path-style and virtual-hosted-style S3 URIs. This feature simplifies working with S3 resources in a Java environment.

What is S3 URI Parsing?

S3 URI Parsing is a method for developers to quickly access the components of an S3 URI: the bucket, key, region, style, and query parameters. It is particularly useful for applications that work with multiple S3 resources, as it helps to organize and access the components quickly. Additionally, with the parseUri() API, it is now possible to create an S3Uri object from a string containing an S3 URI.

How Does S3 URI Parsing Help?

S3 URI parsing can help developers quickly access the components of an S3 URI. It is highly effective for applications that work with multiple S3 resources – for example, an application that uses multiple buckets for different data sources. With the new API, developers can easily retrieve the bucket, key, region, style, and query parameters from within their programs. Additionally, the API provides a simplified way to access the components of the URI, instead of manually parsing the string for the required information.
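To make the decomposition concrete, here is a small Python sketch of the same idea. The real feature is the Java 2.x parseUri() API returning an S3Uri object; this illustrative function simply shows which components the two addressing styles carry and where they live in the URI.

```python
from urllib.parse import urlparse, parse_qs

def parse_s3_uri(uri):
    """Split an S3 URI into bucket, key, region, style, and query params.

    Illustrative stand-in for the Java 2.x S3Utilities.parseUri() /
    S3Uri API, covering the two common addressing styles.
    """
    parsed = urlparse(uri)
    host = parsed.netloc.split(".")
    path = parsed.path.lstrip("/")
    query = parse_qs(parsed.query)

    if host[0] == "s3" or host[0].startswith("s3-"):
        # Path-style: https://s3.<region>.amazonaws.com/<bucket>/<key>
        style = "path"
        region = host[1] if host[1] != "amazonaws" else None
        bucket, _, key = path.partition("/")
    else:
        # Virtual-hosted-style: https://<bucket>.s3.<region>.amazonaws.com/<key>
        style = "virtual-hosted"
        bucket = host[0]
        region = host[2] if host[2] != "amazonaws" else None
        key = path
    return {"bucket": bucket, "key": key, "region": region,
            "style": style, "query": query}
```

For example, `parse_s3_uri("https://my-bucket.s3.eu-west-1.amazonaws.com/a/b.csv?versionId=x1")` yields the bucket, key, region, and version query parameter without any manual string surgery.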

What Are the Benefits of S3 URI Parsing?

The main benefit of S3 URI parsing is the ability to quickly access the components of an S3 URI. This makes it easier for developers to work with multiple S3 resources in a single application. Additionally, the API helps to simplify the process of accessing the components of the URI, as it eliminates the need to manually parse the string for the required information.

How Does KeyCore Help?

At KeyCore, we are experts in AWS and providing professional and managed services. Our team of AWS certified professionals can help you make sure your applications are making the most of S3 URI Parsing. Whether you need help setting up the API or integrating it into an application, our team can provide the expertise and industry best practices to make sure your application is running smoothly.

Read the full blog posts from AWS


Automated and Simplified SAP HANA Backups with AWS Backup

SAP HANA workloads running on Amazon Web Services (AWS) are essential for Enterprise operations, handling critical business processes including finance, procurement, and payroll. To protect and ensure the reliability of data within these systems, an effective backup and restore approach is necessary. AWS Backup provides a convenient and automated solution to this requirement, and this article will provide an introduction to how to use it.

Overview of AWS Backup

AWS Backup is a service that helps customers automate and manage backups of their AWS resources, such as Amazon Elastic Compute Cloud (EC2) instances, Elastic Block Store (EBS) volumes, Amazon RDS databases, and Amazon Redshift clusters. In addition, AWS Backup enables customers to set up policies to automate backups and management of their data.

AWS Backup is integrated with several AWS services, such as Amazon EBS, Amazon DynamoDB, and Amazon RDS. This integration helps customers automatically back up their data according to the defined policies. Moreover, AWS Backup provides customers with a unified console to manage their backups, as well as track their storage consumption.

Using AWS Backup for SAP HANA

AWS Backup can be used to take regular backups of SAP HANA databases running on AWS. AWS Backup helps customers configure data policies for SAP HANA backups, including regular backups and on-demand backups. In addition, customers can use AWS Backup to retain backups for a specified period of time.

AWS Backup is integrated with the AWS Management Console, allowing customers to quickly and easily manage their SAP HANA backups. In addition, customers can use AWS Backup to monitor their storage consumption and make sure their backups are compliant with their compliance requirements.
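As a minimal sketch of what such a policy looks like, the backup plan below defines a daily rule with a 35-day retention period. The plan, vault, and schedule values are illustrative assumptions; a dict of this shape is what you would pass to boto3's `create_backup_plan`, with a separate backup selection assigning the SAP HANA resources to the plan.

```python
# Illustrative AWS Backup plan definition (names and schedule are
# hypothetical). Passing this dict to
# backup_client.create_backup_plan(BackupPlan=sap_hana_backup_plan)
# would create the plan.
sap_hana_backup_plan = {
    "BackupPlanName": "sap-hana-daily",
    "Rules": [
        {
            "RuleName": "daily-full-backup",
            "TargetBackupVaultName": "sap-hana-vault",
            # Run every day at 02:00 UTC.
            "ScheduleExpression": "cron(0 2 * * ? *)",
            "StartWindowMinutes": 60,
            "CompletionWindowMinutes": 240,
            # Retain each backup for 35 days.
            "Lifecycle": {"DeleteAfterDays": 35},
        }
    ],
}
```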

How KeyCore Can Help

KeyCore is a leading Danish AWS consultancy that provides both professional services and managed services. Our extensive experience with AWS Backup enables us to help customers configure their SAP HANA backups and ensure their data is securely and reliably stored. KeyCore can also help customers with their backup and restore needs, ensuring their data is always safe and secure.

With our help, customers can rest assured that their data is protected and reliably backed up with AWS Backup. Contact us today to learn more about how KeyCore can help you maximize the value of AWS Backup for SAP HANA.

Read the full blog posts from AWS

Official Machine Learning Blog of Amazon Web Services

Accelerating Machine Learning Workflows with Amazon Web Services

Amazon Web Services (AWS) provides a wide range of services to help businesses quickly and easily develop, train, and deploy machine learning (ML) models. With Amazon SageMaker, customers can quickly spin up and collaborate on notebooks, unlock insights from data stored in Amazon S3 using Amazon Kendra, and operationalize ML models in production using the Amazon SageMaker Model Registry. Additionally, customers can utilize TensorBoard for hosted debugging, reduce inference cost with AWS Graviton, and use Amazon QuickSight to publish predictive dashboards.

Accelerate Model Training with Debugging and Optimization

Amazon SageMaker comes with two options to spin up fully managed notebooks for exploring data and building ML models: fast-start, collaborative notebooks accessible within Amazon SageMaker Studio—a fully integrated development environment (IDE) for ML; and Amazon SageMaker Canvas, a visual interface that enables business analysts to generate accurate ML predictions on their own—without requiring any ML experience or having to write any code.

To help data scientists identify and remediate model training issues to meet accuracy targets for production deployment, AWS has introduced a hosted TensorBoard experience within Amazon SageMaker. The toolkit allows data scientists to visualize and analyze various ML training jobs by leveraging the TensorFlow and Keras APIs.

AWS is also contributing to Project Jupyter, a multi-stakeholder, open-source project that builds applications, open standards, and tools for data science, ML, and computational science. With the Amazon SageMaker JupyterLab extension, you can now schedule notebooks from any JupyterLab environment for batch jobs, scaling up ML workloads.

Reduce Inference Cost and Publish Predictive Dashboards

Amazon SageMaker provides a broad selection of ML infrastructure and model deployment options to help meet inference needs. Additionally, AWS Graviton can help reduce inference cost. To further help with MLOps tools, Amazon QuickSight can be used to publish predictive dashboards.

Additionally, Amazon SageMaker Serverless Inference provides the ability to serve model inference requests in real time without having to explicitly provision compute instances or configure scaling policies to handle traffic variations. AWS has also announced provisioned concurrency for Amazon SageMaker Serverless Inference.
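To illustrate, the endpoint configuration below shows where provisioned concurrency sits in a serverless inference setup. The model and config names are hypothetical; a request of this shape is what you would pass to boto3's `create_endpoint_config`.

```python
# Illustrative SageMaker Serverless Inference endpoint configuration
# (names are hypothetical). Would be passed to
# sagemaker_client.create_endpoint_config(**request).
request = {
    "EndpointConfigName": "my-serverless-config",
    "ProductionVariants": [
        {
            "VariantName": "AllTraffic",
            "ModelName": "my-model",
            "ServerlessConfig": {
                "MemorySizeInMB": 2048,   # memory allocated per worker
                "MaxConcurrency": 20,     # scale-out ceiling
                # Keep 5 workers warm to avoid cold starts.
                "ProvisionedConcurrency": 5,
            },
        }
    ],
}
```

Provisioned concurrency must not exceed the variant's maximum concurrency, since it reserves a subset of that capacity as always-warm workers.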

Deploy Generative AI and Analyze Unstructured Healthcare Data

AWS is working to help democratize generative AI. As part of this effort, AWS has shown how to deploy ESMFold, an open-source protein language model for structure prediction, on Amazon SageMaker.

To analyze unstructured healthcare data, customers can use Amazon HealthLake, an ML-powered service that transforms, analyzes, and discovers insights from unstructured healthcare data.

Secure MLflow with AWS Native Services

Customers already using MLflow, an open-source platform for managing ML workflows, can secure their environment with AWS native services. Customers can also host ML models on Amazon SageMaker using NVIDIA Triton Inference Server, taking advantage of its Python backend and TensorRT model support.

How KeyCore can Help

At KeyCore, we are confident that our professional services and managed services can help businesses quickly and easily develop, train, and deploy ML models on AWS. With our expertise in AWS, we can develop custom solutions tailored to your specific needs and help you leverage ML models to unlock business insights. Our team of experts also provide deep expertise in MLflow, helping you build and secure ML workflows. Contact us to find out how we can help you get the most out of AWS.

Read the full blog posts from AWS

Announcements, Updates, and Launches

AWS is continually making improvements and introducing new products to their suite of offerings. Recently, AWS announced a Serverless Innovation Day, the introduction of Aurora I/O-optimized Cluster Configuration, and Storage-optimized Amazon EC2 I4g Instances. Additionally, they had a Week in Review summarizing the most recent innovations.

AWS Serverless Innovation Day

On Wednesday, May 17th, AWS is hosting a free virtual event, AWS Serverless Innovation Day. During the event, attendees can learn from customers, experts, and leaders how to modernize applications using AWS Serverless technologies and event-driven architectures. This event is a great opportunity to empower builders and technical decision-makers with knowledge of AWS Serverless technologies such as AWS Lambda, Amazon Elastic Container Service, and more.

Aurora I/O-Optimized Cluster Configuration

AWS has introduced an I/O-Optimized Cluster Configuration for Amazon Aurora. Hundreds of thousands of customers have already chosen Aurora to run their demanding applications due to its high performance, availability, and MySQL/PostgreSQL compatibility. This new configuration offers customers up to 40% cost savings for I/O-intensive applications.

Storage-Optimized Amazon EC2 I4g Instances

AWS has also introduced new storage-optimized Amazon EC2 I4g Instances, powered by AWS Graviton2 processors. These instances offer up to 15% better compute performance than their other storage-optimized instances, with up to 64 vCPUs, 512 GiB of memory, and 15 TB of NVMe storage. These new instances are ideal for storage-intensive applications.

AWS Week in Review

The AWS Week in Review summarizes recent developments in the AWS ecosystem. In the most recent review, they discussed AWS user notifications, the upcoming Serverless event, and more. The review provides a great overview of the latest AWS features and is a great resource for keeping up with the ever-evolving AWS suite.

At KeyCore, our AWS certified consultants can help you take advantage of the many AWS innovations. Our experienced consultants can assist with everything from serverless architectures to storage-intensive applications. Contact us today to see how we can help you make the most of AWS.

Read the full blog posts from AWS


Container Platform Modernization with Amazon EKS, Karpenter, and Dagster

Condé Nast Modernizes their Container Platform on Amazon EKS

Condé Nast is a global media company that is home to iconic brands such as Vogue, GQ, AD, and many more. They started their journey in containerized applications in 2014 and have continued to modernize their approach to containerized applications ever since.

Using Amazon Elastic Kubernetes Service (Amazon EKS), Condé Nast was able to reach their goals of being highly available, cost efficient, and secure. Amazon EKS also provides an automated management mechanism for their containers, which allowed them to easily deploy and manage new applications. With their container platform on Amazon EKS, they are able to scale more quickly and cost-effectively than if they were managing the platform themselves.

Sentra Maximizes Cost-Efficiency Using Amazon EKS

Sentra is leveraging Amazon EKS and other AWS services to maximize cost-efficiency while minimizing operational overhead. Sentra uses Karpenter, an open-source Kubernetes node provisioning and autoscaling tool, together with Dagster, a cloud-native data orchestrator, to run efficient and scalable data workflows and processing workloads on AWS Fargate and Amazon EC2 Spot Instances. This allows them to pay only for the compute they need and to scale up or down as demand changes.

Amazon EKS provides an easy to use and secure way to manage and scale their container-based applications. This allows them to rapidly deploy and manage their applications without the need for manual intervention.

AWS Lambda for the Containers Developer

When deciding between AWS Lambda and a containers product such as Amazon ECS or Amazon EKS, there are many factors to consider, such as cost, scaling properties, and application type.

AWS Lambda is a serverless compute service that runs code in response to events and automatically manages the compute resources required. This can reduce costs and increase scalability compared with running a container-based application.

However, it may not be the right choice for all applications since it is designed to run stateless applications. For applications that need an entire development stack, containers are a better option since they can provide a more comprehensive development environment.
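The stateless shape Lambda favors can be sketched in a few lines. This minimal handler is illustrative, not taken from the blog post; the point is that each invocation is independent, with any state kept outside the function (for example in Amazon S3 or DynamoDB).

```python
import json

def handler(event, context):
    """A minimal, stateless AWS Lambda handler.

    Each invocation is independent, so the service can scale function
    instances up and down automatically without coordination.
    """
    name = (event or {}).get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

A workload that instead needs long-lived processes, local state, or a full development stack maps more naturally onto Amazon ECS or Amazon EKS.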

At KeyCore, we can help you make the right decision for your application. Our team of AWS Certified Solutions Architects and DevOps Engineers have the experience and expertise to help you implement the right solution for your applications. Contact us today to learn more about how we can help you.

Read the full blog posts from AWS

AWS Quantum Technologies Blog

Exploring Quantum Chemistry Applications with Tangelo and QEMIST Cloud using Amazon Braket

Amazon Braket is an ideal platform to explore the potential of quantum computing for non-trivial molecular systems. In this blog post, we will show how our partner Good Chemistry can help to design experiments that can be run on Amazon Braket.

Using Tangelo for quantum chemistry simulations

Good Chemistry provides Tangelo, a quantum chemistry simulation tool, which can be used to explore the behaviour of molecules in the quantum realm. This is done by solving the Schrödinger equation, which is the core equation used to describe the behaviour of quantum systems. By using Tangelo, users can perform calculations that would previously have been impossible to do on classical computers.

QEMIST Cloud for running experiments on Amazon Braket

Good Chemistry also provides QEMIST Cloud, a cloud-based platform for running experiments on Amazon Braket. It provides users with an intuitive interface for setting up and running experiments on AWS, and makes it easy to store and analyse data from Amazon Braket. With QEMIST Cloud, users can quickly design and execute experiments that would have been too complex or time consuming to do on a classical computer.

How KeyCore can help

At KeyCore, we are experts in cloud computing and AWS solutions. We provide professional services and managed services to help our customers design and deploy Amazon Braket experiments with Tangelo and QEMIST Cloud. Moreover, we are able to provide the necessary consulting, training and development services to help customers get up and running quickly.

Our team of experienced AWS consultants can help you get the most out of Amazon Braket and other AWS solutions. We can provide you with the expert advice and guidance you need to design and deploy your quantum computing experiments. Get in touch with us today to find out more about our services and how we can help you take your quantum computing experiments to the next level.

Read the full blog posts from AWS

AWS Smart Business Blog

Modernizing with Serverless Computing unlocks VBA Software’s Potential

Introducing VBA Software

VBA is a technology firm based in Wisconsin, US, that provides comprehensive software solutions to the health insurance industry. Their flagship offering, VBA Software, provides customers with secure and flexible claims processing. This makes it a great choice for small and medium businesses (SMBs) looking to modernize their systems and unlock new opportunities.

Serverless Computing Speeds Up Claims Processing

Through modernization and the use of serverless computing, VBA has been able to speed up claims processing time from minutes to seconds. By breaking down the claim into smaller tasks and utilizing serverless computing, VBA Software is able to quickly process thousands of claims and provide an efficient service to their customers.

Business Intelligence helps SMBs stay competitive

Business intelligence (BI) is increasingly important for businesses of all sizes, particularly SMBs. Data analytics tools can help them gain a competitive edge: SMBs that use BI are twice as likely to report revenue growth as those that do not.

KeyCore can help Your Business Modernize

At KeyCore, we provide both professional services and managed services for AWS. We understand the importance of modernization for any business, and our experienced team of consultants can help you make the most of AWS’s technology. Get in touch with us today to find out how we can help you unlock the full potential of your business.

Read the full blog posts from AWS

Official Database Blog of Amazon Web Services

Automating Database Benchmark Tests for Amazon Aurora PostgreSQL

Optimizing a database is a key part of managing application workloads, and benchmark tests can help with this process. Amazon Aurora PostgreSQL-Compatible Edition makes it easy to conduct these tests. This article will provide an overview of the steps needed to automate benchmark tests for Amazon Aurora PostgreSQL.

Steps to Automate Benchmark Tests

The first step in automating benchmark tests is to set up the benchmark environment. This includes connecting to the database and running the benchmark. To ensure benchmark tests are accurate, you should also set parameters such as the number of concurrent users, the type of queries, and the amount of data.

Once the environment is set up, you can start the benchmark tests. You’ll need to run the query multiple times to get accurate results. You can also use AWS services such as Amazon CloudWatch or Amazon EventBridge to monitor the results.

Finally, you can analyze the results of the benchmark tests. This includes looking at the performance of the database, the cost and scalability of the database, and the security and reliability of the database.


By automating the benchmark tests for Amazon Aurora PostgreSQL, you can ensure that your database is optimized for your application workloads. With the right environment, query parameters, and monitoring tools, you can easily run benchmark tests and analyze the results.
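The "run the query multiple times and summarize" step above can be sketched as a small harness. This is an illustrative skeleton, not the approach from the blog post: `run_query` stands in for a callable that would execute a SQL statement against the Aurora PostgreSQL endpoint (for example via psycopg2), so the harness itself stays self-contained.

```python
import statistics
import time

def run_benchmark(run_query, iterations=50):
    """Run a query callable repeatedly and summarize its latency.

    Returns median (p50), p95, and mean latency in seconds, which can
    then be pushed to Amazon CloudWatch as custom metrics.
    """
    latencies = []
    for _ in range(iterations):
        start = time.perf_counter()
        run_query()
        latencies.append(time.perf_counter() - start)
    latencies.sort()
    return {
        "iterations": iterations,
        "p50_s": statistics.median(latencies),
        "p95_s": latencies[int(0.95 * (iterations - 1))],
        "mean_s": statistics.fmean(latencies),
    }
```

Repeating the run before and after a configuration change gives a like-for-like comparison of the database's performance under the same workload.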

At KeyCore, we have extensive experience with automating benchmark tests for Amazon Aurora PostgreSQL. Our team of certified AWS professionals can help you set up the benchmark environment, run the tests, and analyze the results. Contact us today to learn more about how we can help with your database benchmarking needs.

Read the full blog posts from AWS

AWS for Games Blog

Exploring AWS GameTime – a New Way to Engage Your Players

AWS GameTime is a new game streaming show on Twitch, dedicated to bringing insights, advice and best practices to you by connecting you to the gaming experts at Amazon Web Services (AWS). Contributors to the show include Camille Jubin, an Account Executive for France with AWS for Games; Darren Ko, a Solutions Architect from the UK/Ireland; Darshi McKenzie, an Account Executive for AWS for Games; Florencia Huart, a Senior Account Manager for AWS for Games; Taras Mogetich, an EMEA Senior Business Development Representative for AWS for Games; and Tristan Greaves, a Solutions Architect from AWS.

The show is hosted by Richard Lattle, a Solutions Architect from AWS. During the show, experts from a variety of gaming industries give advice to players on how to use AWS to get the most out of their games.

The show covers a variety of topics related to gaming, such as improving performance, game optimization, and game monetization. In addition, the show also takes a look at new trends in gaming, such as cloud gaming and streaming. The show also provides advice on the best practices for launching and managing games on AWS.

In addition to the show, AWS GameTime also provides a library of resources that gamers can use to learn more about AWS and how to optimize their gaming experience.

At KeyCore, we understand how important performance and monetization are for game developers. We offer solutions to help developers reduce costs, optimize their games for faster performance, and ensure their games are compliant with the AWS platform. Our team of experts can help you evaluate and implement the right AWS services to power your games. Contact us today to learn more about how KeyCore can help you get the most out of your game.

Read the full blog posts from AWS

AWS Training and Certification Blog

April 2023 saw the release of 15 digital training products on AWS Skill Builder, including AWS Jam Journey: Security re:Invent and a new AWS Builder Lab for AWS Skill Builder subscribers, and three learning plans (with digital badges) for cloud essentials, Media and Entertainment, and Amazon Connect. Additionally, the AWS Certified Security – Specialty exam was updated, and learners who complete all 12 assignments in AWS Cloud Quest: Cloud Practitioner can earn a 25% off exam voucher for AWS Certified Cloud Practitioner.

Transitioning from No Cloud Experience to Three AWS Certifications

Anastasiia, a Ukrainian refugee, transitioned from her small business (a beauty salon) to build the skills and AWS Certifications to start a new career in cloud DevOps. Her journey can be a valuable example for those seeking to make a similar transition.

Anastasiia started by researching and becoming familiar with the cloud, and soon decided to focus on AWS. She then decided to aim for three AWS Certifications: AWS Certified Cloud Practitioner, AWS Certified Solutions Architect – Associate and AWS Certified Solutions Architect – Professional.

Anastasiia utilized ITSkills4U to prepare for the certifications. The resources she used included AWS whitepapers, AWS documentation, Udemy courses, AWS Training and Certification practice exams, and the AWS Learning Library. She also made use of AWS Forums, AWS Builders and AWS User Groups to answer questions, network, and share experiences.

10 study resources for the AWS Certified: SAP on AWS – Specialty exam

The AWS Certified: SAP on AWS – Specialty exam is intended for individuals in roles that require experience with both SAP and AWS. It is designed to validate the candidate’s ability to design, implement, migrate, and operate the broad range of SAP workloads on AWS, adhering to the AWS Well-Architected Framework and SAP certification and support requirements.

Here are 10 study tips to help you prepare for the AWS Certified: SAP on AWS – Specialty exam:

  • Read the AWS Whitepapers
  • Understand AWS Well-Architected Framework
  • Attend AWS Trainings
  • Join AWS User Groups
  • Follow AWS Blogs and Social Media
  • Familiarize yourself with the Certification Exam
  • Understand the Exam Objectives
  • Create a Study Plan
  • Practice with the AWS Learning Library
  • Attempt AWS Practice Exams

KeyCore can help with AWS training and certifications

At KeyCore, we provide advanced professional services and managed services for AWS cloud deployments. Our team is highly experienced in AWS, with a deep understanding of cloud security, infrastructure, architecture, and DevOps best practices.

We offer tailored AWS training and certification programs, designed to help your team build the necessary skills to leverage the power of the cloud. Our training courses cover all levels of expertise and are suitable for both novice and experienced professionals. We also offer AWS Certification support, providing guidance and mock exams to help candidates prepare for the exams and increase their chances of success.

Reach out to learn more about our AWS training and certification services and how we can help your team become AWS certified.

Read the full blog posts from AWS

Microsoft Workloads on AWS

Do AWS Customers Benefit from 64KB Block Size for SQL Server Storage?

Many people believe that 64KB block size is beneficial for Microsoft SQL Server performance on Amazon Web Services (AWS). To test this belief, this blog post will compare the performance of SQL Server on Amazon EBS and Amazon FSx for Windows File Server using block sizes from 16KB to 64KB.

The tests compared write performance and storage cost. With Amazon EBS, the 64KB block size delivered good performance at a slightly higher storage cost. With Amazon FSx, performance improved somewhat, but the storage cost was considerably higher.

Overall, the results indicated that 64KB block size does offer some benefit for SQL Server performance, but that the associated cost increase may not be worth it for some users.

Optimizing Performance and Reducing Licensing Costs: Leveraging AWS Compute Optimizer for Amazon EC2 SQL Server Instances

AWS Compute Optimizer has recently added a new feature that uses machine learning to detect Microsoft SQL Server workloads running on Amazon EC2. This feature helps identify and optimize these workloads, providing an opportunity to improve performance and reduce licensing costs.

Rightsizing is one of the key components of optimizing performance and costs. AWS Compute Optimizer can help identify the most optimal instance type for your SQL Server workloads with respect to memory, storage, and compute power.

Moreover, AWS Compute Optimizer can help you optimize license costs by identifying whether SQL Server Enterprise edition is actually required for a workload, ensuring that you are not spending more than necessary on licenses.

KeyCore can help you optimize your SQL Server workloads running on AWS with AWS Compute Optimizer. We can provide guidance on how to use the tool, and assist you in acquiring the necessary resources. Contact us today to learn more.

Read the full blog posts from AWS

Official Big Data Blog of Amazon Web Services

Amazon Web Services: Overview of Recent Updates

Amazon OpenSearch Service Under the Hood: Multi-AZ with Standby

Amazon OpenSearch Service now offers Multi-AZ with Standby, a deployment option designed for business-critical workloads. This option provides improved reliability and helps simplify cluster configuration and management. It also makes clusters more resilient to infrastructure failures like hardware or networking failure, and helps achieve 99.99% availability.

Perform Secure Database Write-Backs with Amazon QuickSight

Amazon QuickSight is a scalable, serverless, ML-powered business intelligence solution to connect to data, create interactive dashboards, access ML-enabled insights, and share visuals and dashboards with users. You can now use QuickSight for secure database write-backs.

Ten New Visual Transforms in AWS Glue Studio

AWS Glue Studio makes it easy to create, run, and monitor ETL jobs in AWS Glue. You can now use it to compose data transformation workflows with nodes to represent different data handling steps. AWS Glue Studio recently added ten new visual transforms.

Use SAML Identities for Programmatic Access to Amazon OpenSearch Service

Customers of Amazon OpenSearch Service can now use SAML identities for programmatic access. This applies to identity providers that support SAML 2.0, including ADFS, Okta, and AWS Single Sign-On (SSO).

Scale Your AWS Glue for Apache Spark Jobs with New Larger Worker Types G.4X and G.8X

Hundreds of thousands of customers use AWS Glue, a serverless data integration service, for analytics, ML, and application development. Now, you can use larger worker types G.4X and G.8X to scale your AWS Glue for Apache Spark jobs. Each DPU provides 4 vCPU, 16 GB memory, and 64 GB disk.

New Scatter Plot Options in Amazon QuickSight to Visualize Your Data

Scatter plots are a powerful visual type to identify patterns, outliers, and the strength of relationships between variables. Amazon QuickSight now offers scatter plot features to take your correlation analysis to the next level.

KeyCore Can Help!

At KeyCore, we provide professional and managed services for AWS users. Our team of experts can help you take full advantage of these features to get the most out of your data analysis and cloud solutions. Contact us today to learn more!

Read the full blog posts from AWS

Networking & Content Delivery

Networking & Content Delivery Track Sessions at AWS Summit Washington DC 2023

The AWS Summit in Washington, DC returns in 2023, with a program tailored to the needs and interests of the public sector. This post breaks down the Networking track’s Breakout, Chalk Talk, Builder’s session, and Workshop sessions, so you can plan your agenda. Taking place in person at the Walter E. Washington Convention Center, the Summit provides an opportunity to learn about the latest updates and breakthroughs in cloud technology.

Breakout Sessions

The Breakout sessions delve into the latest AWS Networking and Content Delivery innovations. Covering topics from the basics of Amazon VPC, to advanced network architectures, the Breakout sessions offer comprehensive coverage of the AWS Networking and Content Delivery services. Attendees will learn how to build secure and resilient cloud networks, and how to extend their network and content delivery capabilities with AWS.

Chalk Talks

The Chalk Talks provide an opportunity to learn directly from the experts. Led by AWS Solutions Architects, these interactive sessions focus on the core features of AWS Networking and Content Delivery services. Attendees will discuss real-world use cases, best practices, and solutions to common challenges.

Builder’s Session

The Builder’s session takes a deep dive into hands-on exercises and workshops. Attendees will get the chance to build their own solutions using Amazon VPC, Amazon Route 53, AWS Transit Gateway, and the AWS Networking and Content Delivery services. The Builder’s session is a great way to gain experience in the latest solutions and services.

Workshop Sessions

The Workshop sessions are designed to provide attendees with the tools and resources needed to take their AWS Networking and Content Delivery solutions to the next level. Attendees will learn how to design and build complex network architectures, how to improve single-page application (SPA) performance with a same domain policy using Amazon CloudFront, and how to deploy and manage AWS Networking and Content Delivery solutions.

At KeyCore, our AWS consultants can advise you on the best solutions and services for your network and content delivery needs. We have extensive experience in designing and deploying cloud solutions, and can help you take advantage of the latest innovations. Contact us today to learn more.

Read the full blog posts from AWS

AWS Compute Blog

Automating Stopping and Starting AWS MWAA Environments to Reduce Costs

Amazon Managed Workflows for Apache Airflow (Amazon MWAA) provides a powerful, managed environment for running Apache Airflow-based workloads. By automating the stopping and starting of your Amazon MWAA environment, you can save on costs. This post from Uma Ramadoss and Chandan Rupakheti shows you exactly how to do this.

First, you’ll need to structure your environment so that the data stored in the Airflow metadata database is retained when the environment is stopped. Because Amazon MWAA has no native stop API, “stopping” in practice means backing up the metadata (the post uses an Amazon DynamoDB table), deleting the environment, and recreating it later, which requires the appropriate IAM roles.

Next, you’ll need to automate the stop and start operations. The easiest way is to use Amazon EventBridge (formerly CloudWatch Events) schedules to trigger a Lambda function; AWS Step Functions can orchestrate the more complex multi-step delete-and-recreate workflow.

Finally, you’ll need to test your environment to make sure it works as expected when stopped and started. This can be done by manually testing the environment or by automating the testing process.
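The scheduling piece can be sketched with boto3. The rule name, cron expression, and Lambda ARN below are hypothetical placeholders:

```python
def schedule_params(rule_name, cron_utc, lambda_arn):
    """Build put_rule/put_targets parameters for an EventBridge schedule
    that invokes the stop or start Lambda. Cron expressions are in UTC."""
    return {
        "rule": {
            "Name": rule_name,
            "ScheduleExpression": f"cron({cron_utc})",
            "State": "ENABLED",
        },
        "targets": {
            "Rule": rule_name,
            "Targets": [{"Id": "mwaa-scheduler", "Arn": lambda_arn}],
        },
    }


def apply_schedule(params):
    import boto3  # deferred so the sketch imports without AWS credentials
    events = boto3.client("events")
    events.put_rule(**params["rule"])
    events.put_targets(**params["targets"])


# Stop the environment weeknights at 20:00 UTC (the Lambda ARN is hypothetical):
stop = schedule_params(
    "stop-mwaa-environment",
    "0 20 ? * MON-FRI *",
    "arn:aws:lambda:eu-west-1:111122223333:function:stop-mwaa-environment",
)
```

Note that the Lambda function also needs a resource-based permission (via `lambda add_permission`) allowing `events.amazonaws.com` to invoke it.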

Monitoring SNS-Based Applications End-to-End with AWS X-Ray Active Tracing

Amazon Simple Notification Service (SNS) is a powerful messaging service that can be used to power many-to-many microservices and event-driven serverless applications. Thanks to active tracing with AWS X-Ray, you can now monitor SNS-based applications end-to-end.

Daniel Lorch and David Mbonu show how to enable active tracing with AWS X-Ray. This involves setting up an AWS X-Ray tracing client, configuring the SNS service, and creating an IAM policy to grant the necessary permissions. Once the tracing has been set up, you can use the X-Ray service map to view the flow of data between the different components of your application.
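On the SNS side, active tracing is a single topic attribute. A minimal sketch, assuming an existing standard topic (the ARN below is a placeholder):

```python
def tracing_attribute(topic_arn):
    """set_topic_attributes parameters that switch an SNS topic from the
    default pass-through mode to active X-Ray tracing."""
    return {
        "TopicArn": topic_arn,
        "AttributeName": "TracingConfig",
        "AttributeValue": "Active",
    }


def enable_active_tracing(topic_arn):
    import boto3  # deferred so the sketch imports without AWS credentials
    boto3.client("sns").set_topic_attributes(**tracing_attribute(topic_arn))
```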

Debugging SnapStart-Enabled Lambda Functions Made Easy with AWS X-Ray

Lambda SnapStart is a performance optimization that can significantly improve cold start times for your functions (available for Java runtimes at the time of writing). Rather than pre-warming, it initializes the function when you publish a version, takes a snapshot of the initialized execution environment, and resumes new invocations from that cached snapshot.

Rahul Popat and Aneel Murari show how you can now use AWS X-Ray to debug SnapStart-enabled Lambda functions. This involves setting up the X-Ray SDK, configuring the Lambda function to use active tracing, and enabling SnapStart for the function. Once this is done, you can use the X-Ray service map to view the flow of data within your application.
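Both settings live on the function configuration, and SnapStart takes effect on published versions. A minimal sketch with boto3 (the function name is a placeholder):

```python
def snapstart_tracing_config(function_name):
    """update_function_configuration parameters enabling SnapStart for
    published versions together with active X-Ray tracing."""
    return {
        "FunctionName": function_name,
        "SnapStart": {"ApplyOn": "PublishedVersions"},
        "TracingConfig": {"Mode": "Active"},
    }


def configure_function(function_name):
    import boto3  # deferred so the sketch imports without AWS credentials
    lam = boto3.client("lambda")
    lam.update_function_configuration(**snapstart_tracing_config(function_name))
    # SnapStart only applies to published versions, so publish one afterwards.
    lam.publish_version(FunctionName=function_name)
```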

Implementing Cross-Account CI/CD with AWS SAM for Container-Based Lambda Functions

Deploying containerized applications across different environments can be a challenging task. AWS Serverless Application Model (SAM) Pipelines can be used to create a CI/CD deployment pipeline and deploy a container-based Lambda function across multiple accounts.

This post from AWS explains the process. First, you’ll need to create the necessary resources, such as the IAM roles, S3 buckets, and SAM pipelines. Next, you’ll need to configure the SAM pipeline, including setting up the deployment stages and configuring the appropriate environment variables. Finally, you’ll need to create a build plan to build the Docker image and push it to the registry.

AWS Nitro System Gets Independent Affirmation of Its Confidential Compute Capabilities

Keeping workloads secure and confidential is a priority for AWS and its customers. AWS Nitro System is a key part of this effort, and it’s now been independently affirmed for its confidential compute capabilities.

Anthony Liguori, AWS VP and Distinguished Engineer for EC2, explains how AWS has innovated on security, privacy tools, and practices to meet and exceed customer expectations. This includes using the Nitro System to create a secure and isolated environment for running customer workloads. With the independent affirmation of its capabilities, customers can now have greater confidence in the security of their workloads.

At KeyCore, our AWS consultants can help you take advantage of the Nitro System and use it to secure and isolate your workloads. We offer professional services and managed services that can help you quickly and safely deploy your workloads. Contact us to learn more.

Read the full blog posts from AWS

AWS for M&E Blog

How AWS Media and Entertainment Solutions are Advancing the Live Video Streaming Experience

Automatically Stopping AWS Elemental MediaLive Channels

AWS Elemental MediaLive is a fast, reliable, and easy-to-use live video streaming service that lets media organizations and companies deliver high-quality streams without managing infrastructure. It simplifies live video operations by automating the configuration and management of encoding and ingest components for processing and delivery. MediaLive channels can now be stopped automatically when no input is detected: you set a maximum idle time and an input loss duration, and once those thresholds are reached the channel transitions to a stopped state.
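One way to wire this up yourself is a small Lambda function that stops a channel via the MediaLive API. The sketch below is an assumption-laden illustration: the event shape (a plain `channelId` key) is hypothetical, not a MediaLive contract.

```python
def running_channel_ids(channels):
    """Pick the RUNNING channels out of a medialive.list_channels page."""
    return [c["Id"] for c in channels if c.get("State") == "RUNNING"]


def handler(event, context):
    """Hypothetical Lambda target for an input-loss alarm: stops the
    channel named in the (assumed) event payload."""
    import boto3  # deferred so the sketch imports without AWS credentials
    channel_id = event["channelId"]
    boto3.client("medialive").stop_channel(ChannelId=channel_id)
    return {"stopped": channel_id}
```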

Behind the Scenes of Creating an NFL Schedule

Building an 18-week, 272-game schedule spanning 576 possible game windows in just three months is no easy task, but that is exactly what the National Football League (NFL) does. How? NFL executives revealed that the league uses a combination of AWS AI, ML, analytics, compute, database, and storage services, and leverages predictive analytics to optimize the schedule and anticipate what fans will want to watch in December.

How the Bundesliga Leverages AWS for a Better Viewing Experience

Germany’s premier national football league, Bundesliga, is enhancing the fan experience by leveraging AWS. By using AI, ML, analytics, compute, database, and storage services, the Bundesliga is able to deliver real-time statistics to improve insights into game strategies and outcomes. Furthermore, personalized content is recommended to fans and partners across several channels. Alexander Altenhofen, Bundesliga’s Director of Product & Technology, states that the combination of AWS and their sports data platform resulted in a revolutionary platform with the most advanced technologies.

Immersive Viewing of the NASA Artemis 1 Launch

Live streaming in multiple formats to multiple device types can be a complex and time-consuming process. Futuralis, Felix & Paul Studios, and AWS partnered together to deliver an immersive viewing experience of the NASA Artemis 1 launch. They used a combination of AWS services, such as Amazon Kinesis Video Streams and AWS Elemental MediaLive and MediaConvert, to reduce complexity and cost while delivering the highest quality viewing experience.

How KeyCore Can Help

KeyCore is the leading Danish AWS consultancy and provides professional services and managed services. Our expert team of AWS consultants and AWS Certified Solutions Architects can help you leverage the power of AWS to build, deploy, and manage the highest quality live video streaming services. Contact us today to learn more about how we can help you get the most out of AWS.

Read the full blog posts from AWS

AWS Storage Blog

Upgrading from CloudEndure Disaster Recovery to AWS Elastic Disaster Recovery

What is AWS Elastic Disaster Recovery?
AWS Elastic Disaster Recovery (DRS) is the recommended service for disaster recovery to AWS. It is the next generation of CloudEndure Disaster Recovery (CEDR) and is built on CEDR technology.

Using AWS Systems Manager to Upgrade Replicating Source Servers
It is now possible to upgrade replicating source servers from CloudEndure Disaster Recovery to Elastic Disaster Recovery via the CEDR Server Migration feature, which is part of AWS Systems Manager. This feature automates migrating servers from CloudEndure Disaster Recovery to AWS Elastic Disaster Recovery, including transferring existing recovery settings such as the target Virtual Private Cloud (VPC) and subnets, Availability Zones, and instance types.

Benefits of Migrating from CloudEndure Disaster Recovery to AWS Elastic Disaster Recovery
Migrating from CloudEndure Disaster Recovery to AWS Elastic Disaster Recovery has several benefits. Not only does Elastic Disaster Recovery use CloudEndure Disaster Recovery technology, but it also offers additional features such as improved automation, automatic failover, and support for multiple disaster recovery sites. Additionally, AWS Elastic Disaster Recovery is more cost-effective than CloudEndure Disaster Recovery and is designed to help organizations quickly recover their IT infrastructure in the event of a disaster.

KeyCore Can Help
At KeyCore, we provide professional services and managed services to help organizations migrate from CloudEndure Disaster Recovery to AWS Elastic Disaster Recovery. Our team of AWS certified solutions architects have the knowledge and expertise to assist with the migration process and ensure that it is done correctly and efficiently. Contact us today to learn more about how we can help you upgrade from CloudEndure Disaster Recovery to AWS Elastic Disaster Recovery.

Read the full blog posts from AWS

AWS Architecture Blog

Making Oracle Data Guard Environments Easier to Connect to with Amazon Route 53 CNAME Records

Due to their resiliency, performance, and scalability, customers often choose Amazon Web Services (AWS) to run their Oracle database workloads. To ensure the architecture meets Service Level Agreements (SLAs), high availability (HA) solutions must be considered when migrating or deploying Oracle databases in AWS.

Amazon Route 53 CNAME records can help improve the connectivity of Oracle Data Guard environments. A CNAME record specifies that a domain name is an alias of another domain, usually the targeted destination. Used with Oracle Data Guard, CNAME records give applications a stable alias for the primary and standby database instances instead of hard-coded instance endpoints.

This also simplifies connection string configuration: customers can point their applications at a single connection string, which automatically resolves to the Oracle Database instance that is actively serving.

With Amazon Route 53 CNAME records, customers can simplify their architectures, reduce complexity, and reduce the amount of time needed for database-level reconfiguration when switching between primary and standby instances.
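The switchover step reduces to a single UPSERT of the CNAME record. A minimal boto3 sketch, where the hosted zone ID, alias, and endpoint are hypothetical placeholders:

```python
def cname_upsert(zone_id, alias, primary_endpoint, ttl=60):
    """change_resource_record_sets parameters that repoint the shared alias
    (e.g. db.example.internal) at the current Data Guard primary."""
    return {
        "HostedZoneId": zone_id,
        "ChangeBatch": {
            "Comment": "Point the application alias at the active primary",
            "Changes": [{
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": alias,
                    "Type": "CNAME",
                    "TTL": ttl,  # a low TTL helps switchovers propagate quickly
                    "ResourceRecords": [{"Value": primary_endpoint}],
                },
            }],
        },
    }


def repoint_alias(zone_id, alias, primary_endpoint):
    import boto3  # deferred so the sketch imports without AWS credentials
    boto3.client("route53").change_resource_record_sets(
        **cname_upsert(zone_id, alias, primary_endpoint)
    )
```

After a Data Guard switchover, calling `repoint_alias` with the new primary’s endpoint is all the DNS-level reconfiguration required.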

Designing Serverless Solutions with AWS

During the AWS re:Invent 2022 keynote, Werner Vogels (AWS Vice President and Chief Technology Officer) discussed the asynchronous nature of our world and the struggles associated with incorporating asynchronicity into our architectures. To help customers focus on the asynchronous elements of their workloads, AWS serverless services make it easier to execute event-driven architectures and enable the adoption of serverless architectures.

AWS serverless services also allow customers to save on cost and time. With serverless solutions, users can focus on creating and deploying applications, rather than allocating resources and configuring compute instances. AWS serverless services can provide a cost-effective and time-efficient solution for customers who need to quickly develop applications with the ability to scale.

With AWS serverless, customers can also reduce complexity. Serverless architectures do not require the customer to manage servers or containers, thus reducing the amount of time spent on server maintenance and troubleshooting.

KeyCore can help customers start their migration to an AWS serverless architecture. Our Professional Services and Managed Services offerings provide comprehensive guidance and support to customers looking to move to a serverless solution. Our team consists of certified AWS experts who will help customers transition their applications successfully, so they can take advantage of the scalability and cost savings that come with the serverless model.

Read the full blog posts from AWS

AWS Partner Network (APN) Blog

Learn Why You Should Run SAP Business Technology Platform on AWS and More about AWS Partner Network Blogs

SAP Business Technology Platform (SAP BTP) is a unified, business-centric, and open platform for the entire SAP ecosystem. It’s available on AWS and enables users to integrate and create value from data and extend their SAP and third-party solution landscapes to meet evolving business needs. Customers who run SAP on AWS benefit from the lowest total cost of ownership and fastest time to value. This article outlines why customers should run SAP BTP on AWS.

Integrating Malware Scanning into Your Data Ingestion Pipeline

Cloud Storage Security (CSS) offers Antivirus for Amazon S3, a self-hosted malware solution installed in the customer’s AWS account so data doesn’t leave the customer’s AWS account. This article explains how to easily scan workloads using Antivirus for Amazon S3, and how to integrate malware scanning into the data ingestion pipeline. CSS is an AWS Security Competency Partner, helping customers prevent the spread of malware and locate sensitive data for applications and data lakes that use AWS managed services.

Announcing the 2023 AWS Partner Award Winners in ASEAN

AWS Partners in ASEAN have been recognized for helping customers accelerate innovation, develop industry-focused solutions, and build resilience amid a changing economic climate. As businesses continue to navigate economic headwinds, AWS Partners play a pivotal role as strategic advisors and technology experts in helping customers innovate, reduce costs, and build industry-focused solutions on AWS.

Building Production-Grade Kubernetes Clusters with Amazon EKS Anywhere on Nutanix

Amazon EKS Anywhere allows customers to run containerized workloads on customer-managed infrastructure. Nutanix enhances the list of deployment options for EKS Anywhere customers, which already includes bare metal servers, VMware vSphere, and Docker. AWS collaborated with Nutanix to integrate Amazon EKS Anywhere with the Cluster API provider for Nutanix to provide customers with declarative, Kubernetes-style APIs for cluster creation, configuration, and management.

Responsive Event-Driven Architectures on AWS for Reduced Costs and Improved Agility

Event-driven architectures make building cloud applications easier by creating, detecting, consuming, and reacting to multiple events in real time. DXC Technology helped a customer in the energy industry collect and push events from electricity meters using event-driven architecture. This approach provides better scalability, fault tolerance, and faster development as application complexity increases.

Accelerate Business Changes with Apache Iceberg on Dremio and Amazon EMR Serverless

Leveraging Apache Iceberg capabilities with Dremio and Amazon EMR Serverless can help organizations scale their business by keeping up with changes to their data and analytics portfolios. Iceberg is a high-performance, open table format for huge analytical tables designed to mitigate the challenges of unforeseen changes observed by enterprises. Dremio is a data lake engine that delivers fast query speed and a self-service semantic layer operating directly against Amazon S3 data.

How KeyCore Can Help

KeyCore provides AWS users with both professional services and managed services to help them build and deploy their applications and IT infrastructures. Our team of certified AWS consultants can provide guidance on best practices for running SAP BTP on AWS, integrating malware scanning into your data ingestion pipeline, leveraging Apache Iceberg capabilities, and more. With KeyCore, organizations can ensure their applications and IT infrastructures are built and managed with best practices, resulting in the best possible performance and cost savings.

Read the full blog posts from AWS


Deploying a Level 3 Digital Twin Virtual Sensor with Ansys on AWS

Why Level 3 Digital Twins are Needed for Virtual Sensors

Level 3 digital twins provide the data needed to build deep intelligence into virtual sensors. By leveraging this technology, users gain more insight into the data they collect: the virtual sensor runs simulations that build a better understanding of the environment being measured. This data can be used to optimize the processes or products being measured and to help prevent problems or errors from occurring.

Additionally, Level 3 digital twins can be used to create virtual simulations in the cloud. This can include simulated environments, where users can interact with the data in real-time and make decisions based on what they observe. This technology can be used to test out potential improvements to the system and make sure that any changes are in line with the desired outcomes.

Deploying a Level 3 Digital Twin on AWS

AWS provides tools to help customers deploy level 3 digital twin virtual sensors on the cloud. First, using AWS CloudFormation, customers can create custom application templates that help them quickly and easily provision the necessary resources for their digital twin. This includes the computing, storage, and networking components the virtual sensor requires. Additionally, users can leverage the AWS SDK for JavaScript v3 to make API calls to AWS services, so that they can get real-time information about the state of their digital twin and the environment it is deployed in.

Finally, customers can use the Ansys Simulation software to create the virtual sensor. Ansys provides a range of tools for creating virtual simulations with a high degree of accuracy. Once the simulation is created, it can then be deployed to the cloud via AWS. This will allow customers to scale their level 3 digital twin virtual sensor to the size they require, and to update it whenever necessary.

KeyCore’s Help with Level 3 Digital Twin Virtual Sensors

At KeyCore, we provide professional services and managed services that can help customers deploy and manage a Level 3 Digital Twin Virtual Sensor on AWS. Our experts can help with the CloudFormation templates, the AWS SDK for JavaScript v3, and the Ansys Simulation software. Additionally, our team is available to provide ongoing support, so that customers can always stay on top of the latest technology and ensure their digital twin is running optimally. Contact us today to learn more about how we can help you get the most out of your level 3 digital twin virtual sensor on AWS.

Read the full blog posts from AWS

AWS Cloud Operations & Migrations Blog

AWS Cloud Operations & Migrations: Making the Most of AWS Config, PCI DSS Compliance, AWS Chatbot, Service Catalog & CloudTrail Lake

AWS Config is a powerful service to help customers track and log configuration changes of AWS resources within their account. When changes occur, the configuration recorder in AWS Config creates a configuration item. This can help customers understand and control their costs, as well as stay compliant with industry standards. To help customers achieve and sustain PCI DSS compliance, AWS offers a range of services, such as AWS Service Catalog, CloudTrail Lake, and AWS Chatbot. Additionally, AWS has recently been recognized as a leader in the 2023 ISG Provider Lens for Mainframe Application Modernization Software.

Estimating AWS Config Recorder Costs & Usage

Customers can better understand their AWS Config costs and usage with AWS CloudTrail, which tracks and monitors API activity within their account. CloudTrail can also be used to set up alarms and alerts, search and visualize events, and support governance, compliance, and operational auditing. By understanding their AWS Config costs, customers can better manage the resources they use.
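A starting point is filtering CloudTrail event history down to AWS Config API activity. A minimal sketch with boto3; the event name used in the example is illustrative:

```python
def config_activity_filter(event_name=None):
    """lookup_events parameters scoped to AWS Config API activity.
    CloudTrail allows only one lookup attribute per call, so an event-name
    filter replaces the event-source filter rather than adding to it."""
    if event_name:
        attrs = [{"AttributeKey": "EventName", "AttributeValue": event_name}]
    else:
        attrs = [{"AttributeKey": "EventSource",
                  "AttributeValue": "config.amazonaws.com"}]
    return {"LookupAttributes": attrs, "MaxResults": 50}


def recent_config_events(event_name=None):
    import boto3  # deferred so the sketch imports without AWS credentials
    return boto3.client("cloudtrail").lookup_events(**config_activity_filter(event_name))
```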

Aligning Business & IT to Achieve & Sustain PCI DSS Compliance

The Payment Card Industry Data Security Standard (PCI DSS) is a set of requirements for organizations that store or process cardholder data. To help customers meet these requirements, AWS offers a range of services and solutions. For example, AWS Service Catalog helps customers control IT service deployments, while CloudTrail Lake and AWS Chatbot help customers monitor and analyze activity in real-time. With the right combination of services, customers can align their business and IT goals to achieve and sustain PCI DSS compliance.

Automating Event Logs with AWS Systems Manager

AWS Systems Manager automates the ingesting of event logs from managed nodes into AWS CloudTrail Lake. A recent update allows customers to query event logs at scale, helping them to gain real-time visibility into their infrastructure and detect any potential security threats. This can help customers respond to incidents quickly and minimize potential downtime.
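Once the logs land in CloudTrail Lake, they can be queried with SQL via the CloudTrail API. A minimal sketch; the event data store ID and the date cutoff are hypothetical placeholders:

```python
# The event data store ID and the date below are hypothetical placeholders.
QUERY_TEMPLATE = """
SELECT eventName, count(*) AS calls
FROM {event_data_store_id}
WHERE eventTime > '2023-05-01 00:00:00'
GROUP BY eventName
ORDER BY calls DESC
"""


def start_lake_query(event_data_store_id):
    """Kick off a CloudTrail Lake query and return its QueryId; results are
    fetched afterwards with get_query_results."""
    import boto3  # deferred so the sketch imports without AWS credentials
    statement = QUERY_TEMPLATE.format(event_data_store_id=event_data_store_id)
    return boto3.client("cloudtrail").start_query(QueryStatement=statement)["QueryId"]
```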

AWS Recognized as a Leader in Mainframe Application Modernization Software

AWS has recently been recognized as a leader in the 2023 ISG Provider Lens for Mainframe Application Modernization Software. This report assesses the capabilities of vendors offering mainframe application modernization software and services to enterprises, and highlights that the AWS Mainframe Modernization service is an ideal solution for customers looking for a comprehensive approach to modernizing their mainframe applications.

Reporting and Visualizing Your AWS Service Catalog Estate

AWS Service Catalog helps customers create and manage catalogs of IT services that are approved for use on AWS. This includes virtual machine images, servers, software, and databases. With Service Catalog, customers can centrally manage the resources and metadata associated with their IT services. Additionally, customers can use Service Catalog to report and visualize their IT estate.

Reducing Incident Management Response Times

AWS Chatbot can help customers reduce incident management response times for their container workloads. This service provides real-time visibility into performance issues, traffic spikes, infrastructure events and security threats, enabling customers to act quickly. With the right combination of services, customers can minimize the downtime of their mission-critical container workloads.

KeyCore Can Help

At KeyCore, we understand that AWS customers need to stay compliant with industry standards, optimize their costs, and respond to security threats quickly. Our team of AWS Certified professionals can help customers make the most of their AWS services, such as AWS Config, Service Catalog, CloudTrail Lake, and AWS Chatbot. Contact us today to learn how we can help you.

Read the full blog posts from AWS

AWS for Industries

Exberry Built a Cloud-Native Matching Engine on AWS

Exberry is a cloud-native matching engine platform provider that helps traditional and alternative exchanges launch ultra-low latency markets quickly and cost-effectively. Exberry’s platform utilizes AWS to process up to 1 million trades per second, with latencies of just 20 microseconds. By leveraging the power of the cloud, Exberry is able to provide far better performance than would otherwise be possible.

Vodafone Leverages AWS and Broadband Forum User Service Platform for CPE Management

Vodafone Group Plc is a global communications company that provides mobile, fixed, TV, IoT, cloud, and security services to consumers and enterprises. To keep up with the demand, Vodafone is leveraging the power of AWS and the Broadband Forum’s User Service Platform (USP) standard to re-architect the management of its Customer Premise Equipment (CPE). This allows Vodafone to remain more adaptive to changes in the market, while also improving their operational efficiency.

The Future of Manufacturing for Small and Medium Businesses

The coming decade presents the manufacturing sector with significant challenges, as well as opportunities for growth and innovation. To stay ahead of the competition, manufacturers need to be as flexible and resilient as possible, especially in times of economic uncertainty. Forrester’s research shows that, in order to remain competitive, manufacturers need to embrace automation and advanced technologies such as artificial intelligence (AI) and the Internet of Things (IoT).

AWS provides a range of services that can help manufacturers of all sizes. For instance, Amazon SageMaker enables manufacturers to easily build, train and deploy machine learning models, while Amazon Kinesis can help collect, process and analyze real-time streaming data.

By leveraging the power of AWS, manufacturers can quickly and easily automate their processes and become more competitive. At KeyCore, we provide both professional services and managed services to help companies of all sizes move to the cloud and make the most of its capabilities. Contact us today to find out how we can help your business.

Read the full blog posts from AWS

AWS Marketplace

How to Use AWS Marketplace in AWS GovCloud

AWS GovCloud (US) is a secure cloud computing environment specifically designed to meet the requirements of US government agencies and contractors. AWS GovCloud allows customers to leverage the same secure and trusted services available in other AWS Regions, but with additional controls and compliance programs tailored to meet the unique US government requirements.

Why Use AWS Marketplace in AWS GovCloud?

Using AWS Marketplace in GovCloud allows customers to quickly and easily purchase software without having to manage the purchasing process. Customers can select from a wide range of software and services provided by independent software vendors (ISVs) on AWS Marketplace and subscribe to the products with one click. This makes it easier for customers to get the software they need quickly and securely.

How to Provision Software Available in AWS Marketplace in AWS GovCloud

To purchase and provision software available in AWS Marketplace, customers first need to log in to the AWS GovCloud console. From there, users can search for the product they want by clicking on the “Software” tab in the AWS Marketplace. Once the product has been identified, users can click on the “Subscribe” button to subscribe to the product and start the provisioning process.

Once the subscription is complete, users can start setting up the software using the instructions provided by the vendor. The setup process can be completed in minutes, and users can start using the product right away.

Managed Entitlements and Private Offers

AWS Marketplace also offers managed entitlements and private offers in AWS GovCloud. Managed entitlements allow customers to purchase software and services in bulk, manage multiple subscriptions from a single account, and pay with a single payment method. Private offers enable customers to purchase software and services directly from an ISV, and can be tailored to specific customer requirements, such as special terms, discounts, or additional services.

How KeyCore Can Help

KeyCore can help customers get the most out of their AWS GovCloud experience by providing expertise on cost optimization and compliance, as well as helping customers to select the best software and services for their needs. We can also help customers manage their subscriptions and entitlements, enabling them to make the most of their software and services investments. Contact us today to learn more about how KeyCore can help you get the most out of AWS GovCloud.

Read the full blog posts from AWS

The latest AWS security, identity, and compliance launches, announcements, and how-to posts.

Walk Through AWS Verified Access Policies

AWS Verified Access helps improve an organization’s security posture by using trust providers to grant access to applications. It grants access only when the user’s identity and the user’s device meet configured security requirements. This post provides an overview of trust providers and policies, then walks through how to set up AWS Verified Access policies. It also explains the available client types and ways to leverage the control plane from a technical perspective.

Detect Threats to Data Stored in RDS Databases with GuardDuty

Amazon Relational Database Service (RDS) is a way to set up, operate, and scale a relational database in the cloud. It provides cost-efficient, resizable capacity for an industry-standard relational database and manages common database administration tasks. Amazon GuardDuty RDS Protection detects threats to data stored in supported RDS databases (Amazon Aurora at launch) by monitoring login activity to the database instances and surfacing security findings in GuardDuty, from where they can be routed onward, for example through Amazon EventBridge.
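RDS Protection is toggled as a feature on an existing GuardDuty detector. A minimal boto3 sketch, assuming you already have a detector ID (the one in the test is a placeholder):

```python
def rds_protection_update(detector_id, enable=True):
    """update_detector parameters toggling GuardDuty's RDS login-activity
    monitoring feature on an existing detector."""
    return {
        "DetectorId": detector_id,
        "Features": [{
            "Name": "RDS_LOGIN_EVENTS",
            "Status": "ENABLED" if enable else "DISABLED",
        }],
    }


def set_rds_protection(detector_id, enable=True):
    import boto3  # deferred so the sketch imports without AWS credentials
    boto3.client("guardduty").update_detector(**rds_protection_update(detector_id, enable))
```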

Customer Checklist for eIDAS Regulation

AWS has published a checklist to help customers align with the requirements of the European Union’s electronic identification, authentication, and trust services (eIDAS) regulation. This covers electronic identification and trust services for electronic transactions in the European single market. The eIDAS regulation helps verify the identity of the user and the origin of the transaction, ensuring security and trust when it comes to electronic transactions.

A Sneak Peek at the Identity and Access Management Sessions for AWS re:Inforce 2023

AWS re:Inforce 2023 is fast approaching. This post provides a look at the sessions in the identity and access management track. AWS re:Inforce is a learning conference focused on identity and access management (IAM) technologies, cloud security, and compliance. A full conference pass is $1,099. KeyCore can help customers leverage these technologies, offering expertise on how to implement IAM strategies and ensure compliance.

Read the full blog posts from AWS

AWS Startups Blog

Evolutionary Architectures, part 4: Meeting Compliance Standards and Diversifying a Product Portfolio

Example Startup saw their idea of a “fantasy stock market” come to life, but they were now tackling the challenge of formalizing their security and backup posture to meet various compliance standards. In addition to this, they wanted to set a data strategy for the organization and explore additional lines of business to diversify their product portfolio.

Meeting Security, Backup, and Compliance Standards

The team at Example Startup realized that the security and backup posture of their system, as well as compliance with industry standards such as GDPR and PCI DSS, was essential for their long-term success. To address this, they implemented AWS Config to maintain a secure, audited, and compliant environment, and enabled AWS CloudTrail to record account activity for auditing. They added AWS Shield to protect their web applications with an extra layer of defense against DDoS attacks. Finally, they implemented AWS Backup to centralize their backup and recovery needs and ensure they could restore their data quickly in the event of an outage or disaster.
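As an illustration of the backup piece, the input to AWS Backup’s CreateBackupPlan API can be assembled as a plain object with the AWS SDK for JavaScript v3. This is a sketch only; the plan name, vault name, schedule, and retention below are hypothetical, not taken from the blog post:

```typescript
// Hypothetical AWS Backup plan input, shaped for the CreateBackupPlan API of
// the AWS SDK for JavaScript v3 (@aws-sdk/client-backup). All names and the
// schedule are illustrative.
const backupPlanInput = {
  BackupPlan: {
    BackupPlanName: "example-startup-daily",    // hypothetical plan name
    Rules: [
      {
        RuleName: "daily-35-day-retention",
        TargetBackupVaultName: "example-vault", // hypothetical vault name
        ScheduleExpression: "cron(0 5 * * ? *)", // daily at 05:00 UTC
        Lifecycle: { DeleteAfterDays: 35 },      // retain each recovery point 35 days
      },
    ],
  },
};

// With a BackupClient, this object would be passed to a command, e.g.:
// await client.send(new CreateBackupPlanCommand(backupPlanInput));
console.log(backupPlanInput.BackupPlan.BackupPlanName);
```

Keeping the plan as data like this also makes it easy to move into a CloudFormation template later.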

Creating a Data Strategy

Once Example Startup had established their security and backup posture, they began to treat data as part of their overall strategy. They used Amazon Elasticsearch Service to index their data, which enabled them to search and analyze large amounts of data quickly and easily. They also used Amazon Athena to query data in place and surface insights, allowing them to identify patterns and trends they could use to make better decisions. Finally, they implemented Amazon Redshift as a data warehouse to store and analyze the data they collected across the organization.

Exploring Additional Lines of Business

With their security, backup, and data strategies in place, Example Startup began to look for ways to diversify their product portfolio. They used Amazon EC2 for compute needs and Amazon S3 for storage, giving them the flexibility to quickly add new services and products. Additionally, they explored AWS Marketplace for third-party services and applications. With these services in place, the team at Example Startup was able to quickly scale their services, experiment with new products and offerings, and stay ahead of the competition.

Announcing the startups selected for the AWS Impact Accelerator Latino Founders cohort

AWS is pleased to announce the 20 companies that make up the AWS Impact Accelerator Latino Founders cohort. This eight-week program provides these underrepresented founders with the resources, capital, and community they need to gain a competitive edge in the startup world. The program kicks off this week in Seattle and culminates in an investor pitch day in New York City, including the Nasdaq Closing Bell Ceremony.

Leveraging the AWS Cloud

The AWS cloud provides a number of advantages to startups. It provides scalability, flexibility, cost-effectiveness, and security, making it the perfect platform for these entrepreneurs to build and launch their products. By leveraging the cloud, these startups will be able to quickly develop and deploy their applications, helping them get to market faster and with greater agility. Additionally, AWS provides access to a host of cloud services, such as Amazon S3, Amazon EC2, and Amazon Athena, which will enable these companies to optimize their resources, reduce overhead costs, and remain competitive.

Accessing the Amazon Ecosystem

The AWS Impact Accelerator program gives these startups access to the broader Amazon ecosystem. This includes AWS services, such as machine learning offerings like Amazon Rekognition, as well as Amazon’s unparalleled eCommerce and logistics infrastructure. Additionally, the cohort will be given access to mentorship, investment opportunities, corporate partnerships, and more. All of this enables them to benefit from the collective experience and resources of the Amazon ecosystem.

KeyCore’s Role in the AWS Impact Accelerator Program

At KeyCore, we are proud to support the AWS Impact Accelerator Program, and the 20 startups selected for the Latino Founders Cohort. As a professional AWS consultancy, KeyCore provides both professional and managed services to help our customers get the most out of the AWS cloud. Our mix of technical and business-oriented services ensures that our customers get the best possible experience from the AWS cloud. Whether it is helping with the migration of an existing application to the cloud, or building and managing a cloud native application, KeyCore is here to help.

Read the full blog posts from AWS

Front-End Web & Mobile

Automating Testing with Authentication using AWS Amplify and Cypress

Testing is a critical part of software delivery, and automating this process helps to ensure high-quality, quickly delivered results. By leveraging AWS Amplify and Cypress, developers can create automated end-to-end tests for their CI/CD deployments with Amplify Console.

Using AWS Amplify and Cypress, developers can automatically test their application from end-to-end with authentication. They can create tests that run after a build is submitted to Amplify Console, or have multiple Amplify Console builds running concurrently and trigger Cypress tests for each build. This automation helps to catch and fix errors before deployment, leading to better quality software with faster delivery.

Set Up

To set up automated end-to-end tests with authentication, developers need to create a dedicated user in the application. This user will be used in the Cypress test when running the automated tests. By creating a user specifically for testing, developers can ensure that no user credentials are exposed and that the application is kept secure.

Next, developers should create an AWS Identity and Access Management (IAM) role that grants the tests access to the resources they need, such as S3 and DynamoDB. This IAM role should be created in the same account and Region as the application. Once the role is created, developers can use the AWS SDK for JavaScript v3 to assume the role and obtain temporary credentials for the test user, rather than storing long-lived access keys.
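One way to keep the test role’s permissions narrow is to attach a tightly scoped policy document. The sketch below shows what such a document might look like; the bucket name, table ARN, and action list are hypothetical placeholders, not the blog post’s actual policy:

```typescript
// Hypothetical IAM policy document scoping a test role to one S3 bucket and
// one DynamoDB table. All resource ARNs below are placeholders.
const testRolePolicy = {
  Version: "2012-10-17",
  Statement: [
    {
      Effect: "Allow",
      Action: ["s3:GetObject", "s3:PutObject"],
      Resource: "arn:aws:s3:::example-test-bucket/*",
    },
    {
      Effect: "Allow",
      Action: ["dynamodb:GetItem", "dynamodb:PutItem", "dynamodb:Query"],
      Resource: "arn:aws:dynamodb:eu-west-1:123456789012:table/ExampleTestTable",
    },
  ],
};

console.log(JSON.stringify(testRolePolicy));
```

Scoping the role this tightly means that even if test credentials leak from the CI environment, the blast radius is limited to the test resources.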

Running Tests

Once the test user is set up, developers can use Cypress to run their tests. Cypress lets developers write and run tests that cover the full functionality of their application, including the UI, the API, and authentication. When running the tests, developers can use the test user’s credentials to exercise the application’s authentication flows.

With Cypress, developers can also set up automated tests that run after a build is submitted to Amplify Console. This allows developers to continuously test their application and catch errors before deploying a new version of their application. Developers can also have multiple Amplify Console builds running concurrently, and trigger Cypress tests for each build. This allows developers to test different versions of their application in parallel.
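Amplify Console can run Cypress as part of the build pipeline through a test phase in amplify.yml. The snippet below is a hedged sketch of that structure under stated assumptions: the command names, the dev-server port, and the artifact paths are placeholders to adapt to the project, and the exact phase names should be checked against the Amplify Hosting documentation:

```yaml
# Hypothetical test phase for amplify.yml; adjust commands, port, and
# artifact paths to your own application.
test:
  phases:
    preTest:
      commands:
        - npm ci
        - npm run start &                     # start the app in the background
        - npx wait-on http://localhost:3000   # wait until the app responds
    test:
      commands:
        - npx cypress run                     # run the end-to-end suite
  artifacts:
    baseDirectory: cypress
    files:
      - '**/*.png'
      - '**/*.mp4'
```

With this in place, every build submitted to Amplify Console runs the suite before the new version is served.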

The Benefits of Automated Tests

Automating the process of testing software allows developers to deliver high-quality software with faster delivery and reduced risk of errors. By leveraging AWS Amplify and Cypress to automate end-to-end tests with authentication, developers can ensure their applications are tested thoroughly and are secure before they deploy a new version. With automated tests, developers can also find any errors before they go into production, resulting in better quality software and quicker deployment.

KeyCore’s Role

At KeyCore, we understand the importance of automated testing in software delivery. Our team of AWS experts can help you set up automated tests with authentication using AWS Amplify and Cypress. We have experience with setting up IAM roles, writing CloudFormation templates in YAML, and using the AWS SDK for JavaScript v3. Our team can help you automate your testing process and deploy high-quality applications quickly and securely.

Read the full blog posts from AWS

AWS Contact Center

Just Energy Powers Up Their Contact Center Innovation With Amazon Connect

Just Energy is a leading North American distributor of electricity and natural gas with over 20 years of experience. The company sought to move their contact center platform to the cloud with Amazon Connect to overcome the challenges they faced with their legacy platform. In this blog post, we’ll share how the change to Amazon Connect helped Just Energy power up their contact center innovation.

Simplified Contact Center Management

With Amazon Connect, Just Energy was able to streamline their contact center management. The cloud-based contact center platform enabled Just Energy to quickly add and remove agents, create new queues and routing rules, and adjust their contact center staffing in real-time to meet customer demand. This means they can respond more quickly to customer inquiries and provide better customer service.

Optimized Contact Center Performance

By leveraging Amazon Connect, Just Energy was able to optimize the performance of their contact center. Amazon Connect offered enhanced scalability, enabling Just Energy to handle more customer inquiries with the same number of agents. They also enjoyed improved system reliability and uptime, allowing them to deliver a better customer service experience and minimize disruption to their contact center operations.

Reduced Contact Center Costs

By switching to Amazon Connect, Just Energy was able to reduce their contact center costs. Amazon Connect enables organizations to pay only for the resources they need and use, rather than a flat fee for an entire system. This allows them to save money and better manage their contact center budget.

KeyCore Can Help

At KeyCore, we are experienced in helping our clients streamline and optimize their contact center operations with Amazon Connect. We provide professional services and managed services, including the setup and configuration of Amazon Connect. We can also provide ongoing maintenance and support.

If you’re looking for help setting up and managing your contact center, contact us today to get started.

Read the full blog posts from AWS

Innovating in the Public Sector

Innovating in the Public Sector with Data-Driven Strategies, Knowledge Bases and Cloud Centers of Excellence

Data-driven decisions are essential for the public sector to effectively respond to unexpected events and provide a better citizen experience. In this blog post, we discuss how to set up data-driven strategies, build a team knowledge base, and establish a cloud center of excellence, as well as how KeyCore can help.

Data-Driven Strategies

Data-driven strategies are essential for government organizations to make better decisions by putting data at the center of their decision-making process. By understanding citizen ambitions and requirements, public sector organizations will be able to create a better citizen experience.

Knowledge Bases with Amazon Lightsail

Having an organized system for common information, such as addresses, phone numbers, and meeting schedules, can prove extremely valuable for professors and their teams. Building this knowledge base on AWS with Amazon Lightsail can save hours of administration and maintenance time, while providing additional control and flexibility for remote access.

Cloud Centers of Excellence

As more organizations move toward cloud computing, many are looking for ways to make sure they’re using the cloud effectively and efficiently. Establishing a cloud center of excellence (CCoE) can help streamline cloud adoption, security and innovation needs, reduce costs and more.

KeyCore’s Expertise in AWS

As the leading Danish AWS consultancy, KeyCore provides professional and managed services to help organizations innovate in the public sector. Our experts are highly advanced in AWS and are familiar with CloudFormation YAML, AWS API calls, TypeScript, and AWS SDK for JavaScript v3. We have the experience and knowledge to help you set up data-driven strategies, build a team knowledge base, and establish a cloud center of excellence.

For more information, contact us today.

Read the full blog posts from AWS

AWS Open Source Blog

Using Open Source Cedar to Write and Enforce Custom Authorization Policies

Cedar is an open source language for writing custom authorization policies for applications. KeyCore can help customers understand and leverage the power of Cedar to create authorization policies tailored to their unique needs.

The advantages of Cedar become clear when we look at a simple example application: TinyTodo. TinyTodo is a task management application that allows users to create and manage tasks. Cedar authorization policies can be used to specify which users are allowed to access, modify, or delete tasks.

Using Cedar, authorization policies can be written to support different levels of access for different users. For example, users can be granted access to view, add, or delete tasks. Additionally, users can be given permissions to modify task content, assign tasks, or manage user access. By making use of the Cedar authorization engine, TinyTodo can ensure that only users with the intended access are granted the appropriate permissions.
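For illustration, a Cedar policy in this spirit might grant one user view and edit access to a single list. This is a sketch: the entity and action names below are assumptions patterned after the TinyTodo example, not copied from it:

```cedar
// Hypothetical policy: allow one user to view and edit a single task list.
// User, Action, and List entity names are assumed for illustration.
permit(
  principal == User::"alice",
  action in [Action::"GetList", Action::"UpdateList"],
  resource == List::"0"
);
```

Because Cedar policies are declarative, access rules like this can be audited and changed without touching application code.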

Announcing Snapchange: An Open Source KVM-backed Snapshot Fuzzing Framework

Today, the AWS Find & Fix (F2) open source security research team announced the launch of Snapchange, an open source KVM-backed snapshot fuzzing framework designed to make it easier to identify and fix security vulnerabilities.

Snapchange uses KVM-backed snapshots to drive fuzzing: by restoring a snapshot of a KVM guest between test cases, it can execute new fuzzing inputs quickly. Snapchange can be used to locate and address previously unidentified security vulnerabilities, and is designed to detect issues such as user-space memory corruption, kernel memory corruption, and privilege escalation.

Using Snapchange, AWS customers can quickly detect and fix security vulnerabilities in their applications. KeyCore provides the expertise and tailored services to help customers understand and leverage Snapchange to secure their applications.

Read the full blog posts from AWS
