Summary of AWS blogs for the week of Monday, October 2, 2023

In the week of Monday, October 2, 2023, AWS published 103 blog posts. Here is an overview of what happened.

Topics Covered

Desktop and Application Streaming

Monitoring Amazon WorkSpaces Usage with Custom Reports

Amazon WorkSpaces provides various metrics that allow customers to monitor the usage of their workloads. These metrics are all emitted at the service level, so granular in-session metrics are not available. With the adoption of NICE DCV as the desktop streaming technology behind the WorkSpaces Streaming Protocol (WSP), customers can take advantage of different levels of usage reporting.

WSP API and NICE DCV Usage Reports

The Amazon WorkSpaces Streaming Protocol (WSP) API provides customers access to various metrics, such as the number of active sessions and the amount of data transferred. This data is available from the WSP API so customers can generate custom reports. Customers can also access NICE DCV usage data, such as the number of connected users and the length of active sessions.

Benefits of Usage Reports

Access to usage reports gives customers insight into how their WorkSpaces are being used, helps them identify potential areas for improvement, and provides visibility into the efficiency of their WorkSpaces deployments.

How KeyCore Can Help

At KeyCore, we offer a comprehensive suite of services for Amazon WorkSpaces. Our team of experienced AWS consultants can help you with the setup and configuration of your WorkSpaces as well as providing guidance on how to use the usage reports to gain insights into your workload. We also offer managed services for Amazon WorkSpaces, which can help ensure that your WorkSpaces are always running optimally.

Read the full blog posts from AWS

AWS DevOps Blog

Resource Isolation and Build & Deployment With AWS CDK & Amazon CodeCatalyst

When dealing with multiple projects and environments, managing resources can be a challenging task. The Amazon Web Services (AWS) Cloud Development Kit (CDK) simplifies the process of Infrastructure-as-Code (IaC). However, resource isolation can still be difficult. To help address this, we have a new experimental library, the App Staging Synthesizer (App Stage). App Stage enhances resource isolation in AWS CDK, providing a better modularized approach to resource management. App Stage also provides improved visibility into resource dependencies.

Benefits of App Staging Synthesizer

App Stage helps improve resource isolation by allowing developers to define their resources in stages. This enables developers to break up their resources into different stages, like Develop, Test, and Production, for example. Each stage can have its own set of resources assigned to it, making it easier to keep track of resource dependencies. App Stage also offers improved visibility into resource dependencies, providing a better view of which resources are shared between stages.
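
To make this concrete, here is a minimal sketch of wiring the synthesizer into a CDK app in TypeScript. The app ID and stack names are hypothetical, and because the library is experimental (alpha), its API may have changed since this was written:

```typescript
import { App, Stack } from 'aws-cdk-lib';
import { AppStagingSynthesizer } from '@aws-cdk/app-staging-synthesizer-alpha';

// The synthesizer scopes staging resources (file-asset buckets, image
// repositories) to this app ID, isolating them from other CDK apps.
const app = new App({
  defaultStackSynthesizer: AppStagingSynthesizer.defaultResources({
    appId: 'my-app', // hypothetical app ID
  }),
});

// Each environment gets its own stack; staging resources stay per-app.
new Stack(app, 'DevStack');
new Stack(app, 'ProdStack');

app.synth();
```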

Deploy to Amazon EKS with CodeCatalyst

Amazon CodeCatalyst is an integrated service for software development teams that embrace continuous integration and deployment (CI/CD) practices. CodeCatalyst provides all the necessary tools for developing, building, and releasing software in one place. By integrating AWS resources with your deployment process, CodeCatalyst helps development teams quickly and easily deploy to Amazon Elastic Kubernetes Service (EKS), and take advantage of the scalability and availability of cloud computing.

KeyCore and Our End-to-End AWS Services

At KeyCore, we offer both professional and managed services for a wide range of AWS solutions. We are a leading AWS consultancy in Denmark and specialize in helping companies move their applications and data to the cloud. Our team of experienced AWS professionals has in-depth knowledge of the AWS platform and can provide end-to-end services, from strategy and design to deployment and optimization. If you are looking for help with resource isolation, build and deployment, or any other AWS-related services, contact us today and let us know how we can help.

Read the full blog posts from AWS

AWS for SAP

Upgrading SAP Instances with SUSE Linux Enterprise Server

SAP customers who are running their workloads using SUSE Linux Enterprise Server (SLES) or SUSE Linux Enterprise Server for SAP Applications (SLES for SAP) must be aware of the different methods for upgrading their instances and the potential issues that may arise from upgrades if not done properly. This blog will provide an overview of the best practices for safely upgrading SLES to the next service pack, and performing a major version upgrade from SLES 12 to SLES 15.

Preparing for the Upgrade

Before upgrading any SAP applications, customers must confirm that their current SAP applications are compatible with the target version of SLES and that their hardware meets the requirements of that version.

Performing the Upgrade

SLES upgrades are divided into two main categories: service pack upgrades and major version upgrades. Service pack upgrades are generally simpler and do not require a major version change, while major version upgrades require a more intensive process.

For service pack upgrades, customers need to install the new version of SLES, perform a system update, and then reboot the system. For major version upgrades, customers need to back up their existing system, install the new version of SLES, perform a system update, and then restore their existing data.

Avoiding Pitfalls and Ensuring System Stability

When upgrading SLES, customers must ensure that their systems are stable and properly configured. This includes confirming that their hardware is compatible with the new version of SLES and verifying that their SAP applications are compatible with it. Customers should also ensure that their system is properly backed up and that all necessary data is restored after the upgrade is complete.

KeyCore’s Expertise

At KeyCore, we have the experience and expertise necessary to ensure that your SAP systems are properly upgraded with the latest version of SLES. Our team of experts provides comprehensive services from planning to execution, and can help you minimize the risks associated with upgrading your SLES instances. Contact us today to learn more about how we can help you upgrade your SAP systems with confidence.

Read the full blog posts from AWS

Official Machine Learning Blog of Amazon Web Services

Overview of Machine Learning Applications on Amazon Web Services

Personalize Generative AI Applications with Amazon SageMaker Feature Store

Using Amazon SageMaker Feature Store, it is possible to integrate user profiles and item attributes to generate context-aware personalized content recommendations using large language models (LLMs). This post will guide users through the process of integrating a feature store and an LLM to generate these personalized recommendations.
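
As a rough sketch of the retrieval step, the snippet below fetches a user's profile features from an online feature store with the AWS SDK for JavaScript v3, ready to be interpolated into an LLM prompt. The feature group name and record identifier are hypothetical:

```typescript
import {
  SageMakerFeatureStoreRuntimeClient,
  GetRecordCommand,
} from '@aws-sdk/client-sagemaker-featurestore-runtime';

const client = new SageMakerFeatureStoreRuntimeClient({ region: 'us-east-1' });

// Fetch the latest feature values for one user (hypothetical names).
const response = await client.send(
  new GetRecordCommand({
    FeatureGroupName: 'user-profile-features',
    RecordIdentifierValueAsString: 'user-1234',
  }),
);

// Turn the record into a plain object for prompt templating.
const profile = Object.fromEntries(
  (response.Record ?? []).map((f) => [f.FeatureName, f.ValueAsString]),
);
console.log(profile);
```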

Build an Image-to-Text Generative AI Application using Multimodality Models on Amazon SageMaker

This post provides an overview of popular multimodality models and how to deploy them on Amazon SageMaker. It also discusses the various applications of these models, particularly in real-world scenarios such as zero-shot tag and attribution generation for ecommerce and automatic prompt generation from images.

Improve Prediction Quality in Custom Classification Models with Amazon Comprehend

Amazon Comprehend can be used to build and optimize custom classification models. This post demonstrates how to build a multi-label custom classification model using Amazon Comprehend and provides guidelines on how to prepare the training dataset and tune the model.

Fast and Cost-Effective LLaMA 2 Fine-Tuning with AWS Trainium

Large language models (LLMs) are used for a variety of applications such as question answering, summarization, translation, and more. Amazon SageMaker JumpStart now provides pre-trained Llama 2 models that can be used for these applications.

Simplify Medical Image Classification using Amazon SageMaker Canvas

Analyzing medical images plays a crucial role in diagnosing and treating diseases. Amazon SageMaker Canvas simplifies medical image classification, allowing healthcare professionals to quickly diagnose certain cancers, coronary diseases, and ophthalmologic conditions.

Create an HCLS Document Summarization Application with Falcon using Amazon SageMaker JumpStart

Amazon SageMaker JumpStart enables customers to use generative AI to get more from their data. This post explains how to use Amazon SageMaker JumpStart to build a document summarization application for healthcare and life sciences (HCLS).

Automate Prior Authorization using CRD with CDS Hooks and AWS HealthLake

Prior authorization is a complex process that requires a lot of time and effort. This post explains how to use CDS Hooks and AWS HealthLake to automate prior authorization.

Code Llama Code Generation Models from Meta are now Available via Amazon SageMaker JumpStart

Using Amazon SageMaker JumpStart, customers can now deploy Code Llama foundation models with one click. Code Llama is a large language model (LLM) capable of generating code and natural language from both code and natural language prompts.

Build an End-to-End MLOps Pipeline for Visual Quality Inspection at the Edge – Part 1/2/3

This series of posts explains how to build an end-to-end MLOps pipeline for a visual quality inspection use case at the edge. This pipeline automates the ML process from data labeling to model training and deployment at the edge. AWS IoT Greengrass is used to manage model inference at the edge.

At KeyCore, we specialize in helping customers develop, deploy, and scale ML pipelines on Amazon Web Services. Our team of experts can help you develop the best ML pipeline for your use case and guide you in leveraging the range of machine learning applications on AWS. Contact us today to learn more about how we can help you get the most out of your ML project.

Read the full blog posts from AWS

Announcements, Updates, and Launches

AWS Launches Data and Generative AI Day and Announces Updates and New Features on EC2 and DataZone Services

AWS is hosting a free-to-attend online event, Data and Generative AI Day, on Thursday October 5, 2023. The event will stream simultaneously over LinkedIn Live and YouTube. Generative AI unlocks the potential hidden in the organization’s data and makes it easier to meet unique business goals.

Compute Optimized C7a Instances

The compute optimized Amazon EC2 C6a instances were launched in February 2022 and featured 3rd Gen AMD EPYC (Milan) processors running at frequencies up to 3.6 GHz. Now, AWS has announced the general availability of the compute optimized Amazon EC2 C7a instances, powered by 4th Gen AMD EPYC (Genoa) processors with a maximum frequency of up to 4.0 GHz.

Amazon DataZone Now Generally Available

AWS has made Amazon DataZone generally available. This data management service allows users to catalog, discover, analyze, share, and govern data within their organization. At AWS re:Invent 2022, the service was preannounced and later previewed publicly in March 2023. DataZone makes it easier to collaborate on data projects across organizational boundaries.

KeyCore Can Help With Your AWS Needs

KeyCore, the leading Danish AWS consultancy, provides both professional services and managed services. Our staff have advanced AWS expertise, and we can help with everything from writing code snippets in CloudFormation YAML or AWS API calls using TypeScript and the AWS SDK for JavaScript v3 to full professional and managed service engagements.

Read the full blog posts from AWS

Containers

How to Optimize Container Applications on Amazon EKS

Optimizing WebSocket Applications with API Gateway

WebSocket is a common communication protocol used in web applications to enable real-time, bidirectional data exchange between the client and server. However, when the server must maintain a direct connection with the client, it can limit the server’s ability to scale down when there are long-running clients. When nodes become over-utilized, the server may not be able to provide the performance needed for the clients. To address this, AWS API Gateway can be used to manage WebSocket connections and enable the server to scale down when needed.

API Gateway can be used to route requests from clients to a fleet of WebSocket applications running on Amazon EKS. With API Gateway, each request is routed to a specific instance of an application, and the API Gateway service tracks the number of active connections. This allows the server to control the number of active connections and scale down when needed. Additionally, API Gateway provides authentication and authorization, so that only authorized clients can access the WebSocket applications.
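
To illustrate the server side, this sketch pushes a message to a connected client through the WebSocket API's management endpoint using the AWS SDK for JavaScript v3. The endpoint URL and connection ID are hypothetical:

```typescript
import {
  ApiGatewayManagementApiClient,
  PostToConnectionCommand,
} from '@aws-sdk/client-apigatewaymanagementapi';

// The management endpoint is the API's HTTPS URL (not wss://), stage included.
const client = new ApiGatewayManagementApiClient({
  endpoint: 'https://abc123.execute-api.eu-west-1.amazonaws.com/prod',
});

// Push a payload to one client identified by its connection ID.
await client.send(
  new PostToConnectionCommand({
    ConnectionId: 'abcd1234=',
    Data: Buffer.from(JSON.stringify({ message: 'hello from EKS' })),
  }),
);
```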

Analyzing EKS Fargate Costs with Amazon QuickSight

AWS Fargate is a serverless compute engine for running Amazon Elastic Kubernetes Service (Amazon EKS) and Amazon Elastic Container Service (Amazon ECS) workloads without managing the underlying infrastructure. Fargate simplifies provisioning and scaling secure, isolated, and right-sized compute capacity for containerized applications, making it an attractive solution for teams.

However, without a proper cost analysis, teams may find that they’re not getting the most out of their Fargate resources. To help teams understand their Fargate costs, AWS provides Amazon QuickSight. This service provides teams with detailed cost analytics to help them make informed decisions about their use of Fargate. QuickSight can also be used to analyze the cost of other AWS services.

Starting Spring Boot Applications Faster on AWS Fargate

Fast startup times are essential for Spring Boot applications running on AWS Fargate to react quickly to disruptions. A number of optimization techniques can be employed to optimize Spring Boot for Fargate. One such optimization is the open source Seekable OCI (SOCI) project, which enables teams to start a Spring Boot application in as little as 4 seconds.

SOCI works by adding an index to the container image that allows AWS Fargate to lazily load the image's contents, so the container can start before the entire image has been downloaded. By using SOCI, teams can benefit from the advantages of a serverless compute engine while significantly reducing their application startup times.

Upgrading Amazon EKS Worker Nodes with Karpenter Drift

Karpenter is an open-source Kubernetes cluster autoscaler that provisions nodes of the right size in response to unschedulable pods. It does this based on aggregated CPU, memory, and volume requests, as well as other Kubernetes scheduling constraints (e.g., affinity and pod topology spread constraints). This makes it easier for teams to manage their infrastructure.

Karpenter also simplifies node upgrades through a feature called drift. Karpenter detects when a node's actual configuration has drifted from its desired specification, for example after a new AMI is released, and gracefully replaces the drifted node with a new one. With drift, teams can upgrade their worker nodes without manually cycling capacity or recreating infrastructure.

Amazon EKS Extended Support for Kubernetes Versions

Amazon EKS now offers extended support for Kubernetes versions. Customers can now run Amazon EKS clusters on a Kubernetes version for a period of up to 26 months from the time that version is generally available on Amazon EKS. This Extended Support feature is currently available as a free preview for all Amazon EKS customers.

This Extended Support feature will allow customers to better manage their Kubernetes workloads and ensure their applications remain up-to-date with the latest security and performance enhancements. With Extended Support, customers can benefit from the peace of mind of having a longer-term, supported version of Kubernetes.

Using Shared VPC Subnets in Amazon EKS

Organizations have come to rely on shared Amazon Virtual Private Clouds (VPCs) to simplify network administration and reduce costs. With VPC sharing, a central account owns the VPC and shares its subnets with other accounts in the same organization.

Amazon EKS now supports the use of shared VPC subnets. This means customers can run their EKS clusters in subnets that have been shared from a central networking account. With shared VPC subnets, customers benefit from centralized network administration and improved resource utilization, and can simplify the management of multiple EKS clusters.

At KeyCore, we understand that leveraging the full potential of AWS can be a complex challenge. That’s why we offer both professional services and managed services to help our customers make the most of their AWS investments. Our experienced team of AWS experts can help you optimize your container applications for EKS, manage your Fargate costs, speed up Spring Boot applications, upgrade EKS nodes, and more. Contact us today to learn more about how we can help you get the most out of your AWS services.

Read the full blog posts from AWS

AWS Quantum Technologies Blog

Using Quantum Algorithms for Use Cases: A Comprehensive Guide

This comprehensive guide is designed for quantum computing researchers and customers who are looking to explore how quantum algorithms will apply to their use cases. It introduces the concept of constructing an “end-to-end” quantum algorithm, providing an overview of the different steps, and explains the complexities of using quantum algorithms.

Understanding Quantum Algorithms

Quantum algorithms are computing techniques that rely on the laws of quantum mechanics to significantly increase the speed and efficiency of certain calculations. Quantum algorithms are expected to become more widely used as quantum computing continues to develop and become more widely available.

Quantum Algorithms vs Classical Algorithms

Quantum algorithms can offer dramatic improvements over classical algorithms in certain tasks, such as factorization and optimization. Quantum algorithms can also help with problems that are impossible to solve with classical algorithms, such as simulating quantum systems.

Constructing an “End-to-End” Quantum Algorithm

Constructing an “end-to-end” quantum algorithm involves multiple steps, from problem definition to implementation and analysis. Each step has its own challenges and considerations, such as the choice of hardware and software, the design of the quantum circuit, and the optimization of the code.

Using Quantum Algorithms for Your Use Cases

The guide, "Quantum Algorithms: A survey of applications and end-to-end complexities," is designed to help quantum computing researchers and customers explore how quantum algorithms can be used for their own use cases. It provides a comprehensive overview of the steps involved and the complexities of using quantum algorithms.

Let KeyCore Help You with Your Quantum Computing Needs

At KeyCore, we are committed to helping our customers make the most of quantum computing opportunities. Our team of experienced professionals can provide a range of services to assist you in harnessing the power of quantum computing, such as consulting, custom development, and managed services. To learn more about how we can help you take advantage of quantum computing, contact us today.

Read the full blog posts from AWS

Official Database Blog of Amazon Web Services

Run Ethereum Nodes on AWS for Digital Records Management

Managed Blockchain and AWS Partners

Amazon Managed Blockchain and many partners of AWS offer a convenient way to use Ethereum nodes without running your own infrastructure. This is ideal for regular blockchain activities, such as running archive nodes or participating in Ethereum staking.

Consolidating Multi-Tenant Cloud Asset Data Store on Amazon Aurora MySQL

VMware Tanzu CloudHealth consolidated a multi-tenant, self-managed, 166-node fleet of sharded MySQL databases onto Amazon Aurora MySQL-Compatible Edition and Amazon RDS Proxy. This was done to support long-term, continuous, multi-factor data growth on their platform while improving reliability and simplifying operations.

Configuring Database Activity Streams for Monitoring in IBM Guardium

To help with monitoring Amazon Aurora PostgreSQL-Compatible Edition database activity streams (DAS), here are the steps to follow for setting up IBM Guardium with version 11.5. Aurora PostgreSQL-Compatible is a fully managed, PostgreSQL-compatible, ACID-compliant relational database engine that combines the speed, reliability, and manageability of Amazon Aurora with the security-enhanced features of PostgreSQL.

Unlocking Security and Efficiency with Blockchain and Smart Contracts

Blockchain technology can revolutionize the management, storage, and sharing of personal digital records, thanks to features such as immutability, transparency, security, and decentralization. Potential uses of blockchain technology for personal digital records include creating decentralized digital identities, secure records storage, identity theft prevention, and secure sharing of personal data with third parties.

Implementing Auto-Increment with Amazon DynamoDB

When developing an application with Amazon DynamoDB, sometimes you want new items inserted into a table to have an ever-increasing sequence number. DynamoDB does not provide auto-increment as an out-of-the-box feature, but you can use other features to create this functionality in your application.
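
One common way to build this is an atomic counter: a dedicated item whose numeric attribute is incremented with an ADD update expression, with the returned value used as the new item's sequence number. Below is a minimal sketch with the AWS SDK for JavaScript v3; the table and attribute names are hypothetical:

```typescript
import { DynamoDBClient, UpdateItemCommand } from '@aws-sdk/client-dynamodb';

const client = new DynamoDBClient({});

// Atomically increment the counter item and read back the new value.
const result = await client.send(
  new UpdateItemCommand({
    TableName: 'orders',
    Key: { pk: { S: 'COUNTER' } },
    UpdateExpression: 'ADD #seq :one',
    ExpressionAttributeNames: { '#seq': 'sequence' },
    ExpressionAttributeValues: { ':one': { N: '1' } },
    ReturnValues: 'UPDATED_NEW',
  }),
);

// Use this as the sequence number for the next inserted item, e.g. "42".
const nextSequence = result.Attributes?.sequence.N;
console.log(nextSequence);
```

Note that a single counter item concentrates all writes on one partition key, so this pattern is best suited to modest insert rates.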

KeyCore Can Help

At KeyCore, we provide professional services and managed services to help companies develop and maintain applications using Amazon DynamoDB, Amazon Aurora, and other AWS services. Contact us today to learn more about our offerings and how we can help you leverage blockchain and smart contracts for your applications.

Read the full blog posts from AWS

AWS Cloud Financial Management

Using AWS Cost Explorer for Custom Billing Data Analysis

AWS Cost Explorer is an invaluable tool for analyzing billing data. Its pro forma cost support enables end users of AWS resellers and large organizations to better understand and analyze costs at their negotiated rates. With Cost Explorer, users can gain insights into their total spend over time, identify cost drivers, and optimize their spending.

What Can Cost Explorer Do?

Cost Explorer offers a range of features that make it easy to analyze custom billing data. It can show users their total spend over time, allowing them to better control their costs and spot any unexpected increases. Cost Explorer also enables users to compare their costs with those of other organizations and gain insight into cost drivers. Users can also drill down into their cost data to find out exactly what is driving their costs and identify opportunities for optimization.
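
For programmatic access to the same data, the sketch below pulls monthly cost per service through the Cost Explorer API with the AWS SDK for JavaScript v3; the date range is illustrative:

```typescript
import {
  CostExplorerClient,
  GetCostAndUsageCommand,
} from '@aws-sdk/client-cost-explorer';

// The Cost Explorer API is served from us-east-1.
const client = new CostExplorerClient({ region: 'us-east-1' });

// Monthly unblended cost, grouped by service (illustrative date range).
const response = await client.send(
  new GetCostAndUsageCommand({
    TimePeriod: { Start: '2023-07-01', End: '2023-10-01' },
    Granularity: 'MONTHLY',
    Metrics: ['UnblendedCost'],
    GroupBy: [{ Type: 'DIMENSION', Key: 'SERVICE' }],
  }),
);

for (const period of response.ResultsByTime ?? []) {
  for (const group of period.Groups ?? []) {
    console.log(group.Keys, group.Metrics?.UnblendedCost?.Amount);
  }
}
```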

Benefits of Cost Explorer

Cost Explorer offers a range of benefits to end users of AWS resellers and large organizations. It gives users visibility into their total spend over time and helps them identify cost drivers. With Cost Explorer, users can also drill down into their cost data to better understand and control their costs. Additionally, users can compare their costs with those of other organizations to ensure they remain competitive.

KeyCore and Cost Explorer

At KeyCore, we provide professional and managed services that aim to enable our customers to better control and optimize their AWS spending. We can help you analyze and understand your custom billing data by leveraging the features of AWS Cost Explorer. Our team of AWS experts can also help you monitor your costs and take advantage of any cost-saving opportunities that arise. Contact us today to learn more about how we can help you make the most out of your AWS spending.

Read the full blog posts from AWS

AWS for Games Blog

Developing and Running a Game Platform on AWS

Cloud computing has enabled companies to reinvent themselves, providing customers more value or creating new businesses. The gaming industry is no exception, and for developers and publishers, this means the opportunity to capitalize on the advantages of cloud technology. In this blog, we’ll explore how AWS can help you make the jump from game development to running a game platform.

Authentication and Authorization

Managing authentication and authorization of players for a mobile game presents unique challenges. These challenges are due to the use of mobile devices and the need to secure the various backend systems used by a modern mobile game. AWS offers several services that can help you manage user authentication and authorization.

The first step is AWS Identity and Access Management (IAM). IAM allows you to create users with individual permissions, helping you grant access to the right people. You can also use Amazon Cognito, a service that makes it easy to add authentication to your mobile and web applications. With Cognito, you can easily add sign-up and sign-in features to your game and protect it from unauthorized access.

Amazon Cognito also helps you authenticate players across multiple devices, ensuring that they can access their data on any device. You can also use AWS IAM Identity Center (the successor to AWS Single Sign-On) to securely authenticate players in your game, allowing them to use their existing corporate credentials without having to create a new set of credentials.
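
As a small illustration of the sign-up flow, the sketch below registers a player against a Cognito user pool using the AWS SDK for JavaScript v3. The app client ID and user details are hypothetical:

```typescript
import {
  CognitoIdentityProviderClient,
  SignUpCommand,
} from '@aws-sdk/client-cognito-identity-provider';

const client = new CognitoIdentityProviderClient({ region: 'eu-west-1' });

// Register a new player (hypothetical client ID and credentials).
await client.send(
  new SignUpCommand({
    ClientId: 'your-app-client-id',
    Username: 'player-42',
    Password: 'CorrectHorseBattery1!',
    UserAttributes: [{ Name: 'email', Value: 'player42@example.com' }],
  }),
);
```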

AWS for Games at Unreal Fest

Unreal Fest is an annual conference organized by Epic Games, the company behind Unreal Engine. It takes place in New Orleans in October and provides an opportunity for developers, designers, artists and professionals from other industries to gain insights into the latest technologies. AWS for Games will be there to talk about how game developers can use AWS to run their games more efficiently, reduce costs, and increase performance.

At Unreal Fest, AWS for Games will have experts on hand to provide advice on how to get started with AWS and the best practices for running a game platform on the cloud. You’ll also be able to learn about the latest AWS services and how they can help you optimize your game.

KeyCore and AWS for Games

At KeyCore, we are dedicated to helping our customers make the most of their clouds. Our advanced AWS services and managed services can help you take full advantage of AWS for your game platform. We can provide guidance on how to get started, help you in setting up the right architecture for your game platform, and provide assistance with integrating with AWS services.

Whether you’re a game developer or a publisher, our team of AWS experts can help you make the most of AWS for Games. Get in touch to learn more about how we can help you.

Read the full blog posts from AWS

Microsoft Workloads on AWS

Accelerate Amazon EC2 Windows Container Host Launch Time with Fast Launch

In a Microsoft Windows container environment where Amazon EC2 Auto Scaling resizes cluster capacity, launch times can take up to eight minutes from the moment scaling is triggered to the time the Windows container is responding to traffic. This can create a bottleneck in application scaling operations and limit the availability of your application.

The EC2 Fast Launch feature helps reduce this launch time to less than 60 seconds. Fast Launch works by pre-provisioning a pool of snapshots from your Windows AMI: EC2 launches temporary instances, runs them through the time-consuming Windows initialization steps, and then snapshots the result. New container hosts launched from these pre-provisioned snapshots skip those steps entirely.

To set up Fast Launch, enable it on the Windows AMI referenced by your Auto Scaling group's launch template. When enabling the feature, you specify how many pre-provisioned snapshots EC2 should keep on hand (the target resource count) and how many instances EC2 may launch in parallel to replenish the pool.

Once Fast Launch is enabled, every scale-out event launches container hosts from the pre-provisioned snapshots rather than going through the full Windows launch sequence. This reduces the launch time for a Windows container host significantly, providing improved application scalability and availability.
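
As a sketch of enabling the feature programmatically, the snippet below turns on Fast Launch for a Windows AMI with the AWS SDK for JavaScript v3. The AMI ID and capacity numbers are hypothetical:

```typescript
import { EC2Client, EnableFastLaunchCommand } from '@aws-sdk/client-ec2';

const client = new EC2Client({ region: 'eu-west-1' });

// Keep five pre-provisioned snapshots on hand for this Windows AMI,
// replenishing with up to six parallel launches (hypothetical values).
await client.send(
  new EnableFastLaunchCommand({
    ImageId: 'ami-0123456789abcdef0',
    ResourceType: 'snapshot',
    SnapshotConfiguration: { TargetResourceCount: 5 },
    MaxParallelLaunches: 6,
  }),
);
```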

At KeyCore, we provide both professional services and managed services to help our customers take advantage of the Fast Launch feature. Our experienced team of AWS experts can help you set up Fast Launch for your Auto Scaling Group, and ensure that you are able to take full advantage of the improved scalability and availability. Contact us today for more information.

Read the full blog posts from AWS

Official Big Data Blog of Amazon Web Services

Simplify Data Transfer from Google BigQuery to Amazon S3 with Amazon AppFlow

Today, it is essential that organizations have the ability to easily move and analyze data across different platforms. Amazon AppFlow, a fully managed data integration service, has been designed to streamline data transfer between AWS services, software-as-a-service (SaaS) applications, and now Google BigQuery. This blog post will explore Amazon AppFlow’s new Google BigQuery connector and discuss how it simplifies the process of transferring data from Google’s data warehouse to Amazon Simple Storage Service (Amazon S3). This can provide significant benefits for data professionals and organizations, such as the democratization of multi-cloud data access.

Define Per-Team Resource Limits for Big Data Workloads with Amazon EMR Serverless

When distributing cloud resources between different teams (e.g., development, testing, production) or line-of-business users, customers often face the challenge of ensuring that sufficient resources are consistently available to production workloads and critical teams, while also preventing ad hoc jobs from consuming too many resources. Amazon EMR Serverless is designed to help customers manage these challenges by allowing them to define per-team resource limits for big data workloads.
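
As a sketch, the snippet below creates a per-team EMR Serverless application with a capacity ceiling using the AWS SDK for JavaScript v3; the team name and limits are illustrative:

```typescript
import {
  EMRServerlessClient,
  CreateApplicationCommand,
} from '@aws-sdk/client-emr-serverless';

const client = new EMRServerlessClient({ region: 'eu-west-1' });

// One application per team; maximumCapacity caps what its jobs can consume.
await client.send(
  new CreateApplicationCommand({
    name: 'analytics-team',
    type: 'SPARK',
    releaseLabel: 'emr-6.12.0',
    maximumCapacity: {
      cpu: '400 vCPU',
      memory: '3000 GB',
    },
  }),
);
```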

Unlock Data Across Organizational Boundaries with Amazon DataZone

Amazon DataZone is now Generally Available. This service enables customers to discover, access, share, and govern data at scale across organizational boundaries, reducing the effort of making data and analytics tools accessible to everyone in the organization. Data engineers, data scientists, and data analysts can now share and access data quickly, making it easier to get value from data faster.

Automate Legacy ETL Conversion to AWS Glue with Cognizant Data and Intelligence Toolkit (CDIT)

Cognizant’s Data & Intelligence Toolkit (CDIT) ETL Conversion Tool can help customers automate the process of converting legacy ETL code to AWS Glue. The tool supports a range of features that can reduce the time and effort required for the conversion, allowing customers to get up and running faster.

Query Big Data with Resilience Using Trino in Amazon EMR with Amazon EC2 Spot Instances

Trino with Amazon EMR provides improved resiliency for running ETL and batch workloads on Spot Instances, enabling customers to save costs. This post showcases the resilience of Amazon EMR with Trino using fault-tolerant configuration to run long-running queries on Spot Instances. The post also demonstrates how AWS Fault Injection Simulator (AWS FIS) can be used to simulate Spot interruptions on Trino worker nodes.

Migrate an Existing Data Lake to a Transactional Data Lake with Apache Iceberg

Data lakes are centralized repositories that can be used to store all structured and unstructured data at any scale. With data lakes, customers have the flexibility to store their data as-is, without having to first structure the data, and then run different types of analytics to get better business insights. This post will discuss how to migrate an existing data lake to a transactional data lake using Apache Iceberg.

Optimize Apache Iceberg to Solve the Small Files Problem in Amazon EMR

Iceberg provides a compaction utility that can compact small files at a table or partition level. However, this approach requires customers to implement the compaction job using their preferred job scheduler or manually triggering the compaction job. This post will discuss a new Iceberg feature that can be used to automatically compact small files while writing data into Iceberg tables using Spark on Amazon EMR or Amazon Athena.

Non-JSON Ingestion with Amazon Kinesis Data Streams, Amazon MSK, and Amazon Redshift Streaming Ingestion

Organizations have to manage an ever-increasing range of data formats, from binary serialization to compact structures like Protobuf. This post will discuss how customers can use Amazon Kinesis Data Streams, Amazon MSK, and Amazon Redshift Streaming Ingestion to ingest non-JSON data streams and derive insights from them.

How KeyCore Can Help

At KeyCore, our team of AWS certified professionals can help customers take advantage of the features discussed in this blog post. Our experts have extensive experience designing, developing, and deploying big data and analytics architectures on AWS. We can help customers better understand the benefits and challenges associated with different solutions, and help them optimize their architecture to get the most value out of their data.

Read the full blog posts from AWS

AWS Compute Blog

Building a Serverless Document Chat with AWS Lambda and Amazon Bedrock

Large language models (LLMs) are increasingly popular in the AI community because of their general-purpose capabilities in natural language processing tasks. From text generation and summarization to translation and analysis, they can be trained on large datasets to develop a broad generalist knowledge base. This blog post will discuss how to use Amazon Bedrock with AWS Lambda to build a serverless document chat application.

What is Amazon Bedrock?

Amazon Bedrock is a fully managed service that provides access to foundation models from Amazon and leading AI companies through a single API. It allows developers to build and scale generative AI applications without prior machine learning experience and without managing any infrastructure. With Amazon Bedrock, developers can use pre-trained foundation models as-is or privately customize them with their own data.

How Does Amazon Bedrock Work?

Amazon Bedrock exposes foundation models through a unified API. An application sends a prompt, for example a question together with relevant document text, to the model of its choice; the model processes the prompt and returns a generated response. Because the service is serverless, there is no model hosting or infrastructure to manage.

Building a Serverless Document Chat

Using Amazon Bedrock with AWS Lambda, developers can quickly build a serverless document chat application. A Lambda function receives each incoming user request, combines the question with the relevant document content into a prompt, and passes it to a foundation model through the Bedrock API. The model generates a response, which the function then returns to the user.
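
A minimal sketch of such a Lambda handler is shown below, using the AWS SDK for JavaScript v3 and the request format of the Anthropic Claude model available on Bedrock at the time of writing; the event shape is hypothetical:

```typescript
import {
  BedrockRuntimeClient,
  InvokeModelCommand,
} from '@aws-sdk/client-bedrock-runtime';

const client = new BedrockRuntimeClient({ region: 'us-east-1' });

// Hypothetical event shape: { question: string }.
export const handler = async (event: { question: string }) => {
  const response = await client.send(
    new InvokeModelCommand({
      modelId: 'anthropic.claude-v2',
      contentType: 'application/json',
      accept: 'application/json',
      body: JSON.stringify({
        prompt: `\n\nHuman: ${event.question}\n\nAssistant:`,
        max_tokens_to_sample: 500,
      }),
    }),
  );
  // The response body is a byte stream containing the model's JSON output.
  return JSON.parse(new TextDecoder().decode(response.body));
};
```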

Using KeyCore to Help Build a Serverless Document Chat

At KeyCore, we offer both professional services and managed services to help customers build serverless document chat applications using Amazon Bedrock and AWS Lambda. We can provide training and guidance on how to use Amazon Bedrock to build custom models, as well as provide support in deploying and managing the serverless function. Our team of AWS experts can also help with debugging and optimising performance to ensure that the application runs smoothly and meets customer needs.

Read the full blog posts from AWS

AWS for M&E Blog

Secure Media Streaming with Private Networking using AWS Media Services

Many organizations are looking for ways to keep their live media streams secure, while still allowing controlled access. With Amazon Web Services (AWS) Media Services, organizations now have the opportunity to do just that. AWS Media Services can be used to store media assets in Amazon Simple Storage Service (S3), which can be securely accessed and delivered through private networking.

Additionally, there are third-party solutions that can help offload camera footage directly to S3, such as OffShoot Pro by Hedge. OffShoot Pro is an asset management and cloud offloading tool designed to streamline the media backup process. It is integrated with the AWS SDK for JavaScript v3, allowing for near-real-time media uploads to S3, and offers support for a variety of industry-standard cameras.

Finally, content creators can make ad-supported streaming more accessible to content owners through TVCoins. TVCoins is a technology startup that leverages AWS to break down the barriers associated with launching FAST (Free Ad-Supported Streaming Television). TVCoins helps content owners reduce the cost and time associated with getting their content up and running.

At KeyCore, we provide professional services and managed services to help organizations take full advantage of the benefits of AWS Media Services. Our solutions can help you deliver secure, private streams, reduce time to market, and maximize the efficiency of content production. Contact us today to learn more about our offerings and how we can help you benefit from the power of AWS Media Services.

Read the full blog posts from AWS

AWS Developer Tools Blog

Important Update to AWS SDK for Java v2, AWS SDK for .NET v3, and AWS Tools for PowerShell

On September 20, 2023, AWS released a change in parameter type for the S3 GetObjectAttributes API, which could require a type definition change in code that customers use to access the API. This change affects those using the AWS SDK for Java v2.x, the AWS SDK for .NET v3.x, and AWS Tools for PowerShell v4.x.

What is S3 GetObjectAttributes API?

The AWS S3 GetObjectAttributes API is used to retrieve information about a specified object stored in S3. This information includes the object’s size, upload date, and storage class. It is a powerful tool for developers working with AWS S3 and can be used to retrieve important information about objects stored in S3.

What is the Change in Parameter Type for the S3 GetObjectAttributes API?

The change in parameter type for the S3 GetObjectAttributes API affects customers using the AWS SDK for Java v2.x, AWS SDK for .NET v3.x, or AWS Tools for PowerShell v4.x. The change requires customers to modify their code to accommodate the updated parameter type. The change was implemented to improve functionality and accuracy of the API.

How Does the Change Affect Me?

If you are using the AWS SDK for Java v2.x, AWS SDK for .NET v3.x, or AWS Tools for PowerShell v4.x to access the S3 GetObjectAttributes API, you may need to modify your code to accommodate the new parameter type. Failing to update your code may cause unexpected behavior or errors when attempting to access the API.
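
For reference, a call to the API with the AWS SDK for JavaScript v3 looks roughly like the sketch below; the bucket and key are hypothetical:

```typescript
import { S3Client, GetObjectAttributesCommand } from '@aws-sdk/client-s3';

const client = new S3Client({ region: 'eu-west-1' });

// ObjectAttributes selects which attributes S3 should return.
const attrs = await client.send(
  new GetObjectAttributesCommand({
    Bucket: 'my-bucket',
    Key: 'reports/2023-10.csv',
    ObjectAttributes: ['ObjectSize', 'StorageClass', 'ETag'],
  }),
);

console.log(attrs.ObjectSize, attrs.StorageClass, attrs.LastModified);
```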

KeyCore Can Help

At KeyCore, our AWS Certified team of professionals can assist you with updating your code to accommodate the new parameter type for the S3 GetObjectAttributes API. We provide both professional services and managed services for AWS, and our team is highly experienced in helping customers get the most out of their AWS environments. To find out more, visit our website at https://www.keycore.dk.

Read the full blog posts from AWS

AWS Architecture Blog

Important Improvements to the AWS Well-Architected Framework Guidance

The AWS team is proud to share updates to the AWS Well-Architected Framework guidance. This update pays special attention to the six pillars of the framework: Operational Excellence, Security, Reliability, Performance Efficiency, Cost Optimization, and Sustainability.

More Prescriptive Guidance and Recommendations

The new and updated best practices in the AWS Well-Architected Framework have been made more prescriptive. This means that the guidance now includes enhanced recommendations and step-by-step instructions on how to implement reusable architecture patterns, use AWS services, and design solutions to achieve your business requirements.

Use AWS Services to Reach Your Goals Faster

AWS Well-Architected Framework guidance helps customers reduce risk and make better decisions as they move to the cloud. When paired with the right AWS services, the guidance helps customers reach their goals faster. AWS services such as Amazon EC2, Amazon S3, Amazon RDS, and Amazon VPC are just a few of the services that customers can use to build and deploy workloads in the cloud with confidence.

KeyCore Can Help with AWS Adoption

KeyCore provides professional and managed services to customers that are looking to adopt AWS. Our consultants have advanced AWS expertise and can provide technical details, references to AWS documentation, and code snippets in CloudFormation YAML and AWS API calls using TypeScript and the AWS SDK for JavaScript v3. Let us help you bring your applications to the cloud and leverage the power of AWS.

Read the full blog posts from AWS

AWS Partner Network (APN) Blog

Unlock the Potential of Smart Factories with Infosys ConnectedOps on AWS

Industries are rapidly evolving, placing a strong emphasis on agility, flexibility, sustainability, and cost-effectiveness. ConnectedOps on AWS is a vital solution from Infosys Cobalt that effectively tackles industry-specific challenges. It stands on four key pillars: digital product execution, digital industrial asset management, digital workforce, and digital sustainability. The solution leverages AWS IoT services and the AWS Industrial Data Fabric.

Revolutionizing SAP Payment Reconciliation with EPI-USE ERP PAY on AWS

EPI-USE ERP PAY provides end-to-end automated reconciliation to SAP with a consolidated view on all external digital payment touchpoints in the organization. EPI-USE ERP Pay is designed and built by EPI-USE Payment Services to address challenges for SAP clients without the need to change Payment Service Providers or introduce additional software in the client landscape. Additionally, the AWS SDK for SAP ABAP supports SAP NetWeaver ABAP version 7.4 and above.

Amplifying Business Process Automations with UiPath and Amazon SageMaker

Organizations are turning to intelligent automation technologies to streamline their business processes and improve efficiency. UiPath Business Automation Platform and Amazon SageMaker can be integrated to help businesses automate complex processes, improve decision making, and drive innovation by leveraging the power of AI. The solution allows customers to bring machine learning inference from SageMaker directly into their business automation.

Learn About Modernizing on AWS Using Prescribed Pathways in New Partner Training Series

Customers are continuing their cloud adoption journey by modernizing and optimizing workloads to reap benefits by leveraging AWS managed services. AWS built new training courses and learning journeys exclusively for AWS Partners. The digital course “Using Modernization Pathways to Modernize on AWS” is accessible via AWS Skill Builder and provides an overview of all six modernization pathways and where they fit in the migration landscape.

Driving Supply Chain Operations at Speed with IBM Consulting Supply Chain Ensemble on AWS

IBM Consulting and AWS have collaborated to co-create a supply chain framework after analyzing and refining 100+ use cases, focusing on risk identification and management of exceptions. IBM Consulting Supply Chain Ensemble on AWS helps customers overcome common challenges and deliver efficiency and optimization within their supply chains. IBM is an AWS Premier Tier Services Partner that helps customers harness the power of innovation and drive their business transformation.

Implementing an Operational Data Mesh with Palantir Foundry on AWS to Transform Your Organization

Data architectures and strategies are responding to the need for discoverability and consumer desire to directly connect with producers. Data mesh is one such approach and provides a methodology for how organizations can organize around data domains by delivering data as a product. Palantir Foundry runs on AWS to help customers deliver and transform their data architectures through such an approach while leveraging and building on existing investments.

Unlocking Innovation: A Closer Look at Deloitte’s Generative AI Solutions on AWS with Amazon Bedrock

Amazon and Deloitte have a combined 40 years of experience in the AI space, and continue to innovate and use AI technologies to change how businesses work, grow, and thrive. Amazon Bedrock is the easiest way to build and scale with foundation models (FMs), and in this post, we’ll share how Deloitte can help customers benefit from AWS’s generative AI offerings.

How DXC Helped a Customer Transform its Monolithic Application into Microservices on AWS

Outdated technology can cause numerous issues, and it isn’t easy to find viable solutions that can integrate with or are cost-effective for legacy applications. Modernizing legacy systems is the right solution for engaging with and attracting new customers. DXC Technology is migrating monolithic Java applications from on-premises to AWS, providing customers with an automated, cloud-based solution.

Announcing the 2023 AWS Partner Award Winners in India

The regional 2023 AWS Partner Awards India ceremony was held in Goa, celebrating partners that have achieved successful customer outcomes by leveraging the breadth and depth of AWS. Please join us in congratulating these AWS Partners for their success!

Malware Scanning for Regulated Workloads on AWS with Cloud Storage Security

Many of the requirements for meeting and maintaining a secure environment can be met by using AWS FedRAMP-authorized regions and services. Antivirus for Amazon S3 by Cloud Storage Security can be used to automate malware scanning for application workflows or data ingestion pipelines to achieve data security and compliance.

Accelerating Time-to-Compliance in HCLS Through Automated FDA Forms Processing with AI on AWS

In the healthcare and life sciences (HCLS) sector, companies must comply with regulatory statutes. Provectus aided PSC Biotech in integrating AI/ML and automation into the document processing pipeline, a crucial step towards delivering the highest standard of regulatory compliance services.

State Farm Increases Efficiency and Optimization by Integrating Control-M with AWS Mainframe Modernization Service

State Farm is leading the way in modernizing its mainframe application by embracing the cloud and adopting robust workload automation tools. This modernization journey highlights the transformative power of cloud computing and intelligent workload automation in today’s competitive business landscape.

Automating the Know Your Customer Process Using Capgemini’s AI-Powered Solution on AWS

Financial institutions use “Know Your Customer” (KYC) to identify and verify a customer’s identity prior to providing any financial service. Capgemini’s KYC solution helps institutions automate identity documents validation, extraction of information present in them, and forgery detection using AI. It provides customers an extensible automated solution for validating government-issued documents while reducing the overall time and manual intervention required to onboard customers.

KeyCore Services

At KeyCore, we provide professional and managed services to help our clients with their AWS modernizations. Our team of cloud experts has over a decade of experience with AWS and can help you reap the benefits of cloud optimization and automation. We use a combination of AWS best practices, cost optimization strategies, automation, and more to help our customers achieve their modernization goals. Contact us today to get started!

Read the full blog posts from AWS

AWS HPC Blog

Designing with AI and CFD Simulations on AWS

Rapidly exploring new design concepts in automotive and aerospace is possible through combining generative AI with conventional physics-based CFD simulations on AWS. In this post, we’ll show how this process works and how KeyCore can help.

Exploring Design Concepts

With generative AI, it’s possible to create a design process where a single image can be used to explore and visualize possible design concepts. This can be used in automotive and aerospace industries to quickly and easily create and iterate on new ideas. By combining generative AI with physics-based CFD simulations on AWS, we can take these design concepts and validate them, resulting in more accurate and reliable designs.

AI and CFD Simulations on AWS

AI and CFD simulations can be used together to create a powerful design process. AI can create a huge number of potential designs, which can then be validated using conventional physics-based CFD simulations. This allows for rapid exploration and validation of new design concepts. By running the simulations on AWS, you can quickly and easily scale up the computing power needed to process the simulations.

KeyCore and AWS

At KeyCore, we specialize in providing professional and managed services on AWS. With our deep understanding of AWS, we can provide the expertise necessary to set up and run AI and CFD simulations on AWS. We can help our clients with the planning, design, and implementation of a secure and reliable AWS platform that is optimized for running AI and CFD simulations.

Read the full blog posts from AWS

AWS Cloud Operations & Migrations Blog

Unlock the Power of Automation with AWS CloudOps

Setup of AWS Application Migration Service and Elastic Disaster Recovery

AWS Application Migration Service (AWS MGN) is an automated lift-and-shift service that simplifies server migrations to AWS at scale. This makes it easier for customers to move from their current infrastructure to the cloud. AWS Elastic Disaster Recovery (AWS DRS) helps increase the resilience of both on-premises and cloud-based applications by replicating data to AWS. To use either AWS MGN or AWS DRS, customers must complete manual setup tasks. KeyCore can help with this setup process, ensuring that customers are successful in their migration projects.

Transitioning from Migration to Modernization on the Cloud

Migrating to the cloud is the first step in modernizing an IT landscape. Once the migration is complete, businesses can benefit from a more agile, secure, and modern environment. However, some organizations may experience a slowdown in the momentum that was built during migration, leading to a stall in the process. KeyCore can help businesses overcome these challenges and unlock the potential of the cloud with AWS Mainframe Modernization.

Strategizing Mainframe Scheduler Migration to AWS

Mainframe environments often involve complex batch processing tasks. As mainframe applications are migrated to AWS using AWS Mainframe Modernization service, similar batch processing capabilities are needed. This blog will explain the approaches and patterns for selecting and migrating a mainframe job scheduler to AWS. KeyCore can assist with this process, ensuring a successful mainframe migration.

Use Lambda-backed Custom Resources to Reduce Overhead in a Multi-Account Environment

AWS CloudFormation simplifies the process of provisioning AWS and third-party resources. However, some workloads require custom logic or inputs beyond standard parameter values. Lambda-backed custom resources are an often overlooked and useful CloudFormation feature. With this feature, businesses can take advantage of the power of automation to reduce overhead in a multi-account environment. KeyCore’s expert cloud consultants can help customers use this feature to streamline their operations.
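
A minimal sketch of such a handler is shown below: CloudFormation invokes the function with a pre-signed ResponseURL, and the function must report success or failure back to that URL for the stack operation to proceed. The custom logic and output values are placeholders:

```typescript
import type { CloudFormationCustomResourceEvent, Context } from 'aws-lambda';

export const handler = async (
  event: CloudFormationCustomResourceEvent,
  context: Context,
) => {
  let status: 'SUCCESS' | 'FAILED' = 'SUCCESS';
  let data: Record<string, string> = {};

  try {
    if (event.RequestType === 'Create' || event.RequestType === 'Update') {
      // Placeholder for custom logic, e.g. a lookup in another account.
      data = { ExampleOutput: 'computed-value' };
    }
    // Deletes fall through and simply report success.
  } catch {
    status = 'FAILED';
  }

  // CloudFormation waits for this PUT before continuing the stack operation.
  await fetch(event.ResponseURL, {
    method: 'PUT',
    body: JSON.stringify({
      Status: status,
      PhysicalResourceId: context.logStreamName,
      StackId: event.StackId,
      RequestId: event.RequestId,
      LogicalResourceId: event.LogicalResourceId,
      Data: data,
    }),
  });
};
```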

Scale Workload Reviews with the New Review Templates Feature in the AWS Well-Architected Tool

The AWS Well-Architected Tool provides a way to define and review workloads based on the latest AWS architectural best practices. This helps customers identify areas of strength and improvement in their workloads. With the new review templates feature, businesses can easily scale their workload reviews and receive an improvement plan detailing any high or medium risk issues. KeyCore can provide the guidance and expertise necessary to optimize workloads and get the most out of the AWS Well-Architected Tool.

Migrate a WordPress Blog from Azure to AWS Using AWS Application Migration Service

Businesses are constantly looking for ways to optimize their costs and leverage the benefits of cloud computing. As such, many organizations need to migrate their virtual machines from one cloud provider to another. For example, moving from Azure or Google Cloud Platform (GCP) to AWS. AWS Application Migration Service simplifies the process of migrating WordPress blogs from Azure to AWS. KeyCore can provide the expertise to make sure that the migration is successful.

Streamline AWS Application Migration Service Replication Agent Deployment Using MGN Connector

AWS Application Migration Service (AWS MGN) is the recommended service for migrations to AWS. It makes it easier to migrate source servers from different platforms to AWS. Over the past year, AWS has introduced several new features to MGN, designed to help customers with their migration projects. KeyCore can help customers take advantage of these features, streamlining their AWS migration and ensuring a successful outcome.

Learn How to Design Landing Zone Architectures with New AWS Control Tower Training

Designing and building a landing zone is a crucial step in the migration journey to the AWS cloud. A well-architected landing zone helps accelerate migration and simplifies the management and governance of cloud resources. To help customers reach their cloud governance objectives, AWS has introduced various solutions. KeyCore can provide the expertise necessary to help customers design and build a secure and agile landing zone architecture.

Read the full blog posts from AWS

AWS for Industries

Executive Conversations: Generative AI for responsible innovation in pharma commercialization

EVERSANA is a leading provider of commercial services to the life sciences industry, building generative artificial intelligence (AI) applications to improve patient outcomes and business value. Scott Snyder, Chief Digital Officer at EVERSANA, recently spoke with Ujjwal Ratan, Data Science and ML Leader for Healthcare at AWS, about the implementation of AI in the pharmaceutical industry.

Ujjwal explained that the goal of generative AI is to enable responsible innovation, and that the most important factor in achieving this is recognizing that the data is sensitive and needs to be safeguarded. This is an area where AWS excels; its suite of security and compliance services is comprehensive, robust, and easy to use. Additionally, by using AWS security features, customers can be confident that their data is secure and compliant with any regulatory requirements.

AWS and Children’s Brain Tumor Network: Powering multi-modal data sharing for pediatric brain cancer research

Children’s Brain Tumor Network (CBTN) is leading the fight against pediatric brain cancer with a vision to enable personalized treatments tailored to each patient’s individual needs. They are leveraging AWS to power a cloud-based platform that enables the sharing of diverse data sets in real time, from multiple modalities such as MRI, CT, and genomic data.

AWS was chosen because of its scalability and reliability, allowing for the secure storage and processing of large volumes of data. AWS also boasts an industry-leading suite of services that offer data encryption, data governance, and compliance with various regulations. This means that CBTN is able to improve the accuracy and speed of diagnosis while ensuring data security and regulatory compliance.

Using AWS generative AI to improve defect detection in Manufacturing

Product quality control and surface defect detection are critical to reducing product cost and ensuring customer satisfaction. While human inspection can’t scale, computer vision models need vast amounts of data to detect surface area defects. To meet this need, AWS provides a suite of services that offer scalability and reliability and can be adapted to fit any manufacturing environment. Customers can leverage AWS to develop and deploy models that are more accurate and efficient, while at the same time, ensuring data privacy and regulatory compliance.

Esko enables increased speed-to-market and quality for Life Sciences companies

Life Sciences companies need flexible solutions to streamline their processes and meet changing customer demands. Esko is helping to meet these needs through R&D and partnerships with key Life Sciences customers. By leveraging AWS, Esko has developed a suite of services that enable Life Sciences companies to accelerate product launches, reduce time-to-market, and ensure product quality. With AWS, customers can benefit from scalability and reliability, as well as robust security measures that protect their data.

Financial Services Spotlight: Featuring Amazon DocumentDB

In this edition of the Financial Services Industry (FSI) Services Spotlight monthly blog series, we highlight five key considerations for customers who use Amazon DocumentDB, including achieving compliance, data protection, isolation of compute environments, audits with APIs, and access control/security. We provide specific guidance, suggested reference architectures, and technical code that teams can adopt to facilitate service approval and get up and running quickly.
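
As a small illustration of the data-protection pillar, here is a hedged sketch of creating a DocumentDB cluster with encryption at rest enabled; the cluster identifier and KMS key alias are hypothetical placeholders:

```typescript
import { DocDBClient, CreateDBClusterCommand } from "@aws-sdk/client-docdb";

const client = new DocDBClient({ region: "eu-west-1" });

// Encryption at rest (StorageEncrypted plus a customer managed KMS key)
// and deletion protection are typical FSI data-protection controls.
await client.send(
  new CreateDBClusterCommand({
    DBClusterIdentifier: "fsi-docdb-cluster",        // hypothetical
    Engine: "docdb",
    MasterUsername: "clusteradmin",
    MasterUserPassword: process.env.DOCDB_PASSWORD!, // never hardcode secrets
    StorageEncrypted: true,
    KmsKeyId: "alias/fsi-docdb-key",                 // hypothetical key alias
    DeletionProtection: true,
  })
);
```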

AWS delivers unmatched scale to enable MACH solutions

Retailers are struggling to keep up with customer demands, but the MACH Alliance offers an alternative approach that gives them the agility to innovate. MACH stands for Microservices, API-first, Cloud-native SaaS, and Headless: an open, composable, and future-proof architecture that AWS enables at scale. Customers can benefit from AWS’s unparalleled scalability and performance, as well as its robust security features that protect customer data.

Financial Services Spotlight – Amazon Managed Service for Apache Flink

This edition of the Financial Services Industry (FSI) Services Spotlight monthly blog series focuses on Amazon Managed Service for Apache Flink. We highlight five key considerations for customers who process and analyze streaming data: achieving compliance, data protection, isolation of compute environments, audits with APIs, and access control/security. We also provide specific guidance, suggested reference architectures, and technical code that teams can adopt to facilitate service approval and get up and running quickly.
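
As a rough sketch of what getting started can look like, the snippet below creates a Managed Service for Apache Flink application from a JAR stored in Amazon S3; the application name, role ARN, bucket, and runtime version are all hypothetical placeholders:

```typescript
import {
  KinesisAnalyticsV2Client,
  CreateApplicationCommand,
} from "@aws-sdk/client-kinesis-analytics-v2";

const client = new KinesisAnalyticsV2Client({ region: "eu-west-1" });

await client.send(
  new CreateApplicationCommand({
    ApplicationName: "fsi-streaming-app",
    RuntimeEnvironment: "FLINK-1_15",
    ServiceExecutionRole: "arn:aws:iam::111122223333:role/flink-app-role",
    ApplicationConfiguration: {
      ApplicationCodeConfiguration: {
        CodeContentType: "ZIPFILE",
        CodeContent: {
          S3ContentLocation: {
            BucketARN: "arn:aws:s3:::fsi-flink-artifacts",
            FileKey: "app/flink-app-1.0.jar",
          },
        },
      },
    },
  })
);
```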

How Databricks on AWS helps optimize real-time bidding using machine learning

Real-time Bidding (RTB) faces challenges such as transparency and ad fraud risks. Databricks on AWS offers a solution with the Databricks Real-time Bidding Accelerator, which uses machine learning and predictive analytics to optimize RTB strategies. Customers can benefit from the scalability and reliability of AWS, as well as the suite of security and compliance services that protect customer data.

At KeyCore, we are well-versed in the use of AWS for industries. Our team of AWS certified experts can help you develop and deploy the right solution for your organization, while ensuring the highest degree of security and compliance. Contact us today to learn more.

Read the full blog posts from AWS

AWS Messaging & Targeting Blog

How To Implement Multi-Tenancy with Amazon Pinpoint and Get Value out of Your DMARC Policy

Businesses today are complex entities managing multiple product lines, customer segments, and even geographical locations. For Independent Software Vendors (ISVs) in the Business-to-Business (B2B) space, offering marketing automation solutions to their customers is key to success. To accomplish this, they need a customer engagement strategy that can easily adapt and scale to new requirements.

Using Amazon Pinpoint for Multi-Tenancy

Amazon Pinpoint is the perfect solution for enabling multi-tenancy within a single customer engagement strategy. It allows ISVs to manage multiple customer segments within a single Amazon Pinpoint project, each with its own set of users, applications, campaigns, and analytics. Furthermore, Amazon Pinpoint provides custom metrics and segmentation features, allowing ISVs to tailor their customer engagement strategies to individual customer segments.
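
One common pattern, sketched below under the assumption that each endpoint carries a tenantId user attribute, is to carve the shared project into one segment per tenant; the project ID and tenant value are placeholders:

```typescript
import { PinpointClient, CreateSegmentCommand } from "@aws-sdk/client-pinpoint";

const client = new PinpointClient({ region: "eu-west-1" });

// One segment per tenant inside a shared Pinpoint project. The tenantId
// user attribute is an assumption about how endpoints are tagged.
await client.send(
  new CreateSegmentCommand({
    ApplicationId: "<pinpoint-project-id>",
    WriteSegmentRequest: {
      Name: "tenant-acme",
      Dimensions: {
        UserAttributes: {
          tenantId: { AttributeType: "INCLUSIVE", Values: ["acme"] },
        },
      },
    },
  })
);
```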

Enabling Email Authentication with Amazon SES

To enhance the security and trustworthiness of their customer’s emails, ISVs can use Amazon Simple Email Service (Amazon SES). Amazon SES provides powerful email authentication capabilities, allowing ISVs to establish trust with their customers and their customers’ customers. Furthermore, Amazon SES allows ISVs to get maximum value out of their Domain-based Message Authentication, Reporting and Conformance (DMARC) policy.
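
A DMARC policy is just a DNS TXT record at _dmarc.<your-domain>. The hedged sketch below publishes a monitor-only policy through Amazon Route 53; the hosted zone ID, domain, and reporting mailbox are placeholders:

```typescript
import {
  Route53Client,
  ChangeResourceRecordSetsCommand,
} from "@aws-sdk/client-route-53";

const client = new Route53Client({});

// p=none means monitor only: receivers report DMARC results to the rua
// mailbox without rejecting mail, a sensible starting posture.
await client.send(
  new ChangeResourceRecordSetsCommand({
    HostedZoneId: "<hosted-zone-id>",
    ChangeBatch: {
      Changes: [{
        Action: "UPSERT",
        ResourceRecordSet: {
          Name: "_dmarc.example.com",
          Type: "TXT",
          TTL: 300,
          ResourceRecords: [{
            Value: '"v=DMARC1; p=none; rua=mailto:dmarc-reports@example.com"',
          }],
        },
      }],
    },
  })
);
```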

How KeyCore Can Help

At KeyCore, we provide professional and managed services for Amazon Pinpoint and Amazon SES. Our team of AWS experts can help ISVs create and manage their customer engagement strategies on Amazon Pinpoint, and provide support for email authentication and DMARC policies on Amazon SES. To learn more about our services, please visit our website.

Read the full blog posts from AWS

The latest AWS security, identity, and compliance launches, announcements, and how-to posts.

AWS-LC is Now FIPS 140-3 Certified

AWS Cryptography has announced that the National Institute of Standards and Technology (NIST) has awarded AWS-LC a validation certificate as a Federal Information Processing Standards (FIPS) 140-3, level 1, cryptographic module. This important milestone enables AWS customers that require FIPS-validated cryptography to adopt AWS-LC as a fully AWS-owned implementation. AWS-LC is an important part of AWS Cryptography’s suite of cryptographic solutions and is designed to help customers meet their cryptographic needs in a secure and cost-effective manner.

Use AWS Secrets Manager to Store and Manage Secrets in On-Premises or Multicloud Workloads

AWS Secrets Manager helps customers securely store and manage secrets, such as database credentials, API keys, and other secrets, throughout their lifecycles. Customers can use Secrets Manager to store and manage secrets in applications that are hosted in the cloud, as well as in on-premises or multicloud environments. Secrets Manager also allows customers to rotate secrets automatically, ensuring they are not left exposed.
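
In practice, an on-premises or multicloud workload authenticates with AWS credentials (for example, via IAM Roles Anywhere) and then reads secrets at runtime. A minimal sketch, with a hypothetical secret name and JSON shape:

```typescript
import {
  SecretsManagerClient,
  GetSecretValueCommand,
} from "@aws-sdk/client-secrets-manager";

const client = new SecretsManagerClient({ region: "eu-west-1" });

async function getDatabasePassword(): Promise<string> {
  const response = await client.send(
    // "prod/app/db-credentials" is a hypothetical secret name.
    new GetSecretValueCommand({ SecretId: "prod/app/db-credentials" })
  );
  // Secrets Manager returns the secret as a string; here we assume a JSON
  // payload with a "password" field.
  return JSON.parse(response.SecretString ?? "{}").password;
}
```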

Enable Security Hub Partner Integrations Across Your Organization

AWS Security Hub offers over 75 third-party partner product integrations that customers can use to send, receive, or update findings in Security Hub. It is recommended that customers enable their corresponding Security Hub third-party partner product integrations when using partner solutions. Centralizing security and compliance data in Security Hub from all sources helps customers better evaluate their security posture.
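
Enabling an integration boils down to one API call per partner product. A hedged sketch follows, with a placeholder product ARN; DescribeProducts lists the real ARNs available in your Region:

```typescript
import {
  SecurityHubClient,
  EnableImportFindingsForProductCommand,
} from "@aws-sdk/client-securityhub";

const client = new SecurityHubClient({ region: "eu-west-1" });

// The product ARN below is a placeholder; substitute the ARN returned by
// DescribeProducts for the partner product you use.
await client.send(
  new EnableImportFindingsForProductCommand({
    ProductArn: "arn:aws:securityhub:eu-west-1::product/<partner>/<product>",
  })
);
```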

Validate IAM Policies with Access Analyzer Using AWS Config Rules

Customers can use AWS Identity and Access Management (IAM) Access Analyzer IAM policy validation to validate IAM policies against IAM policy grammar and best practices. Access Analyzer policy validation findings provide actionable recommendations to help customers author policies that are functional and conform to AWS security best practices.
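
Below is a minimal sketch of running policy validation programmatically. The blog pairs this with an AWS Config rule, and a call like this is what such a rule’s evaluation logic might make; the sample policy is deliberately over-broad to trigger findings:

```typescript
import {
  AccessAnalyzerClient,
  ValidatePolicyCommand,
} from "@aws-sdk/client-accessanalyzer";

const client = new AccessAnalyzerClient({ region: "eu-west-1" });

const policyDocument = JSON.stringify({
  Version: "2012-10-17",
  Statement: [{ Effect: "Allow", Action: "s3:*", Resource: "*" }],
});

const { findings } = await client.send(
  new ValidatePolicyCommand({
    policyDocument,
    policyType: "IDENTITY_POLICY",
  })
);

// Each finding carries a findingType (ERROR, SECURITY_WARNING, WARNING,
// or SUGGESTION) plus details describing the issue and its location.
for (const finding of findings ?? []) {
  console.log(finding.findingType, finding.findingDetails);
}
```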

How to Use AWS Certificate Manager to Enforce Certificate Issuance Controls

AWS Certificate Manager (ACM) allows customers to provision, manage, and deploy public and private Transport Layer Security (TLS) certificates for use with AWS services and internal connected resources. Customers likely have many users, applications, or accounts that request and use TLS certificates as part of their public key infrastructure (PKI). The post shows how IAM policy condition keys for ACM let customers enforce controls over certificate issuance, such as restricting which domain names a requester may include in a certificate.
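
As a hedged illustration, assuming the acm:DomainNames condition key described in the post, a policy like the one below would limit requesters to subdomains of example.com:

```typescript
// A sketch of an identity-based policy that only allows certificate
// requests for subdomains of example.com. The acm:DomainNames condition
// key is multivalued, hence ForAllValues:StringLike.
const certificateIssuancePolicy = {
  Version: "2012-10-17",
  Statement: [
    {
      Effect: "Allow",
      Action: "acm:RequestCertificate",
      Resource: "*",
      Condition: {
        "ForAllValues:StringLike": {
          "acm:DomainNames": ["*.example.com"],
        },
      },
    },
  ],
};

console.log(JSON.stringify(certificateIssuancePolicy, null, 2));
```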

Secure by Design: AWS to Enhance MFA Requirements in 2024

AWS is further strengthening the default security posture of customers’ environments by requiring the use of multi-factor authentication (MFA) for the most privileged users in their accounts, beginning in 2024. MFA is one of the simplest and most effective methods for enhancing security and access control for AWS accounts. KeyCore can help customers prepare for and enable MFA in their organizations, as well as educate users on best practices for securely using MFA.
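
A common preparation step is a guardrail policy that denies everything except MFA self-enrollment for sessions signed in without MFA. A sketch using the standard aws:MultiFactorAuthPresent condition key:

```typescript
// Deny all actions except those needed to enroll an MFA device when the
// caller's session has no MFA. BoolIfExists also catches requests where
// the key is absent, such as long-term access key usage.
const requireMfaPolicy = {
  Version: "2012-10-17",
  Statement: [
    {
      Sid: "DenyAllExceptMfaSetupWhenNoMfa",
      Effect: "Deny",
      NotAction: [
        "iam:CreateVirtualMFADevice",
        "iam:EnableMFADevice",
        "iam:ListMFADevices",
        "sts:GetSessionToken",
      ],
      Resource: "*",
      Condition: {
        BoolIfExists: { "aws:MultiFactorAuthPresent": "false" },
      },
    },
  ],
};
```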

Read the full blog posts from AWS

AWS Startups Blog

AWS Startups Women’s Demo Week

AWS Startups is excited to announce the launch of a new global movement: AWS Startups Women’s Demo Week. Every year from November 6th to 10th, the event will bring together female entrepreneurs from all over the world to showcase their projects and share their innovative ideas.

Women in Tech

Women in tech have faced many barriers to success, but there is now a growing interest in women-led initiatives and a renewed focus on creating a more equitable and diverse tech landscape. AWS Startups Women’s Demo Week is a unique opportunity to celebrate female leadership and recognize the accomplishments of women in the field.

What to Expect

During the week, AWS Startups Women’s Demo Week will feature a variety of activities, from panel discussions to networking events and hackathons. Attendees will have the chance to meet and connect with other female innovators and entrepreneurs, and get inspired by their presentations and ideas.

How to Participate

Anyone interested in attending AWS Startups Women’s Demo Week can sign up online. The event is free of charge, and the organizers are offering travel grants to help cover the costs of attending. All attendees must complete an application and submit it by October 21, 2023, in order to be considered for the event.

KeyCore Services

At KeyCore, we understand the importance of diversity and are committed to helping our clients foster an inclusive environment. Our team of experienced AWS consultants can help you make the most of AWS Startups Women’s Demo Week. From setting up the infrastructure to managing the event, our experts will work with you to make sure you get the most out of the experience. Contact us today to learn more about how we can help.

Read the full blog posts from AWS

Front-End Web & Mobile

Modern Tooling for Your Website using AWS, Contentful & Next.js

In this tutorial, we’ll look at building a blog website from beginning to end that earns top Lighthouse performance scores with low maintenance friction. To do this, we’ll store our content in a Contentful model, access the content from a Next.js app, and deploy the Next.js app on AWS.
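
As a taste of the content layer, here is a minimal sketch of fetching entries from Contentful inside a Next.js app; the space ID, access token, and the "blogPost" content type are hypothetical:

```typescript
// lib/contentful.ts: a minimal sketch of reading blog entries from
// Contentful with its official JavaScript SDK.
import { createClient } from "contentful";

const client = createClient({
  space: process.env.CONTENTFUL_SPACE_ID!,
  accessToken: process.env.CONTENTFUL_ACCESS_TOKEN!,
});

export async function getBlogPosts() {
  // "blogPost" is a hypothetical content type defined in Contentful.
  const entries = await client.getEntries({ content_type: "blogPost" });
  return entries.items.map((item) => item.fields);
}
```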

Creating an API with Amazon Location Service

Geospatial applications, like interactive maps and location-based services, have become an essential part of our daily lives. As their demand increases, developers need powerful and secure tools to build reliable geospatial solutions. AWS is at the forefront of providing cutting-edge services, and the recent launch of Amazon Location Service (ALS) allows developers to easily create secure and reliable APIs.
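
As a small example, the call below geocodes a free-text query against a place index; the index name is a hypothetical resource you would create in Amazon Location Service first:

```typescript
import {
  LocationClient,
  SearchPlaceIndexForTextCommand,
} from "@aws-sdk/client-location";

const client = new LocationClient({ region: "eu-west-1" });

// "MyPlaceIndex" is a hypothetical place index; the call turns a free-text
// query into coordinates.
const { Results } = await client.send(
  new SearchPlaceIndexForTextCommand({
    IndexName: "MyPlaceIndex",
    Text: "Copenhagen, Denmark",
    MaxResults: 1,
  })
);

console.log(Results?.[0]?.Place?.Geometry?.Point); // [longitude, latitude]
```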

Deploying a GraphQL API with AWS Amplify

The official AWS Cloud Development Kit (CDK) construct for Amplify’s GraphQL API capabilities has recently been released. Using the construct, developers can create a real-time GraphQL API backed by data sources such as Amazon DynamoDB tables or AWS Lambda functions from a single GraphQL schema definition.
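
Here is a minimal sketch of the construct in use, based on its published interface; the Post model and public API-key auth rule are illustrative:

```typescript
import * as cdk from "aws-cdk-lib";
import {
  AmplifyGraphqlApi,
  AmplifyGraphqlDefinition,
} from "@aws-amplify/graphql-api-construct";

const app = new cdk.App();
const stack = new cdk.Stack(app, "BlogApiStack");

// From a single schema definition, the construct generates the AppSync
// API, its resolvers, and a DynamoDB table for the Post model.
new AmplifyGraphqlApi(stack, "BlogApi", {
  definition: AmplifyGraphqlDefinition.fromString(/* GraphQL */ `
    type Post @model @auth(rules: [{ allow: public }]) {
      id: ID!
      title: String!
      content: String
    }
  `),
  authorizationModes: {
    apiKeyConfig: { expires: cdk.Duration.days(30) },
  },
});
```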

Using Amplify, Contentful & Next.js with AWS

Using the tools mentioned above, developers can take advantage of AWS to create secure, reliable, and performant web applications. Contentful provides an easy-to-use approach for storing and managing content, Next.js is a powerful framework for creating React applications, and Amplify and AWS can be used together to create a GraphQL API and data stack. By combining these services, developers can build a website with top Lighthouse performance scores and low maintenance friction.

How KeyCore Can Help

The tools and services mentioned in this tutorial can be difficult to manage and configure, and the potential for costly mistakes is high. KeyCore can help you get the most out of AWS, and our professional and managed services can provide the expertise to help make sure your web applications are secure, reliable, and performant. Contact us today to learn more about how we can help you get the most out of your web applications.

Read the full blog posts from AWS

Innovating in the Public Sector

The public sector has a variety of complex and demanding needs in terms of security, compliance, and modernization, and AWS is the perfect platform for public sector innovators to build secure, cost-effective, reliable, and highly-performant applications.

Continued Innovation in CJIS Compliance

Justice and public safety agencies and their solution providers have been building highly available, resilient, and secure applications on AWS at a rapid pace. AWS’s innovative features and security controls can help customers comply with the latest Federal Bureau of Investigation (FBI) Criminal Justice Information Services (CJIS) Security Policy updates, and align with CJIS compliance not only in AWS GovCloud (US), but also in AWS (US) Commercial regions. Customers have the confidence to deploy CJIS workloads in either AWS (US) Region, while also having access to simple and powerful cloud native tools to manage the full lifecycle of sensitive data.

Re-Imagining the Future of Mobility on Islands

Islands have unique transportation needs, and are often highly dependent on external transport linkages. Technologies such as digital twins, artificial intelligence (AI), edge and cloud computing, and open data can help islands address these challenges. Furthermore, by leveraging AWS services, public sector customers can modernize their applications and build secure, cost-effective, reliable, and highly-performant solutions.

Building High-Throughput Satellite Data Downlink Architectures with AWS Ground Station WideBand DigIF and Amphinicy Blink SDR

Cloud-based ground segment architectures provide users with a range of benefits, including the ability to transport and deliver Wideband Digital Intermediate Frequency (DigIF) data with AWS Ground Station, and to build a proof of concept using Blink, a software-defined radio created by AWS Partner Amphinicy. This allows customers to downlink and process satellite data at scale without building and operating their own ground station infrastructure.

The Power of Collaboration Accelerates Public Sector Transformation in Iceland

Digital Iceland’s decision to work with partners to access knowledge and capabilities enabled them to digitize 20 services in just over three years. This is a testament to the power of collaboration, and shows how partnering with AWS can help the public sector transform their services.

Landing Zone Accelerator Connectivity with VMware Cloud on AWS

The Landing Zone Accelerator on AWS (LZA) solution deploys a cloud foundation with best practices and global compliance frameworks. Customers with highly-regulated workloads can use the LZA to better manage and govern their multi-account environment and integrate on-premises vSphere environments to move existing workloads to the cloud.

Migrate and Modernize Public Sector Applications Using Containers and Serverless

Containerization and serverless can help public sector customers migrate and modernize their applications. AWS services like AWS Lambda, Amazon Elastic Kubernetes Service (Amazon EKS), and Amazon Elastic Container Service (Amazon ECS) can be used to build modern applications for diverse use cases, including those driven by machine learning (ML) and generative artificial intelligence (AI).

Building Smart Infrastructure: Using AWS Services for Digital Twins

Digital twins are an important tool for building smart infrastructure, and AWS services can help to create and maintain them. This blog post provides a use case for digital twins and an open-source digital twin sample front-end application built with AWS Amplify, Amazon Cognito, and AWS IoT Core that customers can leverage as a starting point for building efficient, scalable, and secure digital twin solutions.
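
As a hedged sketch of the data side of such a twin, the snippet below reads a device’s reported state from its AWS IoT device shadow; the thing name is hypothetical, and a front end like the sample application could poll or subscribe to keep its 3D view in sync:

```typescript
import {
  IoTDataPlaneClient,
  GetThingShadowCommand,
} from "@aws-sdk/client-iot-data-plane";

const client = new IoTDataPlaneClient({ region: "eu-west-1" });

// "pump-01" is a hypothetical IoT thing; its shadow holds the last
// reported device state as JSON.
const { payload } = await client.send(
  new GetThingShadowCommand({ thingName: "pump-01" })
);

const shadow = JSON.parse(new TextDecoder().decode(payload));
console.log(shadow.state?.reported);
```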

At KeyCore, our AWS-certified consultants are experienced in leveraging AWS services to help public sector customers innovate and modernize their applications. Our cloud experts can help you develop secure, cost-effective, reliable, and highly-performant solutions built with the latest technologies, from containers and serverless to digital twins. We can assist you throughout the entire process, from strategy to implementation and beyond. Contact us today to learn more about how KeyCore can help you innovate in the public sector.

Read the full blog posts from AWS

The Internet of Things on AWS – Official Blog

Using NVIDIA Omniverse to Prepare 3D Assets for AWS IoT TwinMaker

The manufacturing and architecture, engineering, construction, and operations (AECO) industries have adopted building information modeling (BIM) software to generate accurate 3D models for use in digital twins. These 3D models can range from factory floors to construction sites or office buildings.

However, exporting 3D models from BIM software can be challenging, and the resulting models are often incompatible with the web-based 3D applications needed to deploy the digital twins on AWS. NVIDIA Omniverse is a new platform that has changed the way 3D assets can be prepared for AWS IoT TwinMaker.

What Is NVIDIA Omniverse?

NVIDIA Omniverse is a comprehensive 3D engineering platform that enables 3D asset preparation, real-time collaboration, and multi-user online development. It has powerful tools for importing 3D models from BIM software, and can quickly convert them into a format compatible with AWS IoT TwinMaker.

It also helps users create 3D assets with robust physics simulations through NVIDIA PhysX and GPU-accelerated rendering with NVIDIA RTX. This allows engineers to quickly and accurately test, optimize, and validate 3D assets before deploying them in AWS.

What Are the Benefits of NVIDIA Omniverse?

Using NVIDIA Omniverse with AWS IoT TwinMaker allows users to benefit from a higher degree of automation. This reduces the time it takes to deploy a digital twin, and facilitates a more efficient work process.

NVIDIA Omniverse also provides users with a rich library of existing assets and tools that can be used to quickly create 3D models. This helps users save time and money, while optimizing the quality of their digital twins.

Lastly, NVIDIA Omniverse simplifies collaboration by allowing multiple users to work on the same 3D model concurrently. This helps teams quickly build and deploy digital twins, and keep up with the needs of their customers.

How Can KeyCore Help?

At KeyCore, we understand the importance of efficient digital twin deployment. We provide both professional services and managed services related to NVIDIA Omniverse and AWS IoT TwinMaker. Our highly experienced engineers can help you quickly create and deploy 3D assets for your digital twins.

Our team can also help you optimize and validate your 3D models with physics simulations, and provide technical support and advice as you incorporate NVIDIA technology into your workflow.

With KeyCore, you can rest assured that your digital twins are created with the highest quality standards. Contact us today to learn more about how we can help you build and deploy better digital twins with NVIDIA Omniverse and AWS IoT TwinMaker.

Read the full blog posts from AWS

AWS Open Source Blog

How SIXT and AWS Collaborate Using Crossplane on EKS and Open Source Databases

SIXT and Crossplane on EKS

SIXT is a leading global mobility provider that serves customers in over 100 countries. To keep up with the needs of its customers, the company is committed to creating an exciting customer journey by delivering premium service and products quickly. To make that possible, they modernized their applications as microservices.

To ensure that their applications are running smoothly, they are using Crossplane, an open source multi-cloud control plane, on Amazon Elastic Kubernetes Service (Amazon EKS). Crossplane allows them to easily provision, configure, and manage cloud native services. At the same time, SIXT is able to unify and standardize their infrastructure and operations across multiple cloud providers, reducing operational complexity and cost.

Behind the Scenes on AWS Contributions to Open Source Databases

Amazon Web Services (AWS) engineers are significant contributors to the open source databases on which their customer-facing managed services are built. Aurora PostgreSQL and MySQL-compatible editions, as well as Amazon Relational Database Service (Amazon RDS) for PostgreSQL, MySQL, and MariaDB, are AWS services that are built on or compatible with open source databases.

AWS engineers have already contributed to these open source databases with performance improvements, bug fixes, and features. By continuing to build on open source databases, AWS is able to offer their customers the latest advancements and applied best practices.

Cloud Native Operational Excellence (CNOE)

AWS, SIXT, and other companies have come together in an effort to share the best tools and internal developer platform (IDP) practices. This joint project, Cloud Native Operational Excellence (CNOE), is aimed at improving the IDP development process. CNOE members will contribute tooling, plugins, and reference implementations that help teams build their internal developer platforms.

AWS contributions to open source databases and CNOE represent the company’s dedication to providing customers with the best tools and practices to develop and maintain their applications. With the help of AWS and SIXT, developers can now have a smoother and faster experience when building their applications and IDPs. At KeyCore, we are committed to helping companies stay up to date with the latest cloud-native technologies. Our team of experienced AWS engineers can help you get the most out of your cloud-native project. Contact us today to learn more.

Read the full blog posts from AWS
