Summary of AWS blogs for the week of Monday, August 7, 2023

In the week of Monday, August 7, 2023, AWS published 80 blog posts. Here is an overview of what happened.

Topics Covered

Desktop and Application Streaming

Migrating from Amazon WorkSpaces Services to Microsoft 365

Amazon WorkSpaces can be used to run office productivity applications. Currently, customers can purchase Microsoft Office as part of a WorkSpaces application bundle. Beginning August 1, 2023, however, customers can bring their own Microsoft 365 Apps for Enterprise licenses to use on Amazon WorkSpaces. Microsoft 365 adds to the power of WorkSpaces by bringing popular services such as Exchange, Skype, SharePoint, and Teams into the WorkSpaces environment.

What does Microsoft 365 Offer?

Microsoft 365 offers a wide variety of features to enhance the use of WorkSpaces services. These features include Exchange, Skype, SharePoint, and Teams. Exchange enables customers to access their emails, calendars, and contacts from any device. Skype keeps customers connected with video conferencing, phone and messaging. SharePoint provides secure file storage and collaboration capabilities. And Teams provides chat and collaboration capabilities for customers to work together.

Additional Benefits of Microsoft 365

In addition to the features provided by Microsoft 365, customers will benefit from improved manageability and security. Customers will be able to manage their environment more effectively with remote delivery and fast provisioning. Microsoft 365 also provides enhanced security features such as Multi-Factor Authentication, Data Loss Prevention, and Mobile Device Management.

How can KeyCore help?

At KeyCore, we provide both professional services and managed services. Our team of experts can help customers migrate their Amazon WorkSpaces from bundles that include Microsoft Office to Microsoft 365 Apps for Enterprise. We can help customers assess their current environment and identify the best migration approach, and we provide advice and guidance to ensure customers get the most out of their WorkSpaces services.

Read the full blog posts from AWS

AWS for SAP

Simplify SAP Backups with AWS Services

SAP customers running workloads on AWS should have a solid backup strategy to safeguard their database and application non-database file systems. Backups are critical for recovering systems in case of data loss. AWS customers have multiple third-party, database-native, and AWS-native options for backup strategies.

Third-Party Backup Options

Third-party options provide a wealth of features and customization with their offerings. These solutions often integrate with SAP HANA and can offer features such as APIs, data seeding, database-specific features such as flashback, and cloud storage integrations.

Database-Native Options

Database-native options are the simplest backup solution. The SAP HANA database engine provides the ability to back up and restore data natively. This is a good option for customers who want a simple, easy-to-implement solution.

AWS-Native Options

AWS-native options provide a high degree of automation and flexibility. AWS-native backups can be scripted with the AWS CLI and AWS CloudFormation for automation, and customers can take advantage of AWS services such as AWS Backup and Amazon S3 for storage.
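
As a minimal illustration of the AWS-native approach, the following sketch uses the AWS SDK for Python (boto3) to create a backup vault, a daily backup plan, and a tag-based resource selection. The vault name, plan name, schedule, tag, and IAM role ARN are placeholders; a real SAP landscape would use values that match its own RPO and RTO requirements.

```python
import boto3

backup = boto3.client("backup", region_name="eu-west-1")

# Create a vault to hold the recovery points (name is illustrative).
backup.create_backup_vault(BackupVaultName="sap-backup-vault")

# Define a daily backup plan with 35-day retention (illustrative values).
plan = backup.create_backup_plan(
    BackupPlan={
        "BackupPlanName": "sap-daily-backups",
        "Rules": [
            {
                "RuleName": "daily-0200-utc",
                "TargetBackupVaultName": "sap-backup-vault",
                "ScheduleExpression": "cron(0 2 * * ? *)",
                "StartWindowMinutes": 60,
                "CompletionWindowMinutes": 360,
                "Lifecycle": {"DeleteAfterDays": 35},
            }
        ],
    }
)

# Assign resources to the plan by tag; the IAM role ARN is a placeholder.
backup.create_backup_selection(
    BackupPlanId=plan["BackupPlanId"],
    BackupSelection={
        "SelectionName": "sap-file-systems",
        "IamRoleArn": "arn:aws:iam::123456789012:role/BackupServiceRole",
        "ListOfTags": [
            {"ConditionType": "STRINGEQUALS",
             "ConditionKey": "workload",
             "ConditionValue": "sap"}
        ],
    },
)
```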

KeyCore Can Help

At KeyCore, our team of AWS experts provides professional services and managed services specifically tailored to customers running SAP on AWS. Our team can help you identify the most suitable backup strategy for your situation, including setting up the necessary automation. To learn more about our offerings, please visit the KeyCore website.

Read the full blog posts from AWS

Official Machine Learning Blog of Amazon Web Services

Amazon Translate Enhances Its Custom Terminology

Amazon Translate is a neural machine translation service that enables quick, high-quality, and affordable language translations. To ensure accuracy, fluency, and context, it supports domain-specific and language-specific customizable terminology. Custom terminology helps organizations improve the accuracy and fluency of their translations, and Amazon Translate recently made enhancements to its custom terminology feature. Organizations can now use Amazon Translate to manage and customize up to 10 million terms. This customization enables organizations to use the terms they are familiar with and makes translations more accurate and natural.
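
As a hedged example of how custom terminology is supplied through the API, the following AWS SDK for Python (boto3) sketch imports a small CSV term list and then references it in a translation request; the terminology name and terms are invented for illustration.

```python
import boto3

translate = boto3.client("translate", region_name="us-east-1")

# A tiny CSV term list: the source language column first, then target languages.
csv_terms = "en,da\nKeyCore,KeyCore\ncloud consultancy,cloud-konsulenthus\n"

# Create or overwrite the custom terminology (name is illustrative).
translate.import_terminology(
    Name="example-terms",
    MergeStrategy="OVERWRITE",
    TerminologyData={"File": csv_terms.encode("utf-8"), "Format": "CSV"},
)

# Reference the terminology when translating.
result = translate.translate_text(
    Text="KeyCore is a cloud consultancy.",
    SourceLanguageCode="en",
    TargetLanguageCode="da",
    TerminologyNames=["example-terms"],
)
print(result["TranslatedText"])
```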

Zero-Shot Text Classification With Amazon SageMaker JumpStart

Natural Language Processing (NLP) is an area of machine learning (ML) that focuses on teaching computers to understand text and spoken words the way humans do. Recent state-of-the-art architectures, such as the transformer architecture, are achieving near-human performance on NLP downstream tasks such as text summarization, text classification, entity recognition, and more. Amazon SageMaker JumpStart offers a template for zero-shot text classification that enables developers to quickly build and deploy models. The “zero-shot” capability allows organizations to classify text into categories the model was never explicitly trained on, without the need for annotated training data.
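
The JumpStart template handles model selection and deployment on SageMaker. Independent of that, the zero-shot idea itself can be illustrated locally with the open-source transformers library, as in the hedged sketch below; the model name and candidate labels are illustrative, and this is not the JumpStart template itself.

```python
# A minimal local illustration of zero-shot classification: the model assigns
# a score to each candidate label without any task-specific training data.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

result = classifier(
    "The order arrived two weeks late and the box was damaged.",
    candidate_labels=["shipping problem", "billing question", "product feedback"],
)
print(result["labels"][0], result["scores"][0])  # highest-scoring label and score
```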

Build a Centralized Monitoring and Reporting Solution for Amazon SageMaker

This post focuses on building a cross-account observability dashboard for monitoring SageMaker user activities and resources across multiple accounts. It allows end-users and cloud management teams to efficiently monitor ML workloads, view their status, and trace back different account activities at certain points of time. This centralized dashboard is built on Amazon CloudWatch, and it also provides features such as automatic CloudWatch setup, dashboards, and alerts.

Generate Creative Advertising Using Generative AI Deployed on Amazon SageMaker

Generative AI is revolutionizing creative advertising. By retraining a generative AI model and providing a few inputs, such as textual prompts, organizations can generate a wide variety of novel images for product shots. Amazon SageMaker provides a template for deploying generative AI models that are pretrained on images. This template provides a fast and efficient way to generate creative advertising without the need for large datasets or manual annotation.

Host the Spark UI on Amazon SageMaker Studio

Amazon SageMaker offers several ways to run distributed data processing jobs with Apache Spark. SageMaker Studio notebooks and AWS Glue Interactive Sessions can be used to run interactive Spark jobs with a serverless cluster. This post discusses how to set up the Spark UI on Amazon SageMaker Studio using this serverless cluster. It also explores the various types of data processing scenarios that can be implemented using this setup.

Deploy Thousands of Model Ensembles with Amazon SageMaker Multi-Model Endpoints on GPU to Minimize Hosting Costs

AI adoption is growing rapidly across industries and use cases. Recent scientific breakthroughs in deep learning, large language models, and generative AI are enabling organizations to use state-of-the-art solutions with performance that is comparable to that of humans. Complex models often require hardware acceleration to enable faster training and inference. Amazon SageMaker Multi-Model endpoints help organizations deploy thousands of model ensembles with GPU acceleration to minimize hosting costs.

AWS Performs Fine-Tuning on a Large Language Model (LLM) to Classify Toxic Speech for a Large Gaming Company

The video gaming industry has an extensive user base of over 3 billion worldwide. Unfortunately, some players don’t communicate in an appropriate manner. In order to create a socially responsible gaming environment, AWS performs fine-tuning on a large language model (LLM) to classify toxic speech. By using a pre-trained model, AWS is able to quickly and accurately detect and classify toxic speech in the gaming environment.

At KeyCore, we specialize in helping organizations maximize their AWS investments. Our team of AWS experts can help you integrate and optimize the solutions discussed in this blog post, as well as provide guidance on AI and ML projects. Contact us for more information.

Read the full blog posts from AWS

Announcements, Updates, and Launches

Recent AWS Storage News, Updates, and Launches

File Release for Amazon FSx for Lustre

Amazon FSx for Lustre provides fully managed, scalable, high-performance shared storage built on the open-source Lustre file system. FSx for Lustre is designed for workloads that require storage speed and performance. With FSx for Lustre, customers can avoid storage bottlenecks and increase the utilization of compute resources.

Welcome to AWS Storage Day 2023

The fifth annual AWS Storage Day took place on 9 August 2023. It was first hosted in 2019 and has become an innovation day that customers look forward to every year. Last year’s Storage Day post discussed how AWS Storage is a key part of the customer’s cloud journey, and this year’s Storage Day highlighted how customers can use storage to solve some of their most common challenges.

Mountpoint for Amazon S3

Mountpoint for Amazon S3 is an open source file client that makes it easy for Linux applications to connect to Amazon Simple Storage Service (Amazon S3) buckets. It was announced earlier this year as an alpha release and is now generally available and ready for production use. It supports large-scale read-heavy applications such as data lakes, analytics, and media and entertainment workloads.

Improved Amazon S3 Glacier Flexible Restore Time

Amazon S3 Glacier recently celebrated its 10th anniversary and has seen many innovations over the years. The Amazon S3 Glacier storage classes provide customers with long-term, secure, and durable storage options at the lowest cost. Standard retrievals from the Amazon S3 Glacier Flexible Retrieval storage class are now up to 85% faster, which improves restore times.
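
As a quick illustration, a standard-tier restore from S3 Glacier Flexible Retrieval can be initiated with a RestoreObject call such as the boto3 sketch below; the bucket name, key, and retention period are placeholders.

```python
import boto3

s3 = boto3.client("s3")

# Initiate a standard-tier restore of an archived object (names are placeholders).
s3.restore_object(
    Bucket="example-archive-bucket",
    Key="backups/2023/archive.tar.gz",
    RestoreRequest={
        "Days": 7,  # how long the temporary restored copy stays available
        "GlacierJobParameters": {"Tier": "Standard"},
    },
)

# Poll the restore status via HeadObject.
head = s3.head_object(Bucket="example-archive-bucket", Key="backups/2023/archive.tar.gz")
print(head.get("Restore"))  # e.g. 'ongoing-request="true"' while in progress
```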

AWS Weekly Roundup – 8 August 2023

The AWS Weekly Roundup for 8 August 2023 focused on AWS Storage Day, the new AWS Israel (Tel Aviv) Region, and more. AWS Storage Day highlighted how customers can use storage to solve some of their most common challenges, and the launch of the AWS Israel (Tel Aviv) Region expands AWS’ global infrastructure with its first Region in Israel.

Deliver Interactive Real-Time Live Streams with Amazon IVS

Live streaming is becoming an increasingly popular way to engage customers with brands and their favorite influencers. Amazon Interactive Video Service (Amazon IVS) is a fully managed live streaming solution and was introduced in March. It helps customers build interactive live stream and video experiences for their audiences. With Amazon IVS, customers can deliver interactive real-time live streams with low latency and high fidelity.

At KeyCore, we understand the importance of cloud storage in developing applications and services for our customers. We work with customers to identify their data storage needs and then assist in the development of a solution that meets their needs. Our team of AWS-certified consultants have the expertise to guide customers through the entire process, from identifying their requirements to launching their applications. Contact us today to learn more about how our team can help you.

Read the full blog posts from AWS

Containers

Container Infrastructure Provisioning with AWS Control Tower and AWS Proton

Most enterprise customers strive for centralized control and organization-wide policies when it comes to the distribution of cloud resources among multiple teams. These teams tend to fall into the three categories of IT operations, Enterprise Security, and App-development. The delivery of business value from the application standpoint lies in the hands of the App-development team, but this must be done in accordance with the policies established by the other two teams.

Using SBOM to Find Vulnerable Container Images Running on Amazon EKS Clusters

Similar to how you check the list of ingredients on a packaged food item from your local grocery store to understand what is inside, the same idea can be applied when you’re running container images on Amazon EKS clusters. By using a software bill of materials (SBOM), you can understand the components used to build a container image, which can help you more accurately assess the security posture of your running images.

Announcing Additional Linux Controls for Amazon ECS Tasks on AWS Fargate

An Amazon Elastic Container Service (Amazon ECS) task is a small collection of containers that are scheduled on either AWS Fargate or an Amazon EC2 container instance. Container workloads are isolated from each other through the use of Linux namespaces, even when the containers are scheduled together in the same Amazon ECS task. It is now possible to apply additional Linux controls to your Amazon ECS tasks running on Fargate, giving you more control and visibility over your running containers.

KeyCore Managed Services

At KeyCore, we provide professional and managed services for our customers when it comes to managing and building their container infrastructure. Our team of experienced AWS Consultants can help you build an effective and secure container infrastructure, taking care of the configuration and provisioning of your AWS Control Tower and AWS Proton. We can also ensure that your Amazon ECS tasks running on Fargate are using the latest security measures to keep your containers safe and secure.

Read the full blog posts from AWS

AWS Quantum Technologies Blog

Running Quantum Chemistry Calculations on AWS ParallelCluster

Quantum computing researchers who want to compare their quantum or hybrid algorithms against classical calculations can now benefit from combining quantum computing and High Performance Computing (HPC) on AWS. Amazon Web Services (AWS) provides AWS ParallelCluster (APC), an open-source HPC cluster management tool that enables researchers to quickly and cost-effectively run quantum chemistry calculations in the cloud.

Overview of APC

AWS ParallelCluster is an open source cluster orchestration tool that makes it easy to deploy and manage HPC clusters in the AWS cloud. After a user has configured their desired cluster, APC handles all the work of setting it up, including launching and configuring the compute nodes and installing the appropriate software stack on each one. APC also provides features for monitoring the health of the cluster and scaling it up or down as needed.

Running Quantum Chemistry Experiments

Using APC to run quantum chemistry experiments is fairly straightforward. First, you need to set up a cluster with the proper software stack. This includes installing the quantum chemistry application, such as Quantum Espresso, as well as any additional libraries or packages that are required. After the cluster is set up, the user can submit jobs that will be run on the cluster. The APC software will take care of distributing the jobs across the compute nodes and monitoring the progress of the job.

Benefits of APC

There are a number of advantages to running quantum chemistry experiments on an APC cluster. The first is cost savings: because you only pay for the compute time you use on AWS, APC can be cheaper than running the same experiment on a traditional, always-on HPC cluster. Additionally, APC clusters can be set up and torn down quickly, which makes it much easier to experiment with different configurations and parameters without long waits. Finally, APC can scale the cluster up or down dynamically, so users can quickly add or remove compute nodes as needed.

How KeyCore Can Help

At KeyCore, we specialize in helping organizations leverage the power of quantum computing and HPC on AWS. Our experts can help you set up an APC cluster and configure it to run your quantum chemistry experiments. We can also help you with the development of quantum algorithms and provide ongoing support and maintenance of your cluster. Contact us today to learn more about how KeyCore can help you harness the power of quantum computing and HPC on AWS.

Read the full blog posts from AWS

AWS Smart Business Blog

Simplifying Digital Transformation for Small and Medium Businesses with AWS

Tara Palacios Empowers Arlington Businesses

When the COVID-19 pandemic hit, Tara Palacios of Arlington, Virginia (USA) knew she had to act fast. This bustling city of over 200,000 residents was greatly impacted by the pandemic, with small businesses suffering the consequences. Restaurants, shops, and other beloved pillars of the community needed a way to get online and stay afloat.

Tara saw the potential of the cloud to help small and medium businesses (SMBs) during this time and embarked on an ambitious mission to build an enterprise-level cloud infrastructure. Her goal was to enable businesses to quickly and cost-effectively get online and take advantage of the latest technologies and services.

Cloud-based Solutions Help SMBs Optimize Performance

Data is the currency of the future and SMBs need to collect data sets to create visualizations and analytics that can help them optimize performance and anticipate and react to change.

However, SMBs can find it hard to build the necessary data framework as they don’t have the time or resources to do so. That’s why Tara worked with AWS to create a cloud-based, secure, and reliable solution that would give SMBs the ability to quickly and easily get online.

Using AWS to Transform Arlington Businesses

Tara and AWS collaborated to help local businesses by providing a cloud-based platform that would enable them to quickly get online and access the latest technologies. With the help of AWS CloudFormation and AWS CloudFormation Designer, Tara was able to create a secure and reliable data framework that gave SMBs the ability to access the resources they needed to get online.

This platform allowed businesses to efficiently manage their operations, benefit from cost-effective scalability, and access services such as AWS Lambda, AWS Step Functions, and Amazon Elasticsearch Service. These tools enabled businesses to quickly and easily create digital storefronts, automate processes, and access powerful analytics tools to gain insights.

The Results

The innovative cloud infrastructure Tara created with the help of AWS resulted in a dramatic increase in the number of SMBs in Arlington that are now online. Additionally, the platform has enabled businesses to increase efficiency, optimize performance, and remain competitive during this difficult time.

How KeyCore Can Help

At KeyCore, we are experts in AWS technology and can help your business to make the transition to the cloud. Our professional services and managed services can help you to quickly set up your own enterprise-level cloud infrastructure and take advantage of the benefits the cloud can provide. Contact us today to find out how we can help your business get online and stay competitive.

Read the full blog posts from AWS

Official Database Blog of Amazon Web Services

Guide to Digital Asset Tokenization for Financial Services with Amazon Managed Blockchain

Part 1: Build a Digital Asset Tokenization Framework
Digital asset tokenization for financial services is gaining traction. This post series is a guide for financial services customers looking to learn more about the topic, and who may be considering building a digital asset capability for their own use cases.

Amazon Managed Blockchain is a fully managed service that makes it easy to create and manage secure blockchain networks in the cloud. It simplifies the setup and maintenance of blockchain networks by providing an easy-to-use web console, preconfigured blockchain frameworks, and infrastructure that is managed by the service. The service is ideal for use cases such as digital asset tokenization, where multiple parties need to securely transact assets without relying on a central authority.

Using Amazon Managed Blockchain, you can quickly and easily create a secure blockchain network that spans multiple AWS accounts and keep it up to date. You can also configure the blockchain network to use the resources of multiple AWS accounts, and you can manage permissions and access control. By leveraging Amazon Managed Blockchain, you can quickly build a digital asset tokenization framework to support financial services use cases.

Part 2: How to Manage Case-Insensitive Data in PostgreSQL
When performing queries or comparisons on text data in PostgreSQL, it is important to be aware of case sensitivity. PostgreSQL is case sensitive by default when sorting or comparing string values, meaning “amazon” and “Amazon” would not return the same results. One way to address this is to apply the LOWER() function in your queries so that both sides of a comparison are normalized to lowercase.
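
A small sketch of this pattern, written in Python with psycopg2 against a hypothetical customers table, is shown below; the connection details, table, and column names are illustrative.

```python
import psycopg2

# Connection parameters are placeholders.
conn = psycopg2.connect(host="localhost", dbname="appdb", user="app", password="secret")

with conn, conn.cursor() as cur:
    # Normalize both sides of the comparison with LOWER() so that
    # 'amazon', 'Amazon', and 'AMAZON' all match.
    cur.execute(
        "SELECT id, company_name FROM customers "
        "WHERE LOWER(company_name) = LOWER(%s)",
        ("Amazon",),
    )
    for row in cur.fetchall():
        print(row)

# For large tables, an expression index keeps this query fast:
#   CREATE INDEX idx_customers_name_lower ON customers (LOWER(company_name));
```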

Part 3: Configure Amazon RDS Custom for Oracle using AWS CloudFormation and AWS Systems Manager for JD Edwards One-Click
Amazon Relational Database Service (Amazon RDS) Custom allows you to automate database administration tasks and operations. RDS Custom grants you access to customize your database environment and operating system, allowing you to meet the requirements of applications like legacy, custom, and packaged solutions.

One way to configure your database to meet these requirements is to use AWS CloudFormation and AWS Systems Manager for JD Edwards One-Click. With this method, you can define the details of your deployment in a CloudFormation stack, including the specific settings for your database. Once the stack is deployed, you can use Amazon RDS Custom to provision and configure the database.

Part 4: Retrieve Bitcoin and Ethereum Public Blockchain Data with Amazon Managed Blockchain Query
Public blockchain adoption has been driven by three primary use cases: decentralized finance (DeFi), non-fungible tokens (NFTs), and digital currency payments. Amazon Managed Blockchain Query makes it possible to retrieve data from public blockchains such as Bitcoin and Ethereum through straightforward APIs, without running your own blockchain nodes. By leveraging Managed Blockchain Query, you can query and analyze public blockchain data to gain insights into the state of the blockchain network.

Part 5: Migrate Microsoft SQL Server SSIS Packages to Amazon RDS Custom for SQL Server
Microsoft SQL Server Integration Service (SSIS) is used to create, extract, transform, and load workflows by connecting to various data sources. SSIS allows you to configure the data for loading into the destination system, by copying, cleaning, and processing the data.

For customers using Amazon RDS Custom for SQL Server, it is possible to migrate these SSIS packages to Amazon RDS Custom. You can use AWS Database Migration Service (AWS DMS) to migrate your data and SSIS packages, along with the DMS task settings, to Amazon RDS Custom. Once the migration is complete, you can execute the SSIS packages on the Amazon RDS Custom engine.

Part 6: Amazon Aurora PostgreSQL: Cross-Account Synchronization using Logical Replication
You can use Amazon Aurora PostgreSQL-Compatible Edition to set up cross-account logical replication, using Aurora’s cross-account clone and PostgreSQL logical replication. This allows you to achieve near real-time synchronization between a source and a target database in different AWS accounts. You can customize the solution to meet your specific requirements, including selective replication and parallel replication.

Part 7: Remove Temporal Tables and History Objects while Migrating to Amazon DynamoDB using Amazon DynamoDB Streams
Customers may use custom database features like Microsoft SQL Server temporal tables and Oracle Flashback to store historical data or to record a change trail of contents. These temporal tables can be removed while migrating to Amazon DynamoDB using Amazon DynamoDB Streams.

Amazon DynamoDB Streams allows you to capture and process changes to your DynamoDB table items in near real-time. You can use the service to detect changes in your DynamoDB tables and trigger an AWS Lambda function to process those changes. This allows you to react in near real-time to the changes, including removing temporal tables and history objects.
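
As a minimal sketch of that pattern, the Lambda handler below (Python runtime) processes DynamoDB Streams records; the attribute handling and logging are illustrative, and the function would be attached to the table’s stream through an event source mapping.

```python
# Minimal AWS Lambda handler reacting to DynamoDB Streams events.
import json


def handler(event, context):
    for record in event.get("Records", []):
        event_name = record["eventName"]           # INSERT, MODIFY, or REMOVE
        keys = record["dynamodb"].get("Keys", {})

        if event_name == "MODIFY":
            old_image = record["dynamodb"].get("OldImage", {})
            new_image = record["dynamodb"].get("NewImage", {})
            # React to the change in near real time, for example by writing an
            # audit entry instead of keeping a temporal/history table.
            print(json.dumps({"keys": keys, "old": old_image, "new": new_image}))
        elif event_name == "REMOVE":
            print(f"Item removed: {json.dumps(keys)}")

    # Only needed when batch item failure reporting is enabled on the mapping.
    return {"batchItemFailures": []}
```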

How KeyCore Can Help
At KeyCore, our AWS Certified Solutions Architects can help you build a digital asset tokenization framework for your financial services use case using Amazon Managed Blockchain. We can also help you migrate your Microsoft SQL Server SSIS packages to Amazon RDS Custom, and remove temporal tables and history objects while migrating to Amazon DynamoDB using Amazon DynamoDB Streams. Contact us today to learn more.

Read the full blog posts from AWS

AWS Cloud Financial Management

Measure and track cloud efficiency with sustainability proxy metrics

Sustainability has become an important decision-making factor for customers, employees, regulators, investors, and partners. With companies looking to reduce environmental impacts, optimizing IT infrastructure is a key step. To measure and track cloud efficiency, sustainability proxy metrics can help.

Specifically, cost optimization proxy metrics measure the efficiency of IT infrastructure as well as the effectiveness of cost optimization efforts. Cost, utilization, and availability are among the metrics tracked, as these can reveal potential savings. Furthermore, cost optimization proxy metrics can also provide insight into potential IT performance gains.

Tracking over time also provides a valuable measurement of progress. As changes in the IT environment are implemented, tracking cost optimization proxy metrics helps to assess how effective they are. With reports and insights, decisions can be made with better visibility into potential savings.

Cost optimization flywheel

Cost optimization is an ongoing process, and the cost optimization flywheel approach helps to ensure that savings continue to be leveraged. This process consists of four key steps. The first is cost transparency. Having visibility into where money is going and what is being spent on allows for a data-driven approach to decision-making.

The second step is cost optimization. Once the visibility is there, optimized cloud usage can be developed and implemented. This includes strategies like right-sizing resources, optimizing reserved instances, and using Spot Instances.

The third step is leveraging savings. By reinvesting savings into cloud-based innovation, additional efficiencies and competitive advantages can be gained. Examples of this include building infrastructure as code and automation, applying machine learning to optimize spend, and leveraging cost-effective services.

Finally, the fourth step is continuous improvement. Measuring cost optimization proxy metrics on an ongoing basis helps ensure that the process is effective and that improvement continues.

KeyCore Can Help

KeyCore provides professional and managed services to help organizations optimize their cloud spending and ensure that costs are transparent and tracked. Our services include cost transparency and optimization, cloud migration, and cost-effective development.

With our understanding of the cost optimization flywheel, we are uniquely positioned to guide organizations on their cloud journey. Our team of AWS experts can help to identify cost-saving opportunities, build and maintain cost transparency, and continually optimize cloud usage.

Together, we can ensure that your organization is achieving maximum savings and leveraging those savings to their fullest potential. Contact KeyCore to learn more about our services and cloud cost optimization today.

Read the full blog posts from AWS

AWS Training and Certification Blog

Introducing AWS Industry Quest: Healthcare

AWS recently launched AWS Industry Quest: Healthcare, a cloud skills training offering specifically for the healthcare industry. This comprehensive program provides healthcare professionals with an interactive learning experience, and covers 25 real-world AWS Cloud skills.

Training for the Healthcare Industry

Healthcare professionals are facing an ever-evolving landscape, and this program is designed to help them upskill and stay ahead of the curve. AWS Industry Quest: Healthcare provides healthcare employees with the tools and resources to build cloud-based solutions that can help their organization stay agile and responsive to the changing landscape.

The program is tailored to the unique needs of the healthcare sector, and covers topics such as medical informatics, predictive analytics, healthcare machine learning, and more. Additionally, AWS Industry Quest: Healthcare is designed to be engaging for all skill levels, from those just beginning to explore the AWS Cloud to experienced users who are looking to further their understanding of the cloud.

How KeyCore Can Help

At KeyCore, we are committed to helping healthcare organizations make the most of their cloud technology investments. As the leading Danish AWS consultancy, we provide both professional services and managed services for organizations of all sizes. Our team of experienced AWS experts can help you get the most out of this training and ensure you are leveraging the most up-to-date cloud technology and best practices.

To learn more about KeyCore and our offerings, please visit our website www.keycore.dk. We look forward to helping you leverage the power of the AWS Cloud.

Read the full blog posts from AWS

Official Big Data Blog of Amazon Web Services

Reduce Data Processing Costs and Monitor Data Pipelines with AWS

Customers are always looking for ways to optimize cost when implementing data processing workloads in the AWS Cloud. With technologies like Amazon EMR or serverless technologies like AWS Glue, customers can minimize the undifferentiated heavy lifting that goes into data processing. Ontraport was able to reduce their data processing costs by 80% using AWS Glue.

Introducing Apache Airflow Version 2.6.3 Support on Amazon MWAA

Organizations like Siemens, ENGIE, and Choice Hotels International are using Amazon Managed Workflows for Apache Airflow (Amazon MWAA) to enhance and scale their business workflows. Amazon MWAA is a managed orchestration service for Apache Airflow that makes it simple to set up and operate end-to-end data pipelines in the cloud. With version 2.6.3, users can take advantage of features like the Kubernetes Executor, configurable web UI authentication, and a richer command line experience.
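
For reference, workflows on Amazon MWAA are defined as ordinary Apache Airflow DAGs uploaded to the environment’s DAG folder in Amazon S3. The minimal, illustrative DAG below uses made-up DAG and task names.

```python
# A minimal Apache Airflow DAG; names and schedule are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("extracting data...")


with DAG(
    dag_id="example_pipeline",
    start_date=datetime(2023, 8, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract", python_callable=extract)
```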

Perform Amazon Kinesis Load Testing with Locust

Streaming applications operating at scale often handle large volumes of data, up to gigabytes per second, and it can be difficult to generate the high traffic needed to load test Amazon Kinesis-based applications. Locust is an open-source load testing tool that can simulate this traffic and measure the performance of Kinesis-based applications. It allows users to build performance tests quickly, capture a range of performance metrics, and create detailed reports.
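
A hedged sketch of what such a test might look like is shown below: a Locust user class that publishes records to a Kinesis data stream with boto3 and reports each call back to Locust’s statistics. The stream name, Region, payload, and pacing are placeholders, not values from the original post.

```python
# locustfile.py - run with: locust -f locustfile.py
import json
import time
import uuid

import boto3
from locust import User, task, between


class KinesisUser(User):
    wait_time = between(0.1, 0.5)

    def on_start(self):
        self.kinesis = boto3.client("kinesis", region_name="eu-west-1")

    @task
    def put_record(self):
        payload = json.dumps({"id": str(uuid.uuid4()), "ts": time.time()})
        meta = {
            "request_type": "kinesis",
            "name": "put_record",
            "start_time": time.time(),
            "response_length": len(payload),
            "response": None,
            "context": {},
            "exception": None,
        }
        start = time.perf_counter()
        try:
            self.kinesis.put_record(
                StreamName="example-stream",
                Data=payload.encode("utf-8"),
                PartitionKey=str(uuid.uuid4()),
            )
        except Exception as exc:  # record failures so they appear in the report
            meta["exception"] = exc
        meta["response_time"] = (time.perf_counter() - start) * 1000
        # Report the request to Locust so it shows up in the statistics.
        self.environment.events.request.fire(**meta)
```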

Monitor Data Pipelines in a Serverless Data Lake

In order to build a serverless data lake, AWS serverless services can be used, including but not limited to AWS Lambda, AWS Glue, AWS Fargate, Amazon EventBridge, Amazon Athena, Amazon Simple Notification Service (Amazon SNS), Amazon Simple Queue Service (Amazon SQS), and Amazon Simple Storage Service (Amazon S3). These services provide mechanisms to ingest and transform data for data lakes, but also require users to monitor data pipelines to ensure data quality and data governance.

Configure SAML Federation for Amazon OpenSearch Serverless with Okta

Amazon OpenSearch Serverless is a fully managed option that allows customers to quickly and easily set up, manage, and scale a search experience for a website or application. Modern applications need to apply security controls across many systems and their subsystems, which can be quite a challenge. Setting up SAML federation for Amazon OpenSearch Serverless with Okta can help solve this problem. Okta provides centralized identity management, acting as a single identity provider (IdP) that authenticates users and manages and distributes their rights.
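
As an illustration of the server-side piece, the boto3 sketch below creates a SAML security configuration for OpenSearch Serverless from Okta’s exported IdP metadata; the configuration name, metadata file path, and group attribute are assumptions. The resulting SAML identity can then be referenced as a principal in the collection’s data access policies.

```python
import boto3

aoss = boto3.client("opensearchserverless", region_name="eu-west-1")

# The IdP metadata XML is exported from the Okta application (placeholder path).
with open("okta-idp-metadata.xml") as f:
    idp_metadata = f.read()

aoss.create_security_config(
    name="okta-saml",                  # illustrative configuration name
    type="saml",
    description="SAML federation with Okta",
    samlOptions={
        "metadata": idp_metadata,
        "groupAttribute": "groups",    # attribute Okta sends for group membership
        "sessionTimeout": 60,          # session timeout in minutes
    },
)
```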

Perform Time Series Forecasting with Amazon Redshift ML and Amazon Forecast

Amazon Redshift is used to process large volumes of data every day to power analytics workloads. With Amazon Redshift ML, users can apply machine learning models to data stored on Amazon Redshift. This makes it possible to use time series forecasting with Amazon Redshift and Amazon Forecast. Time series forecasting helps businesses analyze past patterns and forecast future trends, making it easier to make more informed decisions based on accurate predictions.

At KeyCore, we provide a range of professional services and managed services to help organizations reduce data processing costs and monitor data pipelines in a serverless data lake. We can help you understand the options available to you and implement the most effective solution for your business. We also have the expertise to help you configure SAML federation for Amazon OpenSearch Serverless with Okta and use time series forecasting with Amazon Redshift ML and Amazon Forecast. Contact us today to learn more.

Read the full blog posts from AWS

Networking & Content Delivery

The Benefits of Using Dual-Stack Accelerators with IPv6 EC2 Endpoints

AWS Global Accelerator now offers dual-stack accelerators that let users route both IPv4 and IPv6 traffic to Amazon Elastic Compute Cloud (Amazon EC2) instances as endpoints, in addition to Application Load Balancers. This offers several key benefits.

Improved Reliability and Availability

By routing traffic to both IPv4 and IPv6 endpoints, dual-stack accelerators help improve reliability and availability. If one of the protocols fails, the other can take over and help ensure that your customers get the best possible experience when accessing your applications and services.

Better Performance

Using dual-stack accelerators also offers better performance. By including both IPv4 and IPv6 endpoints in the acceleration path, the latency of an application can be reduced significantly. This can help improve the user experience and increase customer satisfaction.

Simplified Management

Using dual-stack accelerators also simplifies management. Instead of having to manage both IPv4 and IPv6 endpoints separately, the dual-stack accelerator can handle both automatically. This can help reduce the complexity of managing the endpoints and help keep applications running smoothly.

Step-by-Step Guide for Adding IPv6 EC2 Endpoints to AWS Global Accelerator

AWS Global Accelerator makes it easy to add IPv6 EC2 endpoints to your accelerator: create a dual-stack accelerator, add a listener, register your EC2 instances as endpoints, and configure the routing. Here is a step-by-step guide:

  1. Create a new dual-stack accelerator, or update an existing accelerator to dual stack.
  2. Add a listener to the accelerator, specifying the protocol and ports that you want to use.
  3. Create an endpoint group in the Region where your instance runs and add the EC2 instance as an endpoint.
  4. Configure the routing for your accelerator by specifying the traffic distribution across your endpoints.

Once you have configured your accelerator, traffic will be routed to the EC2 instance over both IPv4 and IPv6 in the same way as for an existing IPv4 endpoint.
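
Here is a minimal sketch of those steps with the AWS SDK for Python (boto3); the accelerator name, ports, Region, and instance ID are placeholders.

```python
import boto3

# The Global Accelerator API is served from the us-west-2 endpoint.
ga = boto3.client("globalaccelerator", region_name="us-west-2")

# 1. Create a dual-stack accelerator (name is illustrative).
acc = ga.create_accelerator(
    Name="example-dual-stack",
    IpAddressType="DUAL_STACK",
    Enabled=True,
)["Accelerator"]

# 2. Add a listener for the ports the application uses.
listener = ga.create_listener(
    AcceleratorArn=acc["AcceleratorArn"],
    Protocol="TCP",
    PortRanges=[{"FromPort": 443, "ToPort": 443}],
)["Listener"]

# 3. Create an endpoint group in the Region where the EC2 instance runs,
#    registering the instance by its ID (placeholder).
ga.create_endpoint_group(
    ListenerArn=listener["ListenerArn"],
    EndpointGroupRegion="eu-west-1",
    EndpointConfigurations=[{"EndpointId": "i-0123456789abcdef0", "Weight": 128}],
    TrafficDialPercentage=100.0,
)
```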

KeyCore Can Help with Adding IPv6 EC2 Endpoints

Adding IPv6 EC2 endpoints to your accelerator can be a complicated process. KeyCore can help with this process by providing experienced AWS professionals who can help you configure your accelerator and ensure that it is set up correctly. We can also provide advice on how to optimize your accelerator for performance and reliability. Contact us today to learn more.

Read the full blog posts from AWS

AWS Compute Blog

Building High-Performance Windows Workstations on AWS for Graphics Intensive Applications

Applications like video editing, professional visualization, and video games are resource demanding and often require high-performance Windows workstations with GPUs. A high-performance remote display protocol is desirable so that the instance’s graphical desktop can be accessed from the remote client with a smooth experience.

Setting Up the Workstation

To set up a workstation on Amazon Web Services (AWS), users need an instance type with GPUs. AWS provides a wide range of instance types with different GPUs that can be used to build a high-performance Windows workstation. This includes Amazon EC2 G4 Instances, which are powered by NVIDIA T4 GPUs.

In addition to an instance type with GPUs, users need to configure the instance to be used as a workstation. This includes setting up the operating system, installing drivers, and configuring the remote display protocol. The remote display protocol will be used to access the instance’s graphical desktop from the remote client.

Configuring the Remote Display Protocol

A range of remote display protocols can be used to access the instance’s graphical desktop, including VNC, RDP, and X2Go. These protocols can be used to provide a smooth experience to the user.

To set this up, users need to install the appropriate drivers for the GPU and configure the chosen remote display protocol. This includes setting up port forwarding on the firewall and allowing access from the remote client.

Using Lambda Response Streaming to Optimize Performance

AWS Lambda Web Adapter can be used to package web applications that support Lambda response streaming. This can be used to improve the time to first byte (TTFB) for web pages and enhance the user experience and performance metrics of the web application.

To use Lambda response streaming, the AWS Lambda Web Adapter must be packaged with the function, for example as a Lambda layer or inside the container image. Once in place, the web application can be configured to stream its responses, which can improve the performance of the web application.
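
Separately from the adapter itself, streaming must also be enabled on the function’s invocation path. The hedged boto3 sketch below creates a Lambda function URL with streaming enabled for a hypothetical function name.

```python
import boto3

lambda_client = boto3.client("lambda", region_name="eu-west-1")

# Create a function URL that streams the response as it is produced,
# instead of buffering it; the function name is a placeholder.
resp = lambda_client.create_function_url_config(
    FunctionName="example-web-app",
    AuthType="AWS_IAM",
    InvokeMode="RESPONSE_STREAM",
)
print(resp["FunctionUrl"])
```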

How KeyCore Can Help

At KeyCore, we are experts in AWS and can help you build and configure high-performance Windows workstations on AWS for graphics intensive applications. We provide professional services that can help you set up the workstation, configure the remote display protocol, and use Lambda response streaming to optimize performance. Contact us today to find out how we can help you get the most out of AWS.

Read the full blog posts from AWS

AWS for M&E Blog

AWS Solutions Library Now Features Twelve New Partner Solutions for M&E

Amazon Web Services (AWS) has added twelve new Partner Solutions tailored for media and entertainment (M&E) applications to the AWS Solutions Library. The library offers purpose-built technologies, ready-to-deploy software packages, and customizable architectures offered by AWS and AWS Partners.

Multi-Region Rendering with Deadline and Hammerspace

Visual Effects (VFX) studios have gone global in the past few years, with satellite studios, remote artists, and co-location with production teams. Single-location render farms are no longer the norm, requiring new solutions to ensure efficient rendering across multiple regions. Deadline, a cross-platform render manager from AWS Thinkbox, provides a comprehensive solution for rendering M&E workloads across multiple regions.

Deadline allows VFX studios to submit jobs to render farms that span multiple AWS Regions, supporting high-performance on-demand rendering, autoscaling, and workload management. Hammerspace, a file controller system, provides high-performance file storage and access for distributed rendering workflows.

Scheduling Unreal Engine Pipelines with AWS Thinkbox Deadline

Epic Games’ Unreal Engine has revolutionized 3D content creation, manipulation, and rendering. With technology advances such as Lumen and Nanite in Unreal Engine 5, studios can generate high-quality visuals faster than ever before. AWS Thinkbox Deadline provides a powerful solution for scheduling and managing Unreal Engine pipelines.

Deadline integrates with Unreal Engine, allowing studios to scale the Unreal Engine pipeline across multiple AWS Regions, on-premises, or in hybrid configurations. It also supports autoscaling, rendering with GPUs, and sophisticated job management. With Deadline, studios can unlock the full potential of Unreal Engine and accelerate their production pipelines.

How KeyCore Can Help

KeyCore, the leading Danish AWS consultancy, provides professional and managed services to help media and entertainment customers get the most out of AWS. Our team of certified experts assists with setting up and running workloads on AWS, providing optimized architecture and cost-effective solutions. We can also configure, deploy, and manage partner solutions for M&E – including those found in the AWS Solutions Library – so that they are tailored to fit our customers’ specific requirements. To learn more about our services, please visit our website at https://www.keycore.dk.

Read the full blog posts from AWS

AWS Storage Blog

Everything You Need to Know About AWS Storage

Disabling Amazon S3 Access Control Lists With S3 Inventory

Amazon Simple Storage Service (S3) was launched in 2006 with Access Control Lists (ACLs) as its primary authorization mechanism. In 2011, Amazon S3 also began supporting AWS Identity and Access Management (IAM) policies for managing access to S3 buckets, and it is recommended that customers use IAM policies instead of ACLs whenever possible.

To move away from ACLs, customers can use Amazon S3 Inventory to list all the objects in their buckets. S3 Inventory can be used to analyze all the ACLs in a customer’s bucket and find objects that have overly permissive or restrictive ACLs. Additionally, S3 Inventory can be used to generate an audit trail of the objects within a customer’s bucket.
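
As an illustration, the boto3 sketch below configures a daily S3 Inventory report that includes each object’s ACL and owner; the bucket names, report prefix, and configuration ID are placeholders.

```python
import boto3

s3 = boto3.client("s3")

# Configure a daily inventory report that includes each object's ACL and
# owner, so overly permissive grants can be found before ACLs are disabled.
s3.put_bucket_inventory_configuration(
    Bucket="example-source-bucket",
    Id="acl-audit",
    InventoryConfiguration={
        "Id": "acl-audit",
        "IsEnabled": True,
        "IncludedObjectVersions": "Current",
        "Schedule": {"Frequency": "Daily"},
        "OptionalFields": ["ObjectAccessControlList", "ObjectOwner"],
        "Destination": {
            "S3BucketDestination": {
                "Bucket": "arn:aws:s3:::example-inventory-reports",
                "Format": "CSV",
                "Prefix": "inventory/",
            }
        },
    },
)
```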

Migrating DigitalOcean Spaces to Amazon S3 Using AWS DataSync

Organizations sometimes need to move large amounts of object data from one cloud service provider to another for various reasons, such as data consolidation, workload migration, disaster recovery planning, or cost optimization. In order to migrate DigitalOcean Spaces to Amazon S3, customers can use AWS DataSync to copy the data between the two services.

AWS DataSync provides customers with several key elements needed for a successful migration. This includes full encryption of data being transferred, the ability to track the progress of the migration, and the ability to pause and resume transfers as needed. AWS DataSync also allows customers to use their existing storage class policies to determine when to move data between services.
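
A hedged sketch of such a migration with the AWS SDK for Python (boto3) is shown below; the Spaces endpoint, bucket names, credentials, agent ARN, and IAM role are all placeholders, and the sketch assumes a DataSync agent with access to the source has already been deployed.

```python
import boto3

datasync = boto3.client("datasync", region_name="eu-west-1")

# Source: DigitalOcean Spaces, modeled as a self-managed object storage location.
source = datasync.create_location_object_storage(
    ServerHostname="ams3.digitaloceanspaces.com",
    BucketName="example-space",
    ServerProtocol="HTTPS",
    AccessKey="SPACES_ACCESS_KEY",
    SecretKey="SPACES_SECRET_KEY",
    AgentArns=["arn:aws:datasync:eu-west-1:123456789012:agent/agent-0example"],
)

# Destination: an Amazon S3 bucket, accessed through an IAM role (placeholder).
destination = datasync.create_location_s3(
    S3BucketArn="arn:aws:s3:::example-destination-bucket",
    S3Config={"BucketAccessRoleArn": "arn:aws:iam::123456789012:role/DataSyncS3Role"},
)

# Create and start the transfer task.
task = datasync.create_task(
    SourceLocationArn=source["LocationArn"],
    DestinationLocationArn=destination["LocationArn"],
    Name="spaces-to-s3-migration",
)
datasync.start_task_execution(TaskArn=task["TaskArn"])
```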

Simplify Multicloud Data Movement With AWS DataSync

At AWS, customers get the best experience, performance, and cost when they run their IT operations in the cloud. However, some customers may find themselves in a multicloud environment for reasons such as acquisitions or data sharing agreements. In these cases, AWS DataSync can be used to seamlessly move data between cloud services.

AWS DataSync’s key features include data encryption, trackable progress, and the ability to pause and resume transfers. An added benefit of AWS DataSync is that customers can use their existing storage class policies to determine when to move data between services.

Introducing AWS Backup Logically Air-Gapped Vault

Data loss events from ransomware or account compromise remain a top concern for customers. AWS Backup has now introduced the Logically Air-Gapped Vault to help customers share recovery points stored in AWS Backup with other accounts, including across organizations, to facilitate faster direct restores. Additionally, customers need to maintain access to the original AWS Key Management Service (KMS) Customer Master Keys (CMKs) to decrypt the data.

The Logically Air-Gapped Vault feature also allows customers to maintain control over the encryption keys used for their data. This means that AWS administrators do not have access to the customer’s data, and customers have the ability to revoke access whenever needed.

Reduce Recovery Time and Optimize Storage Costs With Faster Restores From Amazon S3 Glacier Storage Classes and Commvault

Data is increasingly important for businesses, and organizations are storing more copies of their application data than ever before. This helps them recover from data loss, repair data corruption or ransomware damage, respond to compliance requests, and become more data driven. To reduce recovery time and optimize storage costs, Amazon S3 Glacier Storage Classes and Commvault can be used to enable faster restores.

Amazon S3 Glacier Storage Classes support fast restores and allow customers to choose a retrieval option that meets their performance needs and budget. This includes retrieval options such as Expedited, Standard, and Bulk, which can be used to retrieve data ranging from 1 MB to 5 TB in size. Commvault also provides customers with additional options for restoring data, such as restoring a single file or a group of files.

At KeyCore, we are experts in AWS Storage and can help you optimize your cloud storage for the best performance, cost, and security. Our team of AWS Certified Consultants can help you configure and migrate your data, configure AWS Backup, and help you ensure your data is secure. Contact us today to find out how we can help you get the most out of your cloud storage.

Read the full blog posts from AWS

AWS Developer Tools Blog

S3 Cross-Region Access: Leverage AWS SDK for Java 2.x for Easier Inter-Region Data Access

The AWS SDK for Java 2.x team is excited to introduce a new feature: Amazon Simple Storage Service (Amazon S3) Cross-Region Client. This feature makes it much easier to access buckets in different AWS Regions with a single client.

What is the AWS SDK for Java 2.x?

The AWS SDK for Java 2.x is a collection of software development kits for Java applications that integrate with the Amazon Web Services (AWS) platform. With this SDK, developers can take advantage of AWS services such as Amazon S3, Amazon EC2, Amazon DynamoDB, and Amazon SQS. It also provides access to other AWS services such as AWS Service Catalog, AWS CloudFormation, and AWS Lambda.

How Does S3 Cross-Region Client Work?

The new S3 Cross-Region Client makes it easier to access buckets in different AWS Regions. With this feature, developers no longer need to manually switch between regions for their S3 buckets. Instead, they can now configure a single client for cross-region access and use it to access buckets in different regions.

Moreover, the Cross-Region Client supports multiple authentication mechanisms, including AWS Identity and Access Management (IAM) and Amazon Cognito. This makes it easier to securely access buckets in different regions.

Benefits of S3 Cross-Region Client

The S3 Cross-Region Client helps developers easily access buckets in different regions. This simplifies the development process and reduces the time spent on manually switching between buckets and regions. Moreover, the Cross-Region Client offers better performance than the previous version of the SDK.

KeyCore’s AWS Expertise

At KeyCore, we are experts in Amazon Web Services. Our team of professionals provides hands-on consulting and managed services to help your business leverage the AWS cloud. We can help you get the most out of the S3 Cross-Region Client and other AWS services. Contact us today to learn how we can help you set up and use the S3 Cross-Region Client.

Read the full blog posts from AWS

AWS Architecture Blog

How To Use Reusable ETL Frameworks And Monitor AWS Health Alerts At Scale To Build A Serverless Retail Solution For Endless Aisle On AWS

The Benefits of Data Lakes and Lake House Architectures

Data lakes and lake house architectures are becoming increasingly important components of the data platforms used by organizations. With them, however, come many challenges in developing them and integrating them with various source systems. This post covers these challenges and provides insight into how a reusable ETL framework can help.

Data lakes provide easy access to data in its raw form. This makes it easy to consume, manage, store, and analyze data in the same way. Furthermore, lake house architectures allow data lakes to be integrated with data warehouses, enabling the data to be stored and managed in a way that allows for seamless access and sharing among various systems.

Using a reusable ETL framework can help address the challenges faced when developing a lake house architecture. This framework creates a repeatable process to move data from source systems to the data lake, making it easier to integrate with multiple systems. It also simplifies the process of making changes to the underlying data lake architecture in the future.

Monitoring and Tracking AWS Health Alerts

Thomson Reuters is a leading provider of business information services, and it is committed to its “cloud first” strategy on AWS. As such, they needed to find a way to monitor and track AWS Health alerts at scale.

To address this challenge, Thomson Reuters developed a custom alerting system based on Amazon CloudWatch Events and Amazon SNS. This system allows them to monitor and track AWS Health alerts and receive notifications when events occur.

Furthermore, they have implemented a system for analyzing the data collected from the AWS Health events to ensure that the information is properly monitored and tracked. This system includes an automated process for collecting, processing, and analyzing the data, as well as a web-based dashboard to monitor and report on the data.
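
A comparable alerting pipeline can be sketched in a few boto3 calls, as below; the topic and rule names are placeholders, this is not Thomson Reuters’ implementation, and the SNS topic’s access policy must additionally allow EventBridge to publish to it.

```python
import boto3

events = boto3.client("events", region_name="us-east-1")
sns = boto3.client("sns", region_name="us-east-1")

# Create an SNS topic that operations teams subscribe to (name is illustrative).
topic_arn = sns.create_topic(Name="aws-health-alerts")["TopicArn"]

# Match all AWS Health events and forward them to the topic.
events.put_rule(
    Name="aws-health-to-sns",
    EventPattern='{"source": ["aws.health"]}',
    State="ENABLED",
)
events.put_targets(
    Rule="aws-health-to-sns",
    Targets=[{"Id": "sns-target", "Arn": topic_arn}],
)
```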

Building A Serverless Retail Solution for Endless Aisle

Retailers typically handle order fulfillment from start to finish, including inventory management, warehouses, and supply chains. But not all retailers are able to carry additional inventory. The “endless aisle” business model is a solution for lean retailers that want to avoid missing out on revenue.

This solution uses a serverless architecture on AWS to provide an endless aisle experience to customers, by allowing them to order items that are not in-store but are available from a warehouse. The solution includes AWS services such as Amazon API Gateway, Amazon S3, AWS Lambda, and Amazon DynamoDB to enable customers to make orders and view product information.

KeyCore Can Help

At KeyCore, we have extensive experience in helping our clients build, manage, and maintain cloud-based solutions. Our services range from professional services, such as helping clients design and implement a reusable ETL framework, to managed services, such as monitoring and tracking AWS Health alerts.

We can help you leverage AWS services to build a serverless retail solution for your endless aisle. We have the expertise to provide you with the insights and tools you need to ensure that your solution is optimized for success. Contact us today to get started.

Read the full blog posts from AWS

AWS Partner Network (APN) Blog

Realize Faster Time to Value with Modern Data Fabric Platforms on AWS

Many organizations are still managing expensive single-use case data environments, making it difficult to realize the full business value of their data. IBM’s Modern Data Accelerators on AWS can help by building a modern implementation of a data fabric architecture. This architecture standardizes data integration across the enterprise, enabling customers to realize faster time to value.

Build and Deploy Secure AI Applications with AIShield and Amazon SageMaker

Adversarial machine learning (AML) attacks involve malicious attempts to manipulate or compromise machine learning models. AIShield has been integrated with the Amazon SageMaker environment, helping to alleviate AI security concerns. It mitigates risks before and after deployment, enabling customers to develop and deploy AI applications with confidence.

Leverage AWS Analytic Services and HCLTech Frameworks for OLAP Solutions

Online analytical processing (OLAP) is a method for organizing datasets for quick analysis, providing deeper insights to decision makers. AWS analytic services and migration tools, together with HCLTech frameworks, can be used to orchestrate OLAP solutions. This approach helps customers reduce costs, enhance performance, and increase business agility.

161 AWS Partners Added or Renewed with Designations Spanning Workload, Solution, and Industry

In July, 161 AWS Partners received either new or renewed designations in the global AWS Competency, AWS Managed Service Provider (MSP), AWS Service Delivery, and AWS Service Ready programs. Designations span workload, solution, and industry, helping AWS customers identify the right AWS Partners for core business objectives. AWS Partners focus on customer success, helping customers take full advantage of the benefits of AWS.

Demystifying Mainframe Modernization with Best Practices from AWS and Accenture

Organizations looking to modernize their mainframe applications can move them to distributed or cloud platforms to reduce costs, enhance performance, and increase business agility. The Accenture AWS Business Group (AABG) has simplified this complexity with its Mainframe Zero Approach. Accenture provides comprehensive solutions to migrate and manage operations on AWS.

Streamline Secrets Management for Enhanced Security with CyberArk Secrets Hub and AWS

Organizations building on AWS need to rely on the native AWS Secrets Manager for development and operations. A jointly developed solution between CyberArk and AWS has been designed to centralize control of secrets, automate rotation, and eliminate vault sprawl. This solution makes no changes to developer workflows while providing enhanced security.

Simplify, Optimize, and Automate Cloud Operations with Kyndryl Cloud Native Services for AWS

Kyndryl Cloud Native Services for AWS (KCNS) is designed to accelerate and automate managed services for workloads leveraging AWS-native services. KCNS provides a web interface, Control Plane, for users to perform various cloud operations. This post explains how to simplify, optimize, and automate cloud operations with KCNS for AWS.

InovCares Leverages Avahi to Scale AWS Infrastructure

As applications go to market and customer workloads increase, a higher level of cloud expertise is needed to tune the AWS infrastructure. Avahi Technologies and AWS have partnered to help InovCares take on the challenge of tuning a cloud infrastructure. Avahi was well suited for this project, helping customers compete at speed by leveraging a cloud-first strategy.

Building a Secure, Reliable, and Scalable Chainlink Environment on AWS

A blockchain technology provider sought TrackIt’s assistance to deploy a scalable Chainlink environment on AWS. TrackIt leveraged a suite of AWS services to implement a customized Chainlink workflow. This post provides guidance to deploy a secure, reliable, and scalable Chainlink environment on AWS.

Tech Mahindra’s BMC Helix ITSM Deployment on AWS for End-to-End Data Protection

Tech Mahindra transitioned a segment of telecom customers from a legacy monolithic application to a self-managed microservices application utilizing Kubernetes operations and end-to-end data protection on AWS. Tech Mahindra is an AWS Premier Tier Services Partner with the Migration Consulting Competency.

Best Practices from OPSWAT to Secure AWS Applications from File-Borne Threats

The AWS Shared Responsibility Model requires security architects to take proactive measures to detect and prevent zero-day risks and other malware at the perimeter of their network. This post provides guidance on modern threat prevention technologies such as OPSWAT MetaDefender, helping to automate cybersecurity defense.

At KeyCore, we understand how difficult it can be to navigate through the complexity of AWS services. Our expert consultants are experienced in using AWS services to build secure, reliable, and scalable solutions. We specialize in helping customers take full advantage of the benefits of AWS, and ensure their applications are protected from file-borne threats. Reach out to us today to learn more about how we can help your organization succeed in the cloud.

Read the full blog posts from AWS

AWS HPC Blog

Automating Job Scheduling with AWS Batch and AWS Fargate

In this blog post, we will explore how to automate job scheduling for Docker containers using AWS Batch, AWS Fargate, and Amazon EventBridge. The solution we discuss here is fully managed, serverless, and event-driven, meaning it is ideal for organizations that need to schedule batch jobs for containers.

What is AWS Batch?

AWS Batch is a fully managed service for running batch computing jobs on AWS. It lets users define, queue, and run jobs in the cloud without provisioning or managing compute infrastructure, freeing up resources for other workloads. AWS Batch also supports job dependencies, automatic retries, and job timeouts.

What is AWS Fargate?

AWS Fargate is a serverless compute engine for running containers on AWS. With Fargate, users run their containers without managing servers, and capacity scales automatically up and down to meet changing workload demands, making it a cost-effective and highly scalable option for running containers in production.

What is Amazon EventBridge?

Amazon EventBridge is a serverless event bus that makes it easy to connect applications with one another and with AWS services such as AWS Batch and AWS Fargate, for example to trigger automated workflows. EventBridge is designed to be highly reliable and is backed by an AWS service level agreement.

How Does It Work?

Using AWS Batch, AWS Fargate, and Amazon EventBridge, users can create a job scheduling solution for their containers. First, users define their job definitions in AWS Batch and set up their job dependencies. Next, users configure an Amazon EventBridge rule to trigger the job on a schedule or in response to an event. Finally, users point AWS Batch at a Fargate compute environment so the job runs on Fargate. When the EventBridge rule fires, the job is submitted to the queue and executed on AWS Fargate.
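
As a rough illustration of the last two steps, the following TypeScript sketch uses the AWS SDK for JavaScript v3 to create a scheduled EventBridge rule and point it at an existing Batch job queue whose compute environment runs on Fargate. The rule name, schedule, ARNs, and job definition are placeholders, not values from the original post.

```typescript
// Sketch: schedule a container job on AWS Batch (Fargate) via EventBridge.
// All names and ARNs are placeholders; adjust for your own account.
import {
  EventBridgeClient,
  PutRuleCommand,
  PutTargetsCommand,
} from "@aws-sdk/client-eventbridge";

const events = new EventBridgeClient({ region: "eu-west-1" });

async function scheduleBatchJob(): Promise<void> {
  // 1. Create (or update) a rule that fires once an hour.
  await events.send(
    new PutRuleCommand({
      Name: "hourly-container-job",
      ScheduleExpression: "rate(1 hour)",
      State: "ENABLED",
    })
  );

  // 2. Point the rule at an existing Batch job queue and job definition.
  //    The role must allow EventBridge to call batch:SubmitJob.
  await events.send(
    new PutTargetsCommand({
      Rule: "hourly-container-job",
      Targets: [
        {
          Id: "batch-target",
          Arn: "arn:aws:batch:eu-west-1:111122223333:job-queue/fargate-queue",
          RoleArn: "arn:aws:iam::111122223333:role/eventbridge-batch-role",
          BatchParameters: {
            JobDefinition: "my-container-job:1",
            JobName: "scheduled-container-job",
          },
        },
      ],
    })
  );
}

scheduleBatchJob().catch(console.error);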

Benefits of Using AWS Batch, AWS Fargate, and Amazon EventBridge

Using AWS Batch, AWS Fargate, and Amazon EventBridge to automate job scheduling for containers provides a number of benefits.

First, the solution is fully managed, meaning users don’t have to manage any infrastructure. This makes the solution easier to set up, maintain, and scale.

Second, the solution is serverless, meaning users don’t have to worry about managing or scaling servers. This makes the solution more reliable and cost-effective.

Finally, the solution is event-driven, meaning it can be triggered by a variety of events. This makes it easier for users to integrate their applications and services.

How Can KeyCore Help?

At KeyCore, we provide professional services and managed services to help customers leverage the power of AWS. Our team of experienced AWS Certified Solutions Architects can help you set up and maintain a job scheduling solution for your containers. We can help you define job definitions, set up job dependencies, and set up your AWS Fargate tasks. We can also help you integrate your applications and services with Amazon EventBridge. Contact us today to learn more about how we can help you automate job scheduling for your containers.

Read the full blog posts from AWS

AWS Cloud Operations & Migrations Blog

AWS Named as a Challenger in the 2023 Gartner Magic Quadrant for Application Performance Monitoring and Observability

AWS is proud to have been named as a Challenger for the second consecutive year in the 2023 Gartner Application Performance Monitoring (APM) and Observability Magic Quadrant. Gartner evaluates vendors based on their Ability to Execute and Completeness of Vision, and the report provides insight into a vendor’s capabilities as well as their approach to the market.

APM and observability tools are powerful analytics platforms that ingest multiple data sources to provide visibility into the performance and health of applications. These tools provide insight into application performance, infrastructure performance, and customer experience, enabling teams to identify and fix performance issues quickly. With AWS, customers can benefit from the scalability and agility of the cloud, and tools such as Amazon CloudWatch and AWS X-Ray, while also taking advantage of the breadth and depth of third-party tools available in the AWS Marketplace.
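
As a small, hedged example of what instrumentation can look like in practice, the sketch below wraps an AWS SDK for JavaScript v3 client with the X-Ray SDK so each call shows up as a subsegment in a trace. The bucket name is a placeholder, and the code assumes it runs somewhere an X-Ray segment is already open, such as a Lambda function with active tracing enabled.

```typescript
// Hedged sketch: tracing AWS SDK for JavaScript v3 calls with AWS X-Ray.
import * as AWSXRay from "aws-xray-sdk-core";
import { S3Client, ListObjectsV2Command } from "@aws-sdk/client-s3";

// Wrap the client so every call appears as a subsegment in the trace.
const s3 = AWSXRay.captureAWSv3Client(new S3Client({ region: "eu-west-1" }));

export const handler = async (): Promise<number> => {
  const result = await s3.send(
    new ListObjectsV2Command({ Bucket: "example-bucket" }) // placeholder bucket
  );
  return result.KeyCount ?? 0;
};
```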

Evaluate Custom Configurations Using AWS Config Custom Policy Rules and the Open Source Sample Repository

Organizations that have custom configuration requirements for their resources may find it challenging to compare actual resource configuration settings against their established requirements. To help customers address this challenge, AWS has published a new public repository of sample AWS Config custom rules using AWS CloudFormation Guard. This repository contains custom policy rules that customers can use to evaluate their resource configurations.

AWS Config is a fully managed service that enables customers to assess, audit, and evaluate the configurations of their AWS resources. Additionally, AWS Config Custom Rules allow customers to define their own configuration settings for resources and manage the compliance of those resources in the AWS environment. The sample repository provides a collection of reusable rules that customers can use to get started with AWS Config.

Using the open source sample repository, customers can quickly evaluate their resources against the custom rules they have defined. This helps to ensure resources meet their desired configuration and any potential issues can be identified and remediated quickly, reducing costs and improving the customer experience.
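
To illustrate what registering such a rule can look like outside the console, here is a hedged TypeScript sketch using the AWS SDK for JavaScript v3. The rule name and Guard policy are illustrative examples, not rules taken from the sample repository.

```typescript
import {
  ConfigServiceClient,
  PutConfigRuleCommand,
} from "@aws-sdk/client-config-service";

// Illustrative Guard policy: flag DynamoDB tables without point-in-time recovery.
const guardPolicy = `
rule check_pitr when resourceType IN ["AWS::DynamoDB::Table"] {
  configuration.pointInTimeRecoveryDescription.pointInTimeRecoveryStatus == "ENABLED"
}
`;

const config = new ConfigServiceClient({ region: "eu-west-1" });

async function registerCustomPolicyRule(): Promise<void> {
  await config.send(
    new PutConfigRuleCommand({
      ConfigRule: {
        ConfigRuleName: "dynamodb-pitr-enabled", // illustrative name
        Scope: { ComplianceResourceTypes: ["AWS::DynamoDB::Table"] },
        Source: {
          Owner: "CUSTOM_POLICY",
          SourceDetails: [
            {
              EventSource: "aws.config",
              MessageType: "ConfigurationItemChangeNotification",
            },
          ],
          CustomPolicyDetails: {
            PolicyRuntime: "guard-2.x.x",
            PolicyText: guardPolicy,
          },
        },
      },
    })
  );
}

registerCustomPolicyRule().catch(console.error);
```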

At KeyCore, we understand the complexities of managing custom configuration requirements and assessing resource compliance, and are here to help. Our experienced AWS engineers have the skills and knowledge needed to help you get the most out of AWS Config and CloudFormation Guard. Whether you need help setting up your own custom policy rules, or if you need guidance on the best practices for evaluating resource configurations, our team is here to help. Contact us today to find out more.

Read the full blog posts from AWS

AWS for Industries

How AI and AWS Help Industries Optimize Demand Forecasting and Boost Sell-Through

Understanding customer needs and preferences is a science. To craft the perfect product mix, businesses need to dive deep into customer data and extract valuable insights. Amazon Web Services (AWS) offers merchandising and planning solutions which give businesses the ability to make data-driven decisions and optimize their demand forecasting.

Next Generation CPE Command and Control Architectures on AWS

Communication Service Providers (CSPs) manage millions of Customer Premises Equipment (CPE) such as broadband routers and Wi-Fi gateways. The Broadband Forum’s TR-069 (CPE WAN Management Protocol or CWMP) is the traditional standard for device management and configuration. However, it has limited capabilities and cannot support more advanced use cases.

How an Open Hardware Platform for Automotive Applications Can Transform the Industry

Similar to the story of the IBM PC, an open hardware platform for automotive applications can create an environment driving innovation, disruption, and transformation for the automotive industry. This platform offers advantages for the first automotive company to adopt it, such as greater robustness, lower costs, and improved scalability. Furthermore, it increases the security and privacy of data generated by vehicles.

For businesses looking to leverage AI and AWS to optimize their demand forecasting and boost sell-through, KeyCore can offer specialized professional and managed services. Through our extensive knowledge of the AWS platform, we can help you identify the best solutions for your specific use case. Likewise, our team has the experience to help you migrate to a next-generation CPE system with better security, privacy, and scalability. We can also provide assistance regarding open hardware platforms for automotive applications, helping you take advantage of the opportunities it offers. Contact us today to learn more.

Read the full blog posts from AWS

AWS Messaging & Targeting Blog

Send SMS Using Amazon Pinpoint Configuration Sets

This blog post explains how to use Amazon Pinpoint configuration sets to send SMS messages. Configuration sets make it easy to manage different use cases, such as marketing messages and One-Time Password (OTP) or Multi-Factor Authentication (MFA) messages, from one place. In this post, we’ll walk through how to use configuration sets with Amazon Pinpoint.

Create a Configuration Set

The first step is to create a configuration set. This can be done through the Amazon Pinpoint console. Begin by clicking on the “Configuration sets” option in the left-hand menu. Then, click on the “Create a configuration set” button. This will open a dialog where you can give the configuration set a name and description. After that, click “Create configuration set” to save the configuration set.

Add Opt-Outs to the Configuration Set

The next step is to add opt-outs to the configuration set. This can be done through the Amazon Pinpoint console. Begin by clicking on the “Opt-outs” option in the left-hand menu. Then, click on the “Create an opt-out” button. This will open a dialog where you can enter the phone number, as well as a description for the opt-out. After that, click “Create opt-out” to save the opt-out.

Send an SMS Message with the Configuration Set

Once the configuration set is created and opt-outs are added, you can send an SMS message with the configuration set. This can be done through the Amazon Pinpoint console. Begin by selecting the configuration set you created in the previous step. Then, click on the “Send an SMS message” button. This will open a dialog where you can enter the phone number, as well as the message to be sent. After that, click “Send message” to send the message.
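
For teams that prefer to script these steps, a minimal sketch using the AWS SDK for JavaScript v3 might look like the following. It assumes the configuration set is used through the Pinpoint SMS and Voice v2 API, and the phone numbers and configuration set name are placeholders.

```typescript
import {
  PinpointSMSVoiceV2Client,
  SendTextMessageCommand,
} from "@aws-sdk/client-pinpoint-sms-voice-v2";

const client = new PinpointSMSVoiceV2Client({ region: "eu-west-1" });

async function sendOtpSms(): Promise<void> {
  await client.send(
    new SendTextMessageCommand({
      DestinationPhoneNumber: "+4512345678", // placeholder recipient
      OriginationIdentity: "+4587654321",    // placeholder sender number or pool
      MessageBody: "Your one-time code is 123456",
      MessageType: "TRANSACTIONAL",
      ConfigurationSetName: "otp-config-set", // the configuration set created above
    })
  );
}

sendOtpSms().catch(console.error);
```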

How KeyCore Can Help

At KeyCore, we understand the importance of messaging and targeting in today’s digital landscape. Our team of AWS certified engineers can help you to set up Amazon Pinpoint and configure configuration sets for maximum efficiency. Contact us today to learn more about our services.

Read the full blog posts from AWS

The latest AWS security, identity, and compliance launches, announcements, and how-to posts.

AWS Identity Solutions Team: What They Do and How KeyCore Can Help

The AWS Identity Solutions team is a group of specialist solutions architects and engineers working on AWS security, identity, and compliance. The team is led by Senior Manager Ilya Epshteyn, and Principal Solutions Architect Remek Hetman is part of the team. In this post, we explore the work of the Identity Solutions team and discuss how KeyCore can help.

What Does the Identity Solutions Team Do?

The Identity Solutions team builds solutions for secure authentication and authorization (AuthN/AuthZ) for customers on AWS. This includes providing secure access to AWS services, creating roles to manage user permissions, and building federated identity solutions. The team also develops technologies for data protection, including encryption and key management systems, as well as solutions for providing secure access to applications and data.

How KeyCore Can Help

KeyCore is a leading provider of professional AWS services and managed services. We specialize in helping customers with their AWS security, identity, and compliance needs. Our team can help you assess your security posture, plan your identity and access control strategy, and develop solutions using AWS identity services. In addition, we can help you implement data protection measures such as encryption and key management, and create authentication systems for secure access to applications and data.

To learn more about our services and how we can help you with your AWS security, identity, and compliance needs, visit our website at https://www.keycore.dk.

Read the full blog posts from AWS

Business Productivity

Introduction to Building a Dashboard for Amazon Chime SDK Voice Connectors

Amazon Chime SDK Voice Connector is a cloud-based service that provides Session Initiation Protocol (SIP) trunking for voice calling. With the help of Amazon CloudWatch’s powerful monitoring capabilities, you can proactively identify and address any issues that may arise, helping to ensure high-quality and uninterrupted voice communication. By creating a dashboard for Amazon Chime SDK Voice Connectors, system admins can monitor the overall health of their voice communication system, as well as look into specific metrics for troubleshooting and debugging.

Benefits of Monitoring Amazon Chime SDK Voice Connectors with CloudWatch

Monitoring Amazon Chime SDK Voice Connectors with CloudWatch provides several benefits. It allows admins to be proactive in identifying and addressing problems, as well as gain better understanding of the system’s overall health. This is especially important for voice communication systems, which can be adversely affected if any of the underlying components are not functioning properly.

By using CloudWatch, admins can monitor the overall health of their voice communication system, troubleshoot and debug specific metrics, receive alerts when certain conditions are met, view performance data and metrics in real time, and identify potential issues early enough to take preventative measures.
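
As a hedged sketch of how such a dashboard could be created programmatically, the following TypeScript example uses the AWS SDK for JavaScript v3 to publish a one-widget dashboard. The metric namespace, metric name, and Voice Connector ID are placeholders and should be replaced with the Voice Connector metrics referenced in the blog post.

```typescript
import {
  CloudWatchClient,
  PutDashboardCommand,
} from "@aws-sdk/client-cloudwatch";

const cloudwatch = new CloudWatchClient({ region: "us-east-1" });

// Namespace, metric name, and Voice Connector ID below are placeholders.
const dashboardBody = {
  widgets: [
    {
      type: "metric",
      x: 0,
      y: 0,
      width: 12,
      height: 6,
      properties: {
        title: "Voice Connector health",
        region: "us-east-1",
        stat: "Sum",
        period: 300,
        metrics: [
          ["AWS/ChimeVoiceConnector", "OutboundCallAttempts", "VoiceConnectorId", "abcdef123"],
        ],
      },
    },
  ],
};

async function createDashboard(): Promise<void> {
  await cloudwatch.send(
    new PutDashboardCommand({
      DashboardName: "voice-connector-health",
      DashboardBody: JSON.stringify(dashboardBody),
    })
  );
}

createDashboard().catch(console.error);
```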

How KeyCore Can Help

At KeyCore, our team of AWS professionals can help you create a dashboard for Amazon Chime SDK Voice Connectors using CloudWatch. Our experts can help you optimize your voice communication system to ensure it is always running smoothly and efficiently. We can provide guidance on the best practices for monitoring your voice communication system, as well as help you identify and address any underlying issues. Our team can also help you troubleshoot and debug any problems that arise. With our help, you can ensure the highest quality of voice communication.

Read the full blog posts from AWS

Front-End Web & Mobile

Understanding Server-Side Rendering (SSR) and Static Site Generation (SSG) with Next.js

Next.js is a popular React framework that is changing the way developers build modern web applications. It offers powerful features, such as Server-Side Rendering (SSR) and Static Site Generation (SSG), that optimize the performance and user experience of your application. But which one should you choose?

What is Server-Side Rendering (SSR)?

Server-Side Rendering (SSR) is a process where the server generates the HTML for a webpage from a provided template and content. When a user visits a page, the server will dynamically generate the HTML and return it to the user’s browser. This allows for dynamic content that changes based on user input to be displayed on the page.

What is Static Site Generation (SSG)?

Static Site Generation (SSG) is a process that generates static HTML pages from content and templates. This allows for content to be generated without relying on server-side code and can even be done ahead of time. SSG is typically used for sites that are content-heavy and don’t require a lot of user data.

Advantages of SSG vs. SSR

The advantages of SSG over SSR are that it is faster, more secure, and more cost-effective. Since the HTML is generated ahead of time, the server does not need to render the page on every request, resulting in faster page loads. Additionally, because no application code runs at request time, the attack surface for injecting malicious code into the response is much smaller. Finally, since there is no server-side rendering work per request, the cost of running an SSG-powered website is typically lower.

Advantages of SSR vs. SSG

The advantages of SSR over SSG are that it is more dynamic and allows for more customization. Since the HTML is generated on the server at the time of the page load, it allows for dynamic content to be served up to the user based on user input. Additionally, since the server is generating the HTML, it allows for more customization of the output.

When Should You Use SSG or SSR in Next.js?

When deciding between SSG and SSR in Next.js, it is important to consider the type of website you are building. If you are building a content-heavy website that doesn’t need to dynamically change based on user input, then SSG is the best choice. On the other hand, if you need to serve up dynamic content based on user input, then SSR is the better option.
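
As a minimal illustration of the two approaches in a pages-router Next.js project, the sketch below shows a statically generated page and, in comments, what the server-side rendered variant would export instead; the page name and data are hypothetical.

```typescript
// pages/products.tsx – a statically generated page (SSG).
// Page name and data are hypothetical.
import type { GetStaticProps, NextPage } from "next";

type Props = { items: string[] };

const ProductsPage: NextPage<Props> = ({ items }) => (
  <ul>
    {items.map((item) => (
      <li key={item}>{item}</li>
    ))}
  </ul>
);

// SSG: runs at build time; `revalidate` re-generates the page in the background.
export const getStaticProps: GetStaticProps<Props> = async () => {
  const items = ["laptop", "monitor", "keyboard"]; // placeholder content
  return { props: { items }, revalidate: 3600 };
};

export default ProductsPage;

// SSR alternative: a page that must react to every request would export
// getServerSideProps instead (a page cannot export both):
//
// import type { GetServerSideProps } from "next";
//
// export const getServerSideProps: GetServerSideProps<Props> = async (ctx) => {
//   const items = await fetchItemsForUser(ctx.req.headers.cookie); // hypothetical helper
//   return { props: { items } };
// };
```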

KeyCore Can Help

At KeyCore, our team of experienced AWS consultants can help you determine which rendering approach is best for your Next.js web application. We can help you leverage the power of AWS to optimize your application’s performance and user experience. We have the expertise to help you set up SSR and SSG in Next.js, as well as implement CloudFormation YAML, the AWS API using Typescript, and the AWS SDK for JavaScript v3. Contact us today to learn more about how we can help your business succeed.

Read the full blog posts from AWS

AWS Contact Center

AWS Contact Center – Gartner Recognizes AWS as a Leader in 2023 Magic Quadrant for Contact Center as a Service

AWS has been named a Leader in the 2023 Gartner Magic Quadrant for Contact Center as a Service, a recognition based on Amazon Connect, the AI-powered cloud contact center launched in 2017. With this recognition, AWS is well positioned to provide organizations of all sizes with the cloud resources needed to power next-generation contact centers.

What is Contact Center as a Service?

Contact Center as a Service (CCaaS) is an all-in-one communications tool, allowing organizations to provide customer service over the phone and via digital channels such as SMS, email, web chat, and social media. It includes a suite of features such as automated call routing, voice recognition, and AI-driven analytics.

How Does it Benefit Organizations?

By leveraging Cloud resources, organizations can reduce hardware and telecommunications costs, improve customer service, and improve operational performance. AWS’s recognition as a Leader in the 2023 Gartner Magic Quadrant for CCaaS is a testament to the stability and scalability of Amazon Connect and Cloud-based contact center solutions.

Event Based Outbound Campaigns with Amazon Connect

Organizations use contact centers to answer inbound calls and initiate outbound communication to their customers. Outbound campaigns can include appointment reminders, telemarketing, subscription renewals, billing reminders, and follow-up calls. Amazon Connect allows organizations to customize communication channels and use event-based triggers to initiate outbound campaigns. Additionally, Amazon Connect provides features such as automated call routing, voice recognition, and AI-driven analytics.
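
A hedged sketch of the event-based pattern is shown below: an AWS Lambda handler, triggered by an EventBridge event, starts an outbound call through Amazon Connect using the AWS SDK for JavaScript v3. The event shape, instance ID, contact flow ID, and phone numbers are all placeholders.

```typescript
// Hedged sketch: Lambda handler that starts an outbound call when an
// EventBridge event (e.g. an appointment reminder) arrives.
import {
  ConnectClient,
  StartOutboundVoiceContactCommand,
} from "@aws-sdk/client-connect";
import type { EventBridgeEvent } from "aws-lambda";

const connect = new ConnectClient({ region: "eu-west-1" });

// Hypothetical event detail published by a booking system.
type AppointmentReminder = { phoneNumber: string; appointmentTime: string };

export const handler = async (
  event: EventBridgeEvent<"AppointmentReminder", AppointmentReminder>
): Promise<void> => {
  await connect.send(
    new StartOutboundVoiceContactCommand({
      InstanceId: "11111111-2222-3333-4444-555555555555",    // placeholder
      ContactFlowId: "66666666-7777-8888-9999-000000000000", // placeholder
      DestinationPhoneNumber: event.detail.phoneNumber,
      SourcePhoneNumber: "+4570000000",                      // placeholder
      Attributes: { appointmentTime: event.detail.appointmentTime },
    })
  );
};
```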

How KeyCore Can Help

At KeyCore, our team of AWS certified consultants can provide guidance and expertise to help organizations transition their contact centers to the Cloud. We specialize in Amazon Connect and can help with the setup, integration, and optimization of your contact center. Our managed services team can also provide 24/7 monitoring and help desk support to ensure that your contact center performs optimally. Contact us today to learn more about how KeyCore can help your organization.

Read the full blog posts from AWS

Innovating in the Public Sector

Scaling Intelligent Document Processing Workflows with AWS AI

As the daily volume of document submissions grows for government organizations, it becomes increasingly important to process these documents quickly and accurately. Intelligent Document Processing (IDP) solutions built on AWS AI services can be used to meet these requirements. When the processing volume exceeds the resources available in a single AWS Region, organizations can distribute the workload across multiple Regions to increase the throughput of document processing.

This post presents high-level architecture guidance on building a distributed document processing workload with Amazon Comprehend. The solution is designed to absorb unpredictable request patterns so that users experience minimal delays or other impacts.
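
A very small TypeScript sketch of the fan-out idea follows; round-robin across a fixed Region list is a simplification of the architecture described in the post, and the Region list and sample text are placeholders.

```typescript
import {
  ComprehendClient,
  DetectEntitiesCommand,
} from "@aws-sdk/client-comprehend";

// One client per Region the workload is distributed across (placeholder list).
const regions = ["us-east-1", "us-west-2", "eu-west-1"];
const clients = regions.map((region) => new ComprehendClient({ region }));

let next = 0;

// Round-robin each document across the regional endpoints.
async function detectEntities(text: string) {
  const client = clients[next];
  next = (next + 1) % clients.length;
  return client.send(
    new DetectEntitiesCommand({ Text: text, LanguageCode: "en" })
  );
}

async function processBatch(documents: string[]): Promise<void> {
  const results = await Promise.all(documents.map(detectEntities));
  results.forEach((r, i) =>
    console.log(`doc ${i}: ${r.Entities?.length ?? 0} entities`)
  );
}

processBatch(["Sample submission text ..."]).catch(console.error);
```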

Extracting, Analyzing, and Interpreting Information from Medicaid Forms with AWS

Paper-based Medicaid forms often require significant manual effort to process and analyze. Using AI and ML services from AWS, Medicaid agencies can create a streamlined solution that can process paper forms at the same speed as digital forms. By extracting, analyzing, and interpreting the relevant information from paper-based Medicaid claims forms, agencies can gain valuable insights in near real-time. With this post, you can learn how to take advantage of this powerful technology.
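
The post does not prescribe specific services, but a common building block for this kind of extraction is Amazon Textract. A hedged sketch of analyzing a scanned form stored in Amazon S3 might look like the following; the bucket, object key, and the choice of Textract itself are assumptions made for illustration.

```typescript
import {
  TextractClient,
  AnalyzeDocumentCommand,
} from "@aws-sdk/client-textract";

const textract = new TextractClient({ region: "us-east-1" });

async function analyzeClaimForm(): Promise<void> {
  const result = await textract.send(
    new AnalyzeDocumentCommand({
      Document: {
        S3Object: { Bucket: "claims-intake-bucket", Name: "scans/form-001.png" }, // placeholders
      },
      FeatureTypes: ["FORMS"], // extract key-value pairs from the form
    })
  );

  // Count the detected form fields; downstream analysis would consume these blocks.
  const fields = (result.Blocks ?? []).filter(
    (b) => b.BlockType === "KEY_VALUE_SET"
  );
  console.log(`Detected ${fields.length} key/value blocks`);
}

analyzeClaimForm().catch(console.error);
```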

New Immersive Cloud Training to Advance Public Sector Healthcare and Beyond

In response to customer and partner feedback, AWS has launched AWS Industry Quest: Healthcare. This interactive learning experience teaches developers how to build cloud solutions that can benefit healthcare customers across the globe. It was developed and tested with customers and partners in the Latin America, Canada, and Caribbean (LCC) region.

AWS Launches $20 Million K12 Cyber Grant Program

AWS is joining the White House, the Department of Homeland Security, the Department of Education, and other leaders in government, industry, and the education community to improve the cybersecurity resilience of K12 education communities. In addition to our existing collaborations with K12 education communities, state departments of education, teaching and learning companies, and EdTechs, AWS is committing $20 million for a K12 Cyber Grant Program that is available to all K12 school districts and state departments of education.

How KeyCore Can Help

At KeyCore, we understand the importance of AWS for public sector organizations. As the leading Danish AWS consultancy, we provide professional and managed services to ensure that your organization is able to take advantage of the latest AWS technology. Our team can help you develop and deploy AWS-based solutions, such as the Intelligent Document Processing (IDP) workload described above. We can also assist you in setting up the cloud infrastructure for the K12 Cyber Grant Program. With our expertise, you can maximize the efficiency of your public sector organization. Contact us today to learn more.

Read the full blog posts from AWS

AWS Open Source Blog

OCSF Is A Security-Focused Open Source Project

The Open Cybersecurity Schema Framework (OCSF) is an open-source project focused on simplifying the process of managing security telemetry data. It was developed to make it easier for security practitioners to create solutions that require secure data handling.

In celebration of OCSF’s one-year anniversary, the team released version 1.0.0 of the framework. In the year since the project launched, OCSF has had a great impact on the security industry.

The Project Has Impacted Security Teams Around the Globe

The OCSF project has been used by security teams throughout the world to quickly develop solutions that require secure data handling. This project has made it simpler to meet the demands of the security industry, and has helped to make the world a safer place.

One of the key features of OCSF is that it provides an easy-to-use schema for collecting and managing security telemetry data. By making the process of collecting and managing security data easier, OCSF has reduced the amount of time and effort that security teams need to spend managing their data.
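
To give a feel for what schema-conformant telemetry looks like, here is a small, hedged TypeScript sketch of an event shaped after OCSF’s common attributes; the field set and the numeric identifiers used here are illustrative assumptions and should be checked against the published v1.0.0 schema.

```typescript
// Illustrative only: a minimal event shaped after OCSF common attributes.
// Verify field names and numeric IDs against the published OCSF v1.0.0 schema.
interface OcsfLikeEvent {
  class_uid: number;      // event class identifier
  category_uid: number;   // category the class belongs to
  activity_id: number;    // activity within the class
  severity_id: number;    // normalized severity
  time: number;           // event time, epoch milliseconds
  metadata: {
    version: string;      // schema version the event claims to follow
    product: { name: string; vendor_name: string };
  };
  [key: string]: unknown; // class-specific attributes
}

const loginEvent: OcsfLikeEvent = {
  class_uid: 3002,        // assumed: Authentication
  category_uid: 3,        // assumed: Identity & Access Management
  activity_id: 1,         // assumed: Logon
  severity_id: 1,
  time: Date.now(),
  metadata: {
    version: "1.0.0",
    product: { name: "example-idp", vendor_name: "ExampleCorp" },
  },
  user: { name: "alice" },
};

console.log(JSON.stringify(loginEvent, null, 2));
```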

OCSF Has Enabled Security Teams To Streamline Their Efficiency

With OCSF, teams can quickly collect, store, and manage their security data in a common format and integrate new security tooling into their existing infrastructure with far less custom mapping work. The result is less time spent normalizing telemetry and more time spent building and operating the solutions that depend on it.

KeyCore Can Help You Take Advantage of OCSF

At KeyCore, we are your go-to source for AWS-based solutions. Our team of experienced AWS professionals can help you take advantage of the OCSF project and create solutions that require secure data handling.

We can also provide you with the expert advice you need to develop secure data handling solutions quickly and efficiently. Our team of AWS-certified consultants can help you develop the right solution for your security needs and ensure that your data remains secure.

When you need the best AWS-based solutions, KeyCore is here to help. Contact us today to learn more about how we can help you take advantage of the OCSF project and create secure data handling solutions.

Read the full blog posts from AWS
