Summary of AWS blogs for the week of Monday, May 29

In the week of Monday, May 29, 2023, AWS published 106 blog posts. Here is an overview of what happened.

Topics Covered

Desktop and Application Streaming

Cloud Computing for Growing Businesses with AWS Desktop and Application Streaming

Desktop and Application Streaming services from AWS provide a secure, reliable, agile, and cost-optimized way for businesses to scale and meet their growing needs. In this article, we’ll look at how SoFi, a financial technology company, uses Amazon WorkSpaces and AWS Systems Manager to meet customer demand. Additionally, we’ll discuss the free workshops provided by AWS End User Computing (EUC) services.

Managing Amazon WorkSpaces at Scale with AWS Systems Manager

SoFi, a financial technology company providing a range of financial products and services, needed to scale to meet their growing business needs. Amazon WorkSpaces gave them the ability to quickly and easily provision virtual desktops and applications to a distributed workforce. AWS Systems Manager provided visibility and control of their environment, allowing SoFi to better manage and maintain their Amazon WorkSpaces. With the increased visibility and control, SoFi was better able to scale their Desktop as a Service (DaaS) environment, while also improving security, user experience, and operational savings.

AWS Systems Manager allowed SoFi to keep their Amazon WorkSpaces secure, reliable, and up-to-date. Security compliance automation was achieved by using AWS Systems Manager to detect misconfigurations and missing patches. Automated patching was enabled by setting up automated patch baselines that would manage and maintain the Amazon WorkSpaces. AWS Systems Manager also provided SoFi with the ability to easily track inventory, automate tasks, and run remote commands.
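As a rough sketch of how such a compliance check might look with the AWS SDK for Python (boto3), the function below queries the Systems Manager DescribeInstancePatchStates API for instances with missing patches. A stub client stands in for boto3.client("ssm") so the example runs without AWS credentials, and the instance IDs are made up.

```python
def noncompliant_instances(ssm_client, instance_ids):
    """Return IDs of instances that have missing patches, using the
    SSM DescribeInstancePatchStates response shape."""
    resp = ssm_client.describe_instance_patch_states(InstanceIds=instance_ids)
    return [s["InstanceId"] for s in resp["InstancePatchStates"]
            if s.get("MissingCount", 0) > 0]

class FakeSSM:
    """Stub standing in for boto3.client("ssm") in this sketch."""
    def describe_instance_patch_states(self, InstanceIds):
        return {"InstancePatchStates": [
            {"InstanceId": "i-aaa", "MissingCount": 0},
            {"InstanceId": "i-bbb", "MissingCount": 3},
        ]}

print(noncompliant_instances(FakeSSM(), ["i-aaa", "i-bbb"]))  # ['i-bbb']
```

Against a real environment, the same function works unchanged with boto3.client("ssm") in place of the stub.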

With AWS Systems Manager, SoFi was able to improve their user experience by providing users with a simple and consistent way to access their Amazon WorkSpaces. Remote sessions to troubleshoot user issues were made simpler and faster with the ability to run commands remotely. Additionally, SoFi was able to reduce operational costs with the centralized management of patches and configurations.

Getting the Most Out of AWS EUC Services with Free Workshops

AWS End User Computing (EUC) services, including Amazon WorkSpaces and Amazon AppStream 2.0, have been providing value to organizations of all sizes for nearly a decade. Customers can trust AWS to provide a secure, reliable, agile, and cost-optimized solution for use cases such as remote workers, contractors, contact center agents, VDI replacement, developers, and users running resource-intensive applications.

To help customers get the most out of their EUC services, AWS provides free workshops and webinars that highlight the key features and use cases of the service. These workshops and webinars feature technical professionals who can help customers find the best solutions to their use cases. They also provide valuable insight into how AWS EUC services can help reduce costs, improve security, and optimize performance.

KeyCore Can Help

At KeyCore, we understand the importance of cloud computing and desktop and application streaming for growing businesses. Our team of AWS certified professionals can provide you with the expertise and tools you need to optimize your cloud computing environment. We offer professional services to help you move to the cloud, as well as managed services to help you maintain and monitor your environment.

We can also provide workshops and webinars to help you understand the features, use cases, and best practices of AWS EUC. Contact us today to learn more about how we can help you get the most out of your cloud computing environment.

Read the full blog posts from AWS

AWS DevOps Blog

AWS Cloud Development Kit (AWS CDK) v1 Reaches End-of-Support and Amazon CodeWhisperer Optimizes Software Development

AWS Cloud Development Kit (AWS CDK) v1 Reaches End-of-Support

Since its introduction in 2018, AWS CDK has been used by developers to construct Infrastructure as Code solutions. As technology progresses, older versions of tools must reach their end-of-support. AWS CDK v1 officially reached its end-of-support on June 1, 2023. AWS will no longer support v1 and users are encouraged to upgrade to the latest version. The newer version of CDK includes new features such as support for additional programming languages, more resources, and enhanced functionality. Upgrading to the new version of CDK can help developers build better and more efficient Infrastructure as Code solutions.

Amazon CodeWhisperer Optimizes Software Development

Businesses are striving to deliver new capabilities to their customers faster. To achieve this goal, developers must leverage automation to accelerate software development and optimize code quality, performance, and compliance. Amazon CodeWhisperer is an AI coding companion, trained on billions of lines of Amazon and open-source code. It helps developers write code with confidence and reduces rework and errors by providing personalized recommendations for code optimization. Amazon CodeWhisperer is a powerful tool for developers and businesses who want to take their software development to the next level.

How KeyCore Can Help

KeyCore is the leading Danish AWS consultancy that provides both professional and managed services. Our team of AWS experts can help your business optimize software development using AWS CDK and Amazon CodeWhisperer. Whether you need help understanding the features available in the latest version of CDK, or would like to learn more about Amazon CodeWhisperer and how it can help your development process, KeyCore can assist you. To find out more about our AWS services, please visit our website at KeyCore.dk.

Read the full blog posts from AWS

AWS for SAP

Automate High Availability Tests for SAP HANA

The software development and operations industry has been modernizing, with DevOps increasingly the standard approach. Despite this, SAP installation and operations tend to be manually driven. To help evolve the process toward an automated approach, our first blog post demonstrated how to provision the infrastructure for SAP systems.

In this blog post, we will discuss how to automate the tests for high availability (HA) with AWS services and tools. High availability is the ability of a system to operate without interruption for a long period of time. In this context, we will explain the HA tests for SAP HANA systems and how to use AWS services and tools to automate them.

AWS Services and Tools

AWS offers a range of services and tools to help you automate your HA tests. Amazon CloudWatch is a monitoring service that can be used to monitor the health of your system and detect issues. AWS Step Functions can be used to define the steps required to run a test and automate them. AWS CloudFormation enables you to model and provision your infrastructure as code, while AWS Lambda can be used to execute the code.

SAP HANA HA Tests

The HA tests for SAP HANA systems involve simulating a system failure and validating that the system can recover from it. It is important to ensure that the system can fail over to the secondary node without any data loss. The steps for the test include:

1. Take a backup of the primary system.
2. Shut down the primary system.
3. Restore the backup to the secondary system.
4. Validate that the secondary system is running properly.

Automating the HA Tests

Using the AWS services and tools mentioned above, it is possible to automate the HA tests. Here is an overview of the steps to do so:

1. Create a CloudFormation template to provision the infrastructure for the SAP HANA system.
2. Use Step Functions to define the workflow for the HA test.
3. Use Lambda functions to execute the code necessary for the test, such as taking a backup of the primary system and restoring it to the secondary system.
4. Use CloudWatch to monitor the systems and detect any issues.
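The orchestration logic above can be sketched in a few lines of Python, with each step stubbed out where a Lambda function would run. In practice AWS Step Functions would drive this workflow; the step names here are illustrative.

```python
def run_ha_test(steps):
    """Run HA test steps in order; stop at the first failure, the way a
    Step Functions workflow would route to a Fail state.
    `steps` is an ordered mapping of step name -> callable returning bool."""
    results = {}
    for name, step in steps.items():
        ok = step()
        results[name] = ok
        if not ok:
            break  # abort the workflow on failure
    return results

# Stubbed steps standing in for Lambda functions (names are illustrative).
steps = {
    "backup_primary":    lambda: True,
    "stop_primary":      lambda: True,
    "restore_secondary": lambda: True,
    "validate_failover": lambda: True,
}
print(run_ha_test(steps))  # all four steps report True
```

Swapping any stub for a real boto3 call (for example, taking an HANA backup via SSM Run Command) keeps the control flow identical.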

Conclusion

High availability tests are an important part of ensuring that your SAP HANA system is running optimally and can recover from system failures. By automating the tests, you can reduce the amount of time it takes to run them and ensure that they are performed regularly.

At KeyCore, we have extensive experience with AWS and SAP, and can help you automate your HA tests. Our team of experienced consultants will work with you to develop a solution that meets your needs and requirements. Contact us today to learn more.

Read the full blog posts from AWS

Official Machine Learning Blog of Amazon Web Services

Machine Learning With Amazon Web Services

Multi-Object Tracking Solution on Amazon SageMaker

The demand for multi-object tracking (MOT) in video analysis has increased significantly in many industries, such as live sports, manufacturing, and traffic monitoring. In live sports, for instance, MOT can be used to track soccer players in real time to analyze physical performance such as speed and distance covered. Since the introduction of the ByteTrack algorithm in 2021, MOT has become easier to implement, and Amazon SageMaker can be used to build a custom dataset and train a model for tracking multiple objects in video analysis.

Translating Documents in Real Time with Amazon Translate

Connecting with customers on a global scale is becoming increasingly important for businesses. Offering content in multiple languages in real time is a great way to expand reach. However, creating content and localizing it can be a time-consuming and costly process. Amazon Translate allows businesses to quickly and cost-effectively translate documents in real-time. With Amazon Translate, customers can easily connect with a global customer base, freeing up resources for other aspects of their business.

Scaling ML Workloads on Amazon ECS with AWS Trainium Instances

Using containers to run machine learning (ML) workloads is becoming more common. Containers can encapsulate ML training code and the entire dependency stack, right down to the hardware libraries and drivers, for consistent and portable ML development environments. Furthermore, they make scaling clusters much easier. AWS Trainium instances are designed for ML workloads, and when combined with Amazon ECS, provide an environment that is optimally suited to running ML workloads.

Host ML Models on Amazon SageMaker Using Triton: CV Model With PyTorch Backend

PyTorch is a popular ML framework used for applications such as computer vision and natural language processing. Designed around Python, it is easy to use, which has made it an increasingly popular choice. PyTorch supports dynamic computational graphs and can be used to build and train ML models. Amazon SageMaker is an ideal platform for hosting these models: with NVIDIA Triton Inference Server and its PyTorch backend, a computer vision (CV) model can be deployed on SageMaker in a few simple steps.

Configure and Use Defaults With Amazon SageMaker Python SDK

The Amazon SageMaker Python SDK is an open-source library used for training and deploying ML models on Amazon SageMaker. Customers in industries such as healthcare and finance that require tight security need to ensure their data is encrypted and traffic does not traverse the internet. This is where the SageMaker Python SDK comes in. It allows customers to configure and use default settings for SageMaker resources, ensuring their ML workloads are secure.

Accelerating Learning for AWS Certification Exams With Automated Quiz Generation

Getting AWS Certified is a great way to advance your career and showcase your skills. With the help of Amazon SageMaker, preparing for an AWS Certification exam has become much easier. Automated quiz generation using SageMaker foundation models gives you the opportunity to learn, practice, and test yourself before taking the exam.

Amazon SageMaker XGBoost Offers Fully Distributed GPU Training

Amazon SageMaker provides a suite of built-in algorithms, pre-trained models, and pre-built solution templates to help data scientists and ML practitioners quickly get started with training and deploying ML models. Algorithms and models can be used for both supervised and unsupervised learning, and can process various types of input data. With the latest release of the built-in SageMaker XGBoost algorithm, fully distributed GPU training is now available.

Analyzing Amazon SageMaker Spend and Cost Optimization Opportunities

Cost optimization is an important aspect of the AWS Well-Architected Framework, and Amazon SageMaker is a fully managed ML service. With the help of AWS Support Proactive Services, customers can optimize their workloads, set guardrails and improve the visibility of their ML workloads’ cost and usage. This series of posts covers lessons learned about optimizing costs in different SageMaker functions such as training, hosting, notebooks and Studio, as well as Processing and Data Wrangler jobs.

High-Quality Human Feedback for Generative AI Applications With Amazon SageMaker Ground Truth Plus

Amazon SageMaker Ground Truth Plus simplifies the process of creating high-quality training datasets. All the user needs to do is share data and labeling requirements, and SageMaker Ground Truth Plus sets up and manages the data labeling workflow for them. With Ground Truth Plus, customers can access high-quality human feedback for their generative AI applications.

Let KeyCore Help You With Machine Learning On AWS

At KeyCore, we are the leading Danish AWS consultancy. We provide professional services and managed services that enable our customers to take full advantage of the AWS platform. Our expertise in AWS spans a wide range of technologies, including machine learning. We can help our customers optimize their workloads, set guardrails and gain visibility into their ML workloads’ cost and usage. Our team is ready to help you get the most out of AWS and make the most of your machine learning projects. Contact us today to learn more.

Read the full blog posts from AWS

Announcements, Updates, and Launches

AWS Launches New Services to Facilitate Database Migrations and Data Replication

AWS recently launched new services to facilitate data replication and migration. AWS Database Migration Service (AWS DMS), first launched in 2016, offers a simple process for migrating databases. AWS Snow Family devices allow customers to move data to the cloud at a low cost. The new Snowball Edge Storage Optimized devices are designed for large-scale data migration projects, with 210 terabytes of NVMe storage and the ability to transfer up to 1.5 gigabytes of data per second.
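To see why a device-based transfer matters at this scale, here is a back-of-the-envelope comparison of moving 210 TB at the quoted device rate versus over a hypothetical 1 Gbps network link:

```python
DEVICE_CAPACITY_TB = 210        # Snowball Edge Storage Optimized capacity
DEVICE_RATE_BYTES_S = 1.5e9     # bytes/second, quoted device transfer rate
NETWORK_RATE_BITS_S = 1e9       # bits/second, hypothetical 1 Gbps link

total_bytes = DEVICE_CAPACITY_TB * 1e12
device_days = total_bytes / DEVICE_RATE_BYTES_S / 86_400
network_days = total_bytes * 8 / NETWORK_RATE_BITS_S / 86_400

print(f"device:  {device_days:.1f} days")   # ~1.6 days
print(f"network: {network_days:.1f} days")  # ~19.4 days
```

Even ignoring shipping time, filling a device at 1.5 GB/s is roughly an order of magnitude faster than saturating a 1 Gbps link for weeks.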

KeyCore Helps You Make the Most of AWS

At KeyCore, we help you make the most of AWS. With the expansive suite of services AWS has to offer, the possibilities are almost endless. Our professional and managed services can help you take advantage of AWS to ensure you get the most out of your data. With our expertise and knowledge, our team can guide you to make the best decision for your project. We also provide blog entries that can help guide you through the process of selecting and using the appropriate AWS services for your project.

Although AWS is a powerful platform, it can be difficult to navigate. With our help, you can get the most out of your data and migrate it to the cloud with ease. We can provide guidance and support during the entire process. Contact us today to learn more about our professional and managed services and how KeyCore can help you make the most of AWS.

Read the full blog posts from AWS

Containers

Pull Through Cache and Moving to Containerd on Amazon EKS

What is Pull Through Cache?

Container images are used to launch software applications in various environments, and they are typically stored in registries. Pull through cache is a caching feature, now available in Amazon Elastic Container Registry (ECR), that improves the speed and reliability of image pulls. When a cached image is pulled for the first time, ECR fetches it once from the upstream registry and stores it in your private registry; subsequent pulls are served from the cache instead of every node downloading the image from the upstream registry each time.

What is Amazon EKS?

Amazon Elastic Kubernetes Service (EKS) is a managed Kubernetes service that makes it easy for developers and DevOps teams to run and manage containerized applications. As of the Kubernetes version 1.24 release, the dockershim (an API shim between the kubelet and the Docker Engine) is deprecated in favor of supporting Container Runtime Interface (CRI) compatible runtimes. Amazon EKS has also ended support of the dockershim starting with this release.

What is containerd?

containerd is a container runtime that implements the Container Runtime Interface (CRI), the API through which the kubelet communicates with a container runtime. Because the interface is standardized, Kubernetes can support a variety of container runtimes without building a separate integration for each. containerd is also lighter weight than the full Docker Engine, enabling more efficient resource usage.

How can KeyCore help?

At KeyCore, we have years of experience working with Amazon EKS and containerized applications. We can help you move from the deprecated dockershim to containerd, as well as take advantage of the new Pull Through Cache feature in Amazon ECR. We can also help you design, deploy, and manage your applications on Amazon EKS and other containerized environments. Our team of experts can offer you the advice and guidance you need to ensure that your applications are running smoothly and securely. Contact us today to learn more about how we can help you make the most of your containerized applications.

Read the full blog posts from AWS

AWS Quantum Technologies Blog

Cost Control Solution for Amazon Braket

Cost management is essential in order to succeed with your quantum computing projects. In this blog post, we introduce an Amazon Braket cost-control solution that is open-sourced on GitHub and available under an MIT license.

What is Amazon Braket?

Amazon Braket is a fully managed service that enables developers, researchers, and businesses to get started with quantum computing. It helps you explore and build quantum algorithms, test them on simulated quantum computers, and run them on different quantum hardware technologies.

Cost Control Solution

The cost control solution is designed to help developers, researchers, and businesses to better manage their Amazon Braket cost. It consists of two components: an estimator and a controller.

The estimator computes the expected cost of an Amazon Braket job before it runs. It takes into account the quantum computer chosen for the job, the number of shots, and the duration of the job.

The controller monitors the running costs of Amazon Braket jobs. It ensures that the costs do not exceed the budget set for the job. This helps to limit the cost of running quantum algorithms while ensuring that the job runs with the highest possible accuracy.
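As a simplified sketch of how the estimator and the controller fit together: QPU pricing on Braket combines a flat per-task fee with a per-shot fee, so the estimate and the budget check reduce to a few lines. The prices below are placeholders for illustration, not actual list prices.

```python
def estimate_task_cost(shots, per_task_fee, per_shot_fee):
    """Estimator: expected cost of a single Braket quantum task,
    a flat per-task fee plus a fee for every shot."""
    return per_task_fee + shots * per_shot_fee

def within_budget(shots, per_task_fee, per_shot_fee, budget):
    """Controller-style check: refuse to submit a task whose
    estimated cost would exceed the remaining budget."""
    return estimate_task_cost(shots, per_task_fee, per_shot_fee) <= budget

# Placeholder prices for illustration only.
cost = estimate_task_cost(shots=1000, per_task_fee=0.30, per_shot_fee=0.00035)
print(round(cost, 2))                                    # 0.65
print(within_budget(1000, 0.30, 0.00035, budget=0.50))   # False
```

The real solution also tracks the accumulated spend of running jobs, but the budget comparison above is the core of the control loop.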

Open Sourcing the Cost Control Solution

The cost control solution for Amazon Braket is open-sourced on GitHub under an MIT license. This means that it is freely available for developers and businesses to use and improve upon.

How KeyCore Can Help

At KeyCore, we have a team of experienced AWS professionals who can help you with getting started with and managing your Amazon Braket projects. We can provide you with professional services and managed services to ensure that your quantum computing projects run smoothly and cost-efficiently. To learn more about KeyCore and our offerings, please visit our website at https://www.keycore.dk.

Read the full blog posts from AWS

Official Database Blog of Amazon Web Services

Streaming Data with Amazon MSK and Amazon DocumentDB

Modern applications need fast, reliable data delivery, and Amazon Managed Streaming for Apache Kafka (Amazon MSK) is a fully managed, secure service that makes this easy. With Amazon DocumentDB, Amazon MSK Serverless, and Amazon MSK Connect, streaming data between services is simple and reliable.

Cross-Account Amazon Aurora MySQL Migration

Migrating relational databases on Amazon Aurora MySQL-Compatible Edition from one AWS account to another can be complex for large workloads, especially as downtime must be minimized. Amazon Aurora cloning and binlog replication are useful methods for reducing downtime during the migration process.

Migrate SQL Server Databases from Azure to Amazon RDS Custom for SQL Server

This article demonstrates how to migrate from Azure to Amazon Relational Database Service (Amazon RDS) Custom for SQL Server using the native backup and restore method. In addition, it dives deep into the data-tier application backup package file. Amazon RDS Custom for SQL Server is a managed database service for legacy, custom, and third-party applications.

Model Molecular SMILES Data with Amazon Neptune and RDKit

Chemical research can be a complex process, but Amazon Neptune and RDKit together make it possible to model and explore chemical structures at the most fundamental level. This combination is valuable for drug discovery, pharmaceutical research, and chemical engineering, and Neptune's graph model, augmented with machine learning, makes this research more efficient.

Build Hypothetical Indexes in Amazon RDS for PostgreSQL with HypoPG

Indexes in PostgreSQL help optimize the retrieval of information from database tables and allow PostgreSQL to locate and access relevant data more quickly. HypoPG is a Postgres extension that allows users to build hypothetical indexes on specific columns and without impacting the current running database. This helps developers to review the performance impact that a particular index would have before actually creating it.
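A typical HypoPG session creates a hypothetical index, runs EXPLAIN (not EXPLAIN ANALYZE, since the index holds no real data) to see whether the planner would use it, then resets. The SQL below follows HypoPG's documented functions; the table and column names are made up for illustration, and the statements are collected as Python strings so they can be executed through any PostgreSQL client.

```python
# HypoPG workflow as SQL strings; run them via any PostgreSQL client
# against an RDS for PostgreSQL instance with the hypopg extension.
# Table and column names ("orders", "customer_id") are illustrative.
HYPOPG_WORKFLOW = [
    "CREATE EXTENSION IF NOT EXISTS hypopg;",
    # Create a hypothetical index: no disk writes, no lock on the table.
    "SELECT * FROM hypopg_create_index("
    "'CREATE INDEX ON orders (customer_id)');",
    # Plain EXPLAIN consults hypothetical indexes in its cost estimates.
    "EXPLAIN SELECT * FROM orders WHERE customer_id = 42;",
    # Discard all hypothetical indexes when done.
    "SELECT hypopg_reset();",
]

for stmt in HYPOPG_WORKFLOW:
    print(stmt)
```

If the EXPLAIN plan shows an index scan on the hypothetical index, creating the real index is likely worthwhile.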

Amazon Keyspaces (for Apache Cassandra) Support for Cassandra v3.11 End of Life Schedule

Amazon Keyspaces (for Apache Cassandra) is a scalable, highly available, and managed Apache Cassandra-compatible database service. With Amazon Keyspaces, customers can run their Cassandra workloads on AWS using the same Cassandra application code and developer tools they use today. The service also supports the end of life schedule for Cassandra v3.11.

Alternatives to the Oracle Flashback Database Feature in Amazon RDS for Oracle

Customers may choose to host their Oracle database workloads in a managed service such as Amazon RDS for Oracle, but there could be workloads that have dependencies on Oracle features not supported by Amazon RDS. An example of this is the Oracle Flashback Database feature. This article outlines the alternatives to the Oracle Flashback Database feature that are available in Amazon RDS.

Cost-Effective Bulk Processing with Amazon DynamoDB

When performing bulk updates on large Amazon DynamoDB tables, it is important to consider cost. This article outlines three techniques for cost-effective in-place bulk processing with DynamoDB.
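One widely used pattern for in-place bulk processing is to walk the table's keys in small batches with a client-side pause between batches, so consumed write capacity stays predictable. A minimal sketch of that pattern, with a stub standing in for a boto3 DynamoDB table:

```python
import time

def bulk_update(table, keys, update_fn, batch_size=25, delay_s=0.0):
    """Apply update_fn to each item key in batches, sleeping between
    batches to throttle consumed write capacity (rate-limit sketch)."""
    updated = 0
    for i in range(0, len(keys), batch_size):
        for key in keys[i:i + batch_size]:
            update_fn(table, key)
            updated += 1
        if delay_s:
            time.sleep(delay_s)
    return updated

class FakeTable:
    """Stub standing in for a boto3 DynamoDB Table resource."""
    def __init__(self):
        self.writes = []

t = FakeTable()
n = bulk_update(t, keys=[{"pk": i} for i in range(60)],
                update_fn=lambda tbl, k: tbl.writes.append(k))
print(n)              # 60
print(len(t.writes))  # 60
```

With a real table, update_fn would call table.update_item, and batch_size and delay_s would be tuned to the table's provisioned or on-demand capacity.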

Automate the Migration of Microsoft SSIS Packages to AWS Glue with AWS SCT

When migrating Microsoft SQL Server workloads to AWS, customers may want to automate the migration and minimize changes to existing applications. AWS SCT is a useful tool for automating the migration of SQL Server workloads that use SQL Server Integration Services (SSIS) to extract, transform, and load (ETL) data.

How Twilio Modernized its Messaging Postflight Service Data Store with Amazon DynamoDB

Twilio is a customer engagement platform that drives real-time, personalized experiences for leading brands. To modernize its Messaging Postflight service data store, Twilio adopted Amazon DynamoDB due to its scalability and flexibility. With DynamoDB, Twilio was able to quickly build a distributed and fault-tolerant system to meet its customer requirements.

At KeyCore, we are experts in helping our customers migrate to and manage their AWS workloads. We provide both professional services and managed services to ensure that all customers are able to take advantage of the benefits of using Amazon Web Services. With our expertise, we can help customers maximize the performance, scalability, and cost savings of their AWS workloads. Contact us today to learn more.

Read the full blog posts from AWS

AWS Cloud Financial Management

Optimizing x86-Based Amazon EC2 Workloads for Price-to-Performance Balance

At KeyCore, we know that customers want to achieve the perfect balance between spending and performance. This post demonstrates how x86-based Amazon Elastic Compute Cloud (EC2) workloads can be optimized to get the maximum bang for your buck. With no architectural changes required, we can help you improve your price-to-performance ratio without introducing additional engineering overhead or significant time investment.

No Application Engineering Required

These optimizations require no application engineering and can be implemented quickly and easily. This strategy also shows customers the benefit of running their x86 EC2 workloads on AMD-based EC2 instances, netting at least a 10% cost savings.

Reaping the Benefits of Cost Savings

By using this optimization strategy, customers can reap the benefits of cost savings without sacrificing performance. At KeyCore, we have worked with a number of customers to help optimize their x86 EC2 workloads for maximum performance at the best possible cost. We can provide detailed advice and support on how best to implement this optimization strategy.

Maximizing Your Savings with KeyCore

At KeyCore, we offer both professional services and managed services to help our customers. We can provide AWS expertise and advice to help you maximize your savings and ensure your EC2 workloads are optimized for the best possible performance. To find out more about KeyCore and our offerings, visit our website at https://www.keycore.dk.

Read the full blog posts from AWS

AWS Training and Certification Blog

Preparing for the AWS Certified Cloud Practitioner and AWS Certified Solutions Architect – Associate Exams

To succeed in cloud computing, having multiple AWS Certifications is becoming increasingly important. With the right preparation, it’s possible to earn the AWS Certified Cloud Practitioner and AWS Certified Solutions Architect – Associate certification exams within weeks of each other. In this blog, we’ll share 10 steps to effectively prepare for both exams simultaneously.

1. Clarify Your Learning Goals

Before beginning any preparation, take some time to clarify your learning objectives. Defining your goals will help you stay focused and achieve better results.

2. Familiarize Yourself with the Exams

Familiarize yourself with the structure of the exams and the topics that are covered. The AWS website has comprehensive information on the exams and what to expect.

3. Develop a Study Plan

Create a detailed study plan that outlines what you need to do on a daily and weekly basis. Plan for breaks and rest days, and set realistic goals to help you stay motivated.

4. Make Use of the Available Resources

The internet is full of useful resources and study materials. Use the online official exam guides and practice tests to determine your strengths and weaknesses.

5. Use AWS Services to Practice

Practice using AWS services and apply the concepts you are learning. The best way to understand the services is to use them practically.

6. Create a Network

Connect with other AWS professionals and join online communities to discuss different AWS topics. This will help you stay up-to-date with industry trends and connect with like-minded individuals.

7. Develop an Understanding of AWS Services

Develop a deep understanding of the different AWS services and how they interact with each other. This will be beneficial when you are preparing for the exam.

8. Use the AWS Documentation

The AWS Documentation is a great resource for getting detailed information about the different services. Make sure to go through the documentation for each service.

9. Take Practice Exams

Familiarize yourself with the exam format and take as many practice exams as possible. This will help you understand the kind of questions that will be asked in the actual exam.

10. Get Professional Help

If you are struggling with any of the topics, consider getting professional help. KeyCore provides AWS consulting to help companies get the most out of their cloud deployments. Our team of AWS certified professionals can help you prepare for the exams and build your skills in the cloud.

Preparing for both the AWS Certified Cloud Practitioner and AWS Certified Solutions Architect – Associate exams requires dedication and hard work. However, with a clear goal and the right resources, it is definitely achievable. KeyCore can help you get certified faster and build your skills for a successful cloud career.

Read the full blog posts from AWS

Official Big Data Blog of Amazon Web Services

How AWS Services Help Customers With Big Data Analytics Needs

BWH Hotels Scales Enterprise Business Intelligence Adoption While Reducing Costs With Amazon QuickSight

BWH Hotels, a leading global hospitality enterprise comprising three hotel companies, is committed to delivering trusted guest experiences and driving hotel success. To do so, they needed to reduce their data costs and accelerate their enterprise business intelligence (BI) adoption. To meet these needs, BWH turned to Amazon QuickSight.

QuickSight is a fast, cloud-powered business analytics service that makes it easy to build visualizations, perform ad-hoc analysis, and quickly get insights from data. With QuickSight, BWH was able to reduce the time it took users to gain insights from their data from days to minutes. They also reduced their data costs by leveraging Amazon Athena for ad-hoc queries.

With QuickSight, BWH saw a 25% reduction in their data costs and a 5X increase in the speed of their BI adoption. KeyCore helps customers get the most out of QuickSight, with services such as data optimization, performance tuning, and dashboard design.

Migrate From Google BigQuery to Amazon Redshift Using AWS Glue and Custom Auto Loader Framework

Customers are increasingly looking for tools to make it easier to migrate from other data warehouses, such as Google BigQuery, to Amazon Redshift. To meet this need, AWS provides Glue and a Custom Auto Loader Framework.

AWS Glue is a fully managed extract, transform, and load (ETL) service that makes it easy for customers to prepare and load their data for analytics. The Custom Auto Loader Framework is a library that enables customers to use Glue to create an automated pipeline for loading data from a source, such as Google BigQuery, to a destination, such as Redshift.

The Custom Auto Loader Framework also enables customers to migrate data in batches or in real-time. Additionally, it can be used to replicate data across multiple Redshift clusters, as well as back-up and restore data to S3. KeyCore can help customers migrate from Google BigQuery to Amazon Redshift using the Custom Auto Loader Framework.

Real-Time Inference Using Deep Learning Within Amazon Kinesis Data Analytics for Apache Flink

Customers can use Amazon Kinesis Data Analytics for Apache Flink to process streaming data. The Deep Java Library (DJL) is an open-source, high-level, engine-agnostic Java framework for deep learning. With the combination of Amazon Kinesis Data Analytics for Apache Flink and DJL, customers can now build streaming applications that include real-time inference using deep learning.

Using this combination, customers can process streaming data in real-time and use the results of the deep learning inference to trigger events, trigger alerts, and take other actions. KeyCore can help customers leverage the power of DJL in conjunction with Amazon Kinesis Data Analytics for Apache Flink to get the most out of their streaming applications.

Configure Amazon OpenSearch Service for High Availability

Amazon OpenSearch Service is a managed service for OpenSearch, an open-source search and analytics suite, used for cases such as recommendation engines, ecommerce sites, and catalog search. To get the most out of Amazon OpenSearch Service, customers need to configure it for high availability.

To ensure high availability, customers should deploy their domains across multiple Availability Zones (AZs) with zone awareness enabled, use dedicated master nodes to keep the cluster stable during failures, and configure replica shards so that data remains available even if a node or an entire AZ goes down.
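
As a sketch, deploying data nodes across multiple Availability Zones maps to the ClusterConfig structure accepted by the OpenSearch Service API; a boto3 `opensearch` client would pass a dict like this to `update_domain_config`. The instance types and counts here are illustrative:

```python
import json

# Illustrative ClusterConfig for a highly available OpenSearch Service
# domain: three data nodes spread across three AZs, plus dedicated
# master nodes. A boto3 "opensearch" client would pass this dict to
# update_domain_config(DomainName=..., ClusterConfig=cluster_config).
cluster_config = {
    "InstanceType": "r6g.large.search",
    "InstanceCount": 3,
    "ZoneAwarenessEnabled": True,
    "ZoneAwarenessConfig": {"AvailabilityZoneCount": 3},
    "DedicatedMasterEnabled": True,
    "DedicatedMasterType": "m6g.large.search",
    "DedicatedMasterCount": 3,
}
print(json.dumps(cluster_config, indent=2))
```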

KeyCore can help customers configure Amazon OpenSearch Service for high availability, as well as optimize their deployments for cost and performance.

Trakstar Unlocks New Analytical Opportunities for Its HR Customers With Amazon QuickSight

Trakstar, now a part of Mitratech, is an HR software company that serves customers from small businesses and educational institutions to large enterprises, globally. To supercharge employee performance around pivotal moments in talent development, Trakstar needed a fast and cost-effective BI solution. To meet this need, Trakstar turned to Amazon QuickSight.

With QuickSight, Trakstar was able to quickly unlock new analytical opportunities for its customers, such as tracking critical metrics, predicting employee churn, and providing custom analytics solutions. QuickSight also helped Trakstar achieve its goal of ensuring data privacy and security for its customers.

KeyCore can help customers accelerate their BI and analytics adoption with QuickSight, providing services such as data optimization, performance tuning, and dashboard design.

Join a Streaming Data Source With CDC Data For Real-Time Serverless Data Analytics Using AWS Glue, AWS DMS, and Amazon DynamoDB

Data lakes are not transactional by default; however, customers can join a streaming data source with change data capture (CDC) data for real-time serverless data analytics. To do this, customers can use AWS Glue, AWS Database Migration Service (DMS), and Amazon DynamoDB.

AWS Glue enables customers to transform and normalize streaming data, while DMS replicates CDC data from the source database into DynamoDB. An AWS Glue streaming job can then join the streaming source with the CDC data held in DynamoDB and query the combined data in near real time.

KeyCore can help customers leverage the power of AWS Glue, AWS DMS, and Amazon DynamoDB to set up their real-time serverless data analytics solution.

Read the full blog posts from AWS

Networking & Content Delivery

IPv6 Day and Blue/Green Deployment on AWS

The world of cloud computing is ever-evolving, and IPv6 day is a time to celebrate these changes. On June 8th, 2023, AWS is hosting a full day of live-streamed video content to help users learn more about the transition to IPv6. Moving to IPv6 can be daunting, but Amazon’s The Routing Loop Twitch channel offers insights to make it easier.

What is Blue/Green Deployment?

Blue/green deployment is a popular strategy used in software development that is designed to reduce the risks and downtime associated with introducing new code. The method involves running two identical environments in tandem, named “blue” and “green,” and re-directing traffic between them as needed. This allows for uninterrupted delivery of new features, as well as a fast back-out plan if there are any issues.

Achieving Zero-Downtime Deployments on AWS

Amazon CloudFront is well suited to blue/green deployments because of its global content delivery network, with hundreds of points of presence worldwide. It also provides an easy way to switch between different versions of a web application, with seamless failover that avoids downtime. Since CloudFront also supports IPv6, it is a natural companion for teams making the switch.
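
One way to shift traffic between the blue and green versions behind CloudFront is its continuous deployment feature, where a staging (green) distribution receives a configurable share of viewer traffic. A hedged sketch of the policy configuration a boto3 `cloudfront` client would pass to `create_continuous_deployment_policy`; the DNS name and weight are illustrative:

```python
import json

# Route 10% of viewer traffic to the staging ("green") distribution;
# the remaining 90% stays on the production ("blue") one. The staging
# distribution DNS name and the weight below are made up for the sketch.
policy_config = {
    "Enabled": True,
    "StagingDistributionDnsNames": {
        "Quantity": 1,
        "Items": ["d111111abcdef8.cloudfront.net"],
    },
    "TrafficConfig": {
        "Type": "SingleWeight",
        "SingleWeightConfig": {"Weight": 0.10},
    },
}
print(json.dumps(policy_config, indent=2))
```

Raising the weight gradually, then promoting the staging configuration, gives the fast back-out path that blue/green deployment promises.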

Getting Help with IPv6 and Blue/Green Deployment

KeyCore – the leading Danish AWS consultancy – can help make transitioning to IPv6 and deploying blue/green code a breeze. Our experienced team of AWS professionals are knowledgeable in a range of topics, from DevOps, Infrastructure-as-Code, and Cloud Security, to AWS CloudFormation, AWS Lambda, and more. Visit our website to learn more about our services and how we can help you make the most of IPv6 Day.

Read the full blog posts from AWS

AWS for M&E Blog

AWS for M&E Blog: Enabling Charitable and Professional Streaming with Amazon IVS

AWS holds an annual charity football tournament in London, England. This year, AWS leveraged Amazon IVS to stream the event to a worldwide audience. In addition, Amazon IVS advanced channel types are now available, giving customers better control over costs and quality of service.

Control Quality of Service and Costs with Amazon IVS Advanced Channel Types

Amazon IVS’ advanced channel types, Advanced SD and Advanced HD, join the original Basic and Standard channel types. They let organizations balance viewing quality against distribution costs, choosing a cost-effective SD output or a higher-quality HD output, with both designed to handle unpredictable peaks in viewership.

Gain Observability of Live Streaming Workflows with AWS Elemental MediaLive and AWS Elemental MediaPackage

AWS Media Services give users the capabilities to build both live and on-demand video workflows. Consolidating scattered metrics and analyzing the logs produced by AWS Media Services lets you gain observability of your live streaming workflow. This can be done with Amazon CloudWatch metrics, AWS X-Ray, AWS Organizations, and AWS CloudTrail, all of which help you gain visibility into your media workflows.

FOX Ups Resolution, Drops Latency with AWS Powered Streaming

FOX’s cloud-based media production and delivery platform continues to pay off, as evidenced by its production of the 2023 NFL championship game, which reached a peak of 7 million concurrent viewers at high resolution and low latency. This advanced video delivery workflow is now used across FOX’s digital sports coverage, demonstrating the power of leveraging AWS for streaming.

KeyCore and AWS Services

At KeyCore, we provide both professional and managed services to help you leverage the power of AWS. Our highly-skilled AWS professionals are experts in AWS Media Services and can help you configure and deploy a secure and cost-effective video streaming solution. From providing advice and guidance on the best services and configurations for your use case, to helping you manage and maintain your streaming workflows, KeyCore can help you every step of the way.

Read the full blog posts from AWS

Integration & Automation

Using AWS CodePipeline to Automate Your Amazon Machine Image (AMI) Builds

Amazon Machine Images (AMIs) are a great way to standardize the configuration of your Amazon Web Services (AWS) workloads. However, it can be a challenge to keep your AMIs up to date with the latest changes. In this blog post, we discuss one way to automate the process of building your AMIs using AWS CodePipeline.

What is AWS CodePipeline?

AWS CodePipeline is a continuous integration and continuous delivery (CI/CD) service that helps you automate your software release process. It can be used to build, test, and deploy applications and infrastructure changes in an automated fashion. It can also be used to automate the process of creating AMIs.

How Does It Work?

When you set up a CodePipeline for AMI builds, you can define a source control repository, such as a Git repository, to store your AMI configurations. This means that any time you make changes to your AMI configuration, the changes will be tracked in the repository.

Next, You Set up a Build Process in AWS CodePipeline

You can then define a pipeline in CodePipeline to deploy the changes to AWS. This process will include:

  • A build stage, where the code is retrieved from the repository and compiled;
  • A test stage, where your code is tested; and
  • A deploy stage, where the changes are deployed to AWS.

The deploy stage is where you define the process for creating the AMI. You can use AWS CloudFormation or AWS CodeBuild to define the process for creating the AMI.

You can also define Triggers

You can also define triggers in CodePipeline that will automatically start the pipeline when changes are detected in your source repository. This means that any time you make changes to your AMI configuration, the changes will be automatically deployed to AWS. This eliminates the need to manually create and deploy AMIs.
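
Inside the deploy stage, the AMI creation itself can be a single EC2 API call. A hedged sketch of the request a build step might assemble and pass to a boto3 `ec2` client's `create_image`; the instance ID and naming scheme are assumptions for the example:

```python
from datetime import datetime, timezone

def build_create_image_request(instance_id: str, app: str) -> dict:
    """Assemble kwargs for ec2.create_image: a timestamped AMI name
    plus a tag recording which application the image belongs to."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
    return {
        "InstanceId": instance_id,
        "Name": f"{app}-{stamp}",
        "NoReboot": True,
        "TagSpecifications": [{
            "ResourceType": "image",
            "Tags": [{"Key": "Application", "Value": app}],
        }],
    }

req = build_create_image_request("i-0abc123def4567890", "webapp")
print(req["Name"])
# A CodeBuild step in the deploy stage would then call:
#   boto3.client("ec2").create_image(**req)
```

Timestamped names and tags make it easy for later pipeline runs (or cleanup jobs) to find and retire old images.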

Monitoring Your CodePipeline

You can also monitor your CodePipeline using Amazon CloudWatch. This will allow you to see the status of your pipeline, including any errors that occur.

KeyCore Can Help With Automating Your AWS Workloads

At KeyCore, we specialize in helping our clients automate their AWS workloads. Our team of expert AWS consultants can help you set up a continuous integration and continuous delivery pipeline to manage your AMI builds. Contact us today to find out more.

Read the full blog posts from AWS

AWS Storage Blog

Organizations that need to meet certain compliance frameworks, such as FISMA, FedRAMP, PCI DSS, and SOC 2, have specific regulations for validating the security of their systems. To address this, AWS provides the AWS Key Management Service (AWS KMS), which allows organizations to encrypt their data-at-rest. To get the most out of KMS, organizations can take advantage of the S3 Bucket Keys feature to reduce their KMS request costs by up to 99%.

How AWS KMS and S3 Bucket Keys Work Together

KMS is an AWS service that allows customers to generate, rotate, and control the encryption keys used to encrypt their data-at-rest. This helps organizations keep their data secure and compliant with regulatory standards. While KMS is powerful, it does come with a cost. To help customers reduce their KMS costs, AWS offers the S3 Bucket Keys feature. Instead of calling KMS for every object, S3 uses a short-lived bucket-level key derived from the KMS key, which dramatically reduces the number of KMS requests, and therefore the request costs, without weakening encryption.
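
Bucket Keys are enabled per bucket in the default encryption configuration. A sketch of the structure a boto3 `s3` client would pass to `put_bucket_encryption`; the bucket name and key ARN are placeholders:

```python
import json

# Illustrative bucket encryption configuration enabling S3 Bucket Keys:
# objects are still encrypted under the given KMS key, but S3 uses a
# bucket-level key to cut the number of requests made to KMS.
encryption_config = {
    "Rules": [{
        "ApplyServerSideEncryptionByDefault": {
            "SSEAlgorithm": "aws:kms",
            "KMSMasterKeyID": (
                "arn:aws:kms:eu-west-1:123456789012:"
                "key/1234abcd-12ab-34cd-56ef-1234567890ab"
            ),
        },
        "BucketKeyEnabled": True,
    }]
}
print(json.dumps(encryption_config, indent=2))
# Applied with: boto3.client("s3").put_bucket_encryption(
#     Bucket="my-bucket",
#     ServerSideEncryptionConfiguration=encryption_config)
```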

Extending Java Applications to Directly Access Files in Amazon S3

The Java programming language is popular in software development, and many Java applications interact with files. However, most of these applications are written against a file system backed by block storage and cannot directly access objects in Amazon S3. To address this, AWS provides a Java NIO Service Provider Interface (SPI) for Amazon S3 that lets applications access objects in S3 through the standard java.nio.file API, often without changing or recompiling application code.

Best Practices for Monitoring Amazon FSx for Lustre Clients and File Systems

For workloads that require high performance, such as machine learning (ML), high performance computing (HPC), video processing, and financial modelling, Amazon FSx for Lustre provides shared storage with the scalability and performance of the popular Lustre file system. To get the most out of this storage, it is important to monitor the system for performance and usage. AWS provides several best practices for monitoring clients and file systems in Amazon FSx for Lustre, including using Amazon CloudWatch for monitoring performance metrics.
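
A common starting point is querying the file system's read and write throughput from CloudWatch. A sketch of the metric data queries a boto3 `cloudwatch` client would pass to `get_metric_data`; the file system ID is a placeholder:

```python
import json

# Illustrative CloudWatch queries for an FSx for Lustre file system's
# aggregate read/write throughput. DataReadBytes and DataWriteBytes in
# the AWS/FSx namespace are summed per period; the ID below is made up.
def fsx_throughput_queries(file_system_id: str) -> list:
    dims = [{"Name": "FileSystemId", "Value": file_system_id}]
    return [
        {
            "Id": q_id,
            "MetricStat": {
                "Metric": {
                    "Namespace": "AWS/FSx",
                    "MetricName": metric,
                    "Dimensions": dims,
                },
                "Period": 60,
                "Stat": "Sum",
            },
        }
        for q_id, metric in [("reads", "DataReadBytes"),
                             ("writes", "DataWriteBytes")]
    ]

queries = fsx_throughput_queries("fs-0123456789abcdef0")
print(json.dumps(queries, indent=2))
# Fetched with: boto3.client("cloudwatch").get_metric_data(
#     MetricDataQueries=queries, StartTime=..., EndTime=...)
```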

Understanding Direct Network Interfaces on AWS Snow Family

Network functions (NFs), such as firewalls, intrusion detection systems, and malware protection systems, are commonly used in the telecommunications industry. To make it easier to deploy and manage these NFs at the edge, AWS provides the AWS Snow Family of devices. These devices support Direct Network Interfaces (DNIs), which give instances running on the device layer 2 access to the local network, including support for VLAN tags and custom MAC addresses, with no address translation in between. This makes them well suited to running network functions directly in the traffic path, improving security and scalability.

Performance Analysis for Different Amazon EFS Throughput Modes

To help customers determine the right configuration for their file storage needs, Amazon EFS provides throughput modes that govern the performance of a file system. Customers can use Amazon CloudWatch to monitor the throughput their workloads actually consume and pick the most cost-effective mode for their needs.
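
The comparison usually comes down to how much of the permitted throughput a workload actually uses. A back-of-the-envelope check, assuming you have already pulled the metered I/O bytes for a period and the permitted throughput from CloudWatch (the numbers below are illustrative):

```python
# Throughput utilization from CloudWatch data: metered I/O bytes summed
# over a period versus the file system's permitted throughput.

def throughput_utilization(metered_bytes: float,
                           period_seconds: float,
                           permitted_bytes_per_sec: float) -> float:
    """Percentage of permitted throughput the workload actually used."""
    used_bytes_per_sec = metered_bytes / period_seconds
    return 100.0 * used_bytes_per_sec / permitted_bytes_per_sec

# 6 MB of metered I/O in one minute against 1 MiB/s permitted throughput:
pct = throughput_utilization(6_000_000, 60, 1_048_576)
print(f"{pct:.1f}% of permitted throughput")
```

Consistently low utilization suggests the default mode is fine; sustained utilization near 100% is the signal to consider a higher-throughput configuration.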

Seamlessly Map File Shares for Amazon FSx for Windows File Server with AWS Auto Scaling

For organizations that manage a fleet of Windows instances, having a central repository for files that can be accessed from multiple locations is essential. To enable this, AWS provides Amazon FSx for Windows File Server which allows customers to create file systems that are automatically mapped with the Server Message Block (SMB) protocol when users connect to the domain-joined instances. This eliminates the need to manually map file shares to hundreds of instances, saving time and improving efficiency.

How KeyCore Can Help

At KeyCore, we have a team of experienced AWS experts that can help you get the most out of AWS storage solutions, including KMS, S3 Bucket Keys, Amazon FSx for Lustre, the AWS Snow Family of devices, and Amazon EFS. We have helped many organizations reduce their costs and improve their security and scalability with our AWS consulting services. Contact us today to learn more about how we can help you.

Read the full blog posts from AWS

AWS Architecture Blog

Digital Biomarkers and Scope 3 Emissions: How AWS Can Help

Large-scale Digital Biomarker Computation

Digital biomarkers are objective, quantitative measures of physiological and behavioral data collected by digital devices, giving a better representation of an individual’s free-living activity. These devices generate large volumes of data, usually stored in different formats, that require processing. AWS serverless services can be used to compute digital biomarker data more efficiently and cost-effectively.

AWS serverless services allow large-scale digital biomarker computation with no upfront cost: there are no servers to provision or manage, capacity scales with demand, and charges are based on the resources actually used. The services are also built with security and compliance in mind, allowing digital biomarker data to be stored securely and compliantly.

Managing Data Confidentiality for Scope 3 Emissions

Scope 3 emissions are indirect greenhouse gas emissions that come from activities outside those directly controlled or owned by a company. Measuring these emissions requires gathering data from external sources, like suppliers and transportation providers. One of the main challenges with Scope 3 emissions is data confidentiality and security.

AWS Clean Rooms is a serverless service that allows users to securely process confidential data from external sources. The service provides a controlled environment where data is encrypted and securely stored, so users can analyze data from external partners without exposing confidential or sensitive information.

How KeyCore Can Help

KeyCore is the leading Danish AWS consultancy, providing professional and managed services for AWS users. KeyCore’s team of experienced consultants can help customers navigate the complexities of AWS and ensure that their digital biomarker and Scope 3 emissions data is securely stored and processed. Our team can also help customers optimize their serverless architecture to ensure that resources are used efficiently and cost-effectively. We can also provide guidance on utilizing AWS services that are tailored to the individual customer’s needs.

Read the full blog posts from AWS

AWS Partner Network (APN) Blog

Revolutionizing User Experiences Through Chatbots and AWS Textract

Chatbot technology is quickly revolutionizing customer experiences, offering businesses a way to provide immediate and curated responses without the need for additional manpower and resources. Insurance companies, in particular, can benefit from this technology, as it is used to streamline and make the claim process more efficient for customers and processing agencies alike.

Using chatbots together with Amazon Textract, the claim process can be automated and simplified for any type of expense. Textract is a service designed to extract and analyze text and data from documents, images, and PDFs. By leveraging the power of Textract, businesses can make reading the documents essential to claim processing much more efficient.

The Contino Sustainability Dashboard and Optimizing Carbon Footprints

Organizations now understand that their carbon footprint must be taken into account across their products’ supply chain in order to improve energy efficiency. To solve this issue, Contino created the Sustainability Dashboard, an open-source tool that was designed to help customers visualize their carbon footprint within their AWS environment.

The Sustainability Dashboard provides customers with the ability to identify their energy usage and find ways to optimize their resource utilization, resulting in a lower carbon footprint.

The Mendix Low-Code Platform and the Public Sector

Public sector agencies must be able to keep up with the rapid pace of innovation in order to remain competitive. Mendix, a global leader in enterprise low-code, has partnered with the public sector to provide a low-code software development platform that is capable of doing just that.

This platform is equipped with a robust ecosystem, allowing public sector customers to quickly and easily deploy and manage digital government initiatives. By leveraging the power of Mendix, customers are able to reduce the barriers associated with traditional deployment models and innovate faster than ever.

LeapLogic and Automated Cloud Transformation

Impetus Technologies created LeapLogic, a cloud transformation accelerator, to help businesses modernize with ease. Through intelligent pattern-based transformation, LeapLogic is capable of eliminating variables during the process of transforming legacy business code, logic, and workloads to AWS. This allows customers to quickly migrate their legacy data and analytic software to AWS in an automated environment.

Amazon SageMaker Pipelines and Machine Learning

When migrating on-premises MLOps to Amazon SageMaker Pipelines, Mission Cloud was able to effectively build a workflow for model development to production, all while accelerating their customer’s computer vision model production process. SageMaker Pipelines is an effective workflow orchestration tool that provides CI/CD capabilities for ML pipelines.

Flutura’s Cerebra and Asset Optimization

Flutura’s solution, Cerebra, was designed to help energy and process manufacturers predict asset breakdowns before they actually occur. By utilizing a strong data foundation and collecting and processing data from a variety of assets, Cerebra is able to mitigate unplanned downtime, optimize asset performance, and improve asset reliability.

Kiteworks and the Private Content Network

Kiteworks delivers a Private Content Network (PCN) to organizations, allowing them to unify, track, control, and secure the private information exchanged with their trusted partners. Kiteworks makes use of AWS for infrastructure provisioning, data protection, and the automation of its PCN.

Accelerating Amazon QuickSight With Automation

Amazon QuickSight is a powerful analytics business intelligence tool that provides valuable insights. Experts from AWS and Quantiphi, an AWS Premier Tier Services Partner, have discussed an approach to help users leverage QuickSight as an enterprise-wide BI tool. AWS CodePipeline can be used to automate the entire process, from creating data sources, datasets, and analysis to building dashboards.

Accenture Future Talent Platform and Cloud Adoption

Accenture created the Future Talent Platform, a cloud-native SaaS learning platform designed to enable users to succeed in a fast-paced environment. This platform enables general or highly focused training, guiding customers, partners, and employees on how to get the most out of new technologies or programs.

Rocketlane and Customer Onboarding on AWS

Rocketlane built its customer onboarding SaaS platform on AWS. Using the platform’s broad set of tools and services, Rocketlane was able to innovate quickly and go to market via AWS Marketplace and AWS Activate for startups.

KeyCore and AWS Solutions

KeyCore is the leading Danish AWS consultancy, providing professional and managed services backed by deep AWS expertise.

At KeyCore, we understand the power of leveraging the capabilities of the cloud for businesses. We offer a range of services, from helping you select the best cloud solutions for your organization to developing custom solutions tailored to your needs. With our advanced knowledge and experience in the AWS cloud, KeyCore can help you get the most out of your cloud journey.

Read the full blog posts from AWS

AWS Cloud Enterprise Strategy Blog

KeyCore: Supercharge Your Decision Making with AWS Tenets

Ralph Waldo Emerson famously wrote, “Once you make a decision, the universe conspires to make it happen.” Decisions are fundamental to realizing our goals; even in daily life, we make countless decisions about food, clothing, and other mundane tasks.

As decision making plays a pivotal role in the success of any venture, we must ask ourselves: how can we make better decisions? While there is no definitive answer, one approach that has gained a lot of traction is the use of AWS Tenets. In this blog post, we explore how AWS Tenets can be used to supercharge decision making.

What are AWS Tenets?

AWS Tenets are a set of core values and principles that guide how systems are designed, operated, and decided upon on AWS. Tenets are used to help ensure that data is secure, applications are reliable, and that systems are resilient and scalable.

The Tenets are divided into three major categories. The first is Security and Compliance, which includes measures such as authentication and authorization, encryption and identity management. The second is Reliability and Resilience, which focuses on fault-tolerance and disaster recovery. The last is Performance and Efficiency, which covers areas such as scalability and cost optimization.

How Can We Leverage AWS Tenets to Supercharge Decision Making?

When making a decision, it’s vital to consider the risks and benefits of each solution. AWS Tenets can help to make this process easier by providing a set of criteria that can be used to compare different solutions. By looking at the criteria in each Tenet, you can determine the risks and benefits of each solution and make an informed decision.

For example, let’s say you’re trying to decide between two different cloud providers. You can use the Performance and Efficiency Tenet to compare the cost-effectiveness of both solutions. You can use the Security and Compliance Tenet to compare the security measures of each provider. And you can use the Reliability and Resilience Tenet to compare the fault-tolerance and disaster recovery capabilities of each provider.

By applying the criteria laid out in the Tenets, you can make a more informed decision and determine which solution offers the best balance of risks and rewards. This can help to ensure that your decisions are aligned with your business’s goals and that you’re making the right choice for your organization.

KeyCore: Helping You Make the Right Decision with AWS Tenets

At KeyCore, we understand the importance of making the right decision. Our team of experienced AWS consultants can help you assess the risks and benefits of each solution and make an informed decision. We can also help you implement the Tenets in your organization, ensuring that your data is secure and your systems are reliable and resilient.

To learn more about how KeyCore can help you leverage AWS Tenets to supercharge your decision making, contact us today. Our team is here to help you make the right decisions for your business and ensure you’re getting the most out of the AWS Cloud Platform.

Read the full blog posts from AWS

AWS HPC Blog

Streamline Distributed Machine Learning Workflows on AWS Batch with Covalent

Developing distributed machine learning (ML) workloads can be time-consuming and challenging. This is especially true when dealing with multiple steps and resources. To this end, Covalent is an open-source orchestration tool that helps streamline the deployment of distributed workloads on AWS resources like AWS Batch. In this post, we will explain the key concepts of Covalent and demonstrate how to create an ML workflow on AWS Batch in a few simple steps.

What Is Covalent?

Covalent is an open-source orchestration tool designed to help ML engineers build and deploy distributed workloads on AWS. It simplifies the process of creating complex ML workflows on the cloud, allowing users to focus on their data and the quality of their ML models. Covalent blends standard Python with the distributed processing of AWS Batch, allowing users to quickly spin up compute resources for their workflows.

How Does Covalent Work?

Covalent allows ML engineers to create ML workflows using a modular approach. By breaking down the ML workflow into a set of modular tasks, Covalent allows developers to focus on the individual tasks that make up the ML workflow. The individual tasks can be executed in parallel or sequentially depending on the requirements of the workflow. Additionally, Covalent allows users to define rules for task execution, such as retrying a task if it fails, or skipping a task if certain conditions are met. This allows users to efficiently manage the complexity of their ML workflows.
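
The modular-task pattern above can be sketched in plain Python. This stand-in mimics Covalent's retry rule without the library itself (Covalent wraps tasks with @ct.electron and workflows with @ct.lattice; the decorator and tasks below are illustrative):

```python
import functools

def task(retries=0):
    """Wrap a workflow step so it is retried up to `retries` times
    on failure, mirroring a per-task execution rule."""
    def decorate(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(retries + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == retries:
                        raise
        return wrapper
    return decorate

@task()
def load(n):
    """Produce some training data (a stand-in for a real loader)."""
    return list(range(n))

@task(retries=2)
def train(data):
    """A flaky-by-nature step gets a retry rule; here just a mean."""
    return sum(data) / len(data)

def workflow(n):  # in Covalent this composition would be the lattice
    return train(load(n))

print(workflow(5))
```

Because each step is an ordinary function, the orchestrator is free to run independent steps in parallel and apply rules such as retries or skips per task.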

Creating an ML Workflow With Covalent and AWS Batch

Let’s take a look at an example of how to create an ML workflow with Covalent and AWS Batch. To start, you will need to define the steps for your ML workflow. Each step in the workflow will need to be wrapped in a Python function and a list of parameters. You can then define the dependencies between the steps, and how each step should be executed. Once your ML workflow is defined, you can then use Covalent to deploy it to AWS Batch. Covalent will create the necessary AWS Batch job definitions and compute resources to run the ML workflow.

Once your ML workflow is deployed, Covalent will monitor the execution of your ML workflow and provide insights into the progress of the workflow. You can use this information to monitor the progress of your ML workflow and adjust the compute resources as needed. Additionally, you can use Covalent to track the performance of your ML models and view the results of the ML workflow.

The Benefits of Streamlining ML Workflows With Covalent and AWS Batch

The combination of Covalent and AWS Batch allows ML engineers to quickly and efficiently create and deploy distributed ML workflows on the cloud. Covalent provides an intuitive way to develop and deploy ML workflows, while AWS Batch provides the compute resources to run the ML workflow. Additionally, Covalent provides insights into the progress of the ML workflow and the performance of the ML models, allowing ML engineers to quickly identify and address any problems that arise.

How KeyCore Can Help

At KeyCore, we understand the challenges that come with developing ML workflows on the cloud. Our team of AWS experts can help you set up distributed ML workflows with Covalent and AWS Batch quickly and efficiently. We can also provide guidance on how to optimize your ML workflows for performance and scalability. Contact us today to learn more about how our team of experts can help you streamline your ML workflows.

Read the full blog posts from AWS

AWS Cloud Operations & Migrations Blog

Using Amazon CloudWatch for NGINX Log Analysis, Synthetics Canary Management, A/B Testing with AWS IoT Core, Least Privilege Access to Private EC2 Instances, and Technical Diversity in Mergers and Acquisitions

Analyzing NGINX Logs with CloudWatch

Customers build, deploy, and maintain web applications on AWS, often using NGINX as the application server handling large volumes of requests. With Amazon CloudWatch, customers can monitor response times, uptime, and more to ensure performance, and CloudWatch Contributor Insights can be used for general analysis of NGINX logs.
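
A Contributor Insights analysis of NGINX access logs is driven by a JSON rule definition. A hedged sketch ranking client IPs by request count; the log group name and field position are assumptions for the example:

```python
import json

# Illustrative Contributor Insights rule: rank client IPs by request
# count in an NGINX access log group written in Common Log Format.
# The log group name and the CLF field position are made up here.
rule = {
    "Schema": {"Name": "CloudWatchLogRule", "Version": 1},
    "LogGroupNames": ["/nginx/access-logs"],
    "LogFormat": "CLF",
    "Fields": {"1": "ClientIp"},
    "Contribution": {"Keys": ["ClientIp"], "Filters": []},
    "AggregateOn": "Count",
}
print(json.dumps(rule, indent=2))
# Created with: boto3.client("cloudwatch").put_insight_rule(
#     RuleName="Nginx-TopClients", RuleState="ENABLED",
#     RuleDefinition=json.dumps(rule))
```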

Managing CloudWatch Synthetics Canaries at Scale

Amazon CloudWatch Synthetics offers automated monitoring of application endpoints, REST APIs, and website content to discover issues before customers do. As the number of applications and canaries grows, managing them becomes difficult; automating canary management helps keep this complexity under control.
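
At scale, bulk operations usually start with selecting canaries by a naming convention and then acting on each one. A sketch with stand-in canary data (real code would page through `describe_canaries` on the boto3 `synthetics` client):

```python
# Sketch of bulk canary management: filter canaries by a naming
# convention and current state, then act on each. The canary records
# below are stand-ins shaped like describe_canaries() output.

def select_canaries(canaries: list, prefix: str, state: str) -> list:
    """Pick canary names matching a prefix and current run state."""
    return [
        c["Name"] for c in canaries
        if c["Name"].startswith(prefix) and c["Status"]["State"] == state
    ]

canaries = [
    {"Name": "checkout-home", "Status": {"State": "RUNNING"}},
    {"Name": "checkout-cart", "Status": {"State": "RUNNING"}},
    {"Name": "search-api", "Status": {"State": "STOPPED"}},
]
to_stop = select_canaries(canaries, "checkout-", "RUNNING")
print(to_stop)
# For real canaries, call stop_canary(Name=name) on the boto3
# "synthetics" client for each selected name.
```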

A/B Testing and Dark Launches with AWS IoT Core and CloudWatch

In order to create value for customers, companies are rapidly developing IoT applications. Scheduled updates can be pushed to IoT devices, but feature flags also allow dormant code to be activated. A/B testing and dark launches can be performed on these applications using AWS IoT Core and Amazon CloudWatch.

Granting Least Privilege Access to Private EC2 Instances

In certain situations, access to private EC2 instances needs to be granted to external third-parties. AWS Systems Manager Session Manager provides a secure way to do so, without having to open inbound ports and maintain bastion hosts. With the correct combination of AWS services, least privilege access can be provided.
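
The least-privilege part comes down to the IAM policy attached to the third party. A sketch restricting `ssm:StartSession` to a single instance; the account, Region, and instance ID are placeholders:

```python
import json

# Illustrative IAM policy: the third party may start a Session Manager
# session on exactly one instance, and manage only their own sessions.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "ssm:StartSession",
            "Resource": (
                "arn:aws:ec2:eu-west-1:123456789012:"
                "instance/i-0abc123def4567890"
            ),
        },
        {
            "Effect": "Allow",
            "Action": ["ssm:TerminateSession", "ssm:ResumeSession"],
            "Resource": "arn:aws:ssm:*:123456789012:session/${aws:username}-*",
        },
    ],
}
print(json.dumps(policy, indent=2))
```

Because no inbound ports are opened, the policy itself becomes the entire access boundary, which is why scoping the resource ARN tightly matters.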

Managing Technical Diversity and Migration Capability in Mergers and Acquisitions

When merging organizations, technical diversity and migration readiness need to be managed in order to ensure cohesion and success. This blog post discusses considerations when it comes to assessing, mobilizing, and operationalizing different organizations. AWS provides mechanisms to help throughout these phases.

At KeyCore, we use our extensive experience and knowledge of AWS to guide our customers through every stage of their journey. We provide consulting and managed services to create tailor-made solutions for our customers’ needs. Our team can help map out the migrations and experiments that need to take place for a successful merger or acquisition, helping ensure successful integration and continued success. Contact us today to find out more.

Read the full blog posts from AWS

AWS for Industries

Executive Conversations: Building the Brain Knowledge Platform, with Shoaib Mufti, Data and Technology Lead at the Allen Institute for Brain Science

Shoaib Mufti, Head of Data and Technology at the Allen Institute for Brain Science, joins Lisa McFerrin, Worldwide Lead for Research, Discovery, and Translational Medicine at Amazon Web Services (AWS). The conversation centers around how the Allen Institute is using the cloud to build the Brain Knowledge Platform (BKP) for the U.S. National Institutes of Health (NIH). They discuss how data technology and the cloud have enabled the Allen Institute to bring together multiple data sets, such as genomic data, neural data, and imaging data, to create the BKP, which is an easily accessible, interactive platform for neuroscience and AI researchers to develop insights into the brain. Through this platform, the Allen Institute is also making their data sets open to the public.

How businesses can gain ecommerce capabilities to increase sales

The recent pandemic led to a shift in customer behavior, resulting in an increase in ecommerce and digital channels. Small and medium-sized businesses are losing sales as foot traffic in physical retail stores declines. To combat this, businesses need to invest in digital channels and give customers convenient mobile options. Amazon Web Services (AWS) provides businesses with the technology and resources they need to quickly set up their ecommerce capabilities.

Credit Card Payment Processing on AWS

The Financial Services Industry (FSI) is undergoing a significant transformation, with electronic payments playing a key role. AWS provides the tools needed to process payments, including fraud protection measures, customer authentication management, and data security. AWS is assessed as a Payment Card Industry Data Security Standard (PCI DSS) Level 1 service provider and supports major credit card processing solutions. Running payment workloads on AWS can also reduce costs by replacing dedicated on-premises payment infrastructure.

Drive Hotel Mobile Adoption with Conversations by NLX

Today’s fast-paced world requires businesses to keep up with mobile technology, and hotels are no exception: guests expect mobile options. Mobile adoption reduces the workload on front desk and concierge staff while improving the guest experience. Conversations by NLX, a conversational AI service built on AWS, helps hotels create voice and text interfaces so guests can access information and make requests 24/7.

How Polymathian uses Amazon ECS Anywhere to optimize underground mine production in near real time

Operating an underground mine is a complex task, and Amazon Web Services (AWS) provides solutions to help. Polymathian’s Mine Monitor is a cloud-based Mine Production Optimization System (MPOS) that delivers near real-time data and analytics for underground mine operations. Amazon ECS Anywhere lets Polymathian deploy the MPOS at customer sites wherever they are located, maintaining a secure connection between Polymathian’s cloud infrastructure and the customer’s data center while leaving customers in control of their own compute and storage resources.

The Retail Race: A Roadmap for Implementing a Smart Store Strategy

Retailers are in a race to provide an exceptional customer experience, and physical stores must keep up with digital trends. To do this, retailers need to focus on digital, mobile, self-service, and contactless technologies. AWS provides retail solutions, such as Amazon Connect, Amazon Rekognition, and Amazon SageMaker, to help retailers build a smart store strategy. These services can help retailers reduce costs, drive customer engagement, and provide a better customer experience.

Highlights from the AWS Life Sciences Executive Symposium 2023: Unlocking access to and insights from data

At the AWS Life Sciences Executive Symposium, the topic of unlocking access to and insights from data was discussed. 300 life sciences executives from 100 organizations attended the event to discuss how they can drive innovation through data and machine learning (ML). AWS offers a range of services to help life sciences organizations process and analyze data, such as Amazon Athena, Amazon EMR, and Amazon SageMaker.

Highlights from the AWS Life Sciences Executive Symposium 2023: Accelerating Pharma Drug Discovery with Machine Learning

The AWS Life Sciences Executive Symposium also featured a track on accelerating pharma drug discovery with ML. Executives from 100 organizations discussed how they can drive innovation through robust data foundations and machine learning. AWS offers services such as Amazon Comprehend Medical, Amazon Personalize, and Amazon SageMaker that help pharmaceutical companies accelerate drug discovery and streamline development.

Backed by the Cloud, Telcos Are Realizing Serious Gains

Cloud technology is helping telcos in Europe optimize their networks, reduce costs, boost sustainability and resiliency, and power innovative services. AWS provides services such as Amazon Connect, Amazon EC2, and Amazon SageMaker that can help telcos move their systems to the cloud. This can reduce costs, increase scalability, and enable telcos to provide better customer experiences.

Manufacturing Optimization for the Electronics Industry: How to Accelerate Product Development and Drive Engineering Efficiency with Instrumental Inc. on AWS

The development and production of electronic devices is becoming increasingly complex. Instrumental Inc., built on AWS, helps electronics manufacturers optimize their production. This can help manufacturers reduce costs, increase production speed, and provide better customer experiences. AWS offers services that can help, such as Amazon ECS, Amazon Athena, and AWS Lambda.

How KeyCore Can Help

KeyCore can help businesses of all sizes take advantage of the solutions provided by AWS. Our team of experts can provide an end-to-end solution for businesses, from platform setup to ongoing maintenance. We can also help businesses build and implement custom solutions that meet their unique needs. KeyCore’s experience and expertise make us the perfect choice for businesses looking to take their operations to the cloud.

Read the full blog posts from AWS

The latest AWS security, identity, and compliance launches, announcements, and how-to posts.

The Latest AWS Security, Identity, and Compliance Launches

New eBook: 5 Keys to Secure Enterprise Messaging

AWS has launched a new eBook, 5 Keys to Secure Enterprise Messaging, with best practices for addressing the security and compliance risks of messaging apps. More than three billion mobile phone users worldwide access messaging apps, and that number continues to grow. The eBook covers topics such as authentication, encryption, and data loss prevention to help keep messaging secure.

Announcing the AWS Blueprint for Ransomware Defense

AWS has introduced the AWS Blueprint for Ransomware Defense to help both enterprise and public sector organizations protect data from ransomware events. The Blueprint maps AWS services and features to aspects of ransomware defense such as backup and disaster recovery, incident response and forensics, and access control and identity management.

Updated Whitepaper Available: Architecting for PCI DSS Segmentation and Scoping on AWS

AWS has republished its whitepaper Architecting for PCI DSS Segmentation and Scoping on AWS, which provides guidance on defining the scope of Payment Card Industry Data Security Standard (PCI DSS) workloads running in the AWS Cloud. The update incorporates AWS best practices and detailed instructions on applying segmentation to various PCI DSS workloads.

AWS Security Profile: Ritesh Desai, GM, AWS Secrets Manager

In the AWS Security Profile series, AWS thought leaders provide insights on data protection, cloud security, and secrets management. In this post, Ritesh Desai, General Manager of AWS Secrets Manager, shares his thoughts on these topics. He also outlines how AWS services can help organizations detect, mitigate, and protect against data threats.

Get Custom Data into Amazon Security Lake through Ingesting Azure Activity Logs

Amazon Security Lake automatically centralizes security data from cloud and on-premises sources into a purpose-built data lake. This post shows how to configure the Amazon Security Lake solution with cloud activity data from Microsoft Azure Monitor activity logs. It also provides guidance on how to set up the necessary resources to ingest Azure activity logs into Amazon Security Lake.
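As a rough illustration of the kind of mapping involved, the sketch below converts a few fields of an Azure Monitor activity log entry into an OCSF-style record and writes a batch to a custom-source S3 location. The field selection, bucket, and key are hypothetical placeholders, and Security Lake normally expects OCSF data in Parquet; JSON is used here purely for readability.

```python
import json

def azure_activity_to_ocsf(entry):
    """Map a few fields of an Azure Monitor activity log entry onto an
    OCSF-style record. The attribute names here are illustrative only;
    consult the OCSF schema for the full set of required fields."""
    return {
        "time": entry.get("eventTimestamp"),
        "activity_name": entry.get("operationName"),
        "actor": {"user": {"name": entry.get("caller")}},
        "cloud": {"provider": "Azure"},
        "status": entry.get("status"),
    }

def write_to_security_lake(bucket, key, entries):
    """Write mapped records to the custom-source S3 location registered
    with Security Lake. Bucket and key are hypothetical placeholders."""
    import boto3  # deferred so the mapping helper works without the SDK
    body = "\n".join(json.dumps(azure_activity_to_ocsf(e)) for e in entries)
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=body.encode("utf-8"))
```

The full blog post covers the surrounding setup, such as registering the custom source with Security Lake, which this sketch omits.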

Amazon Security Lake is Now Generally Available

Amazon Security Lake, first announced at re:Invent 2022, is now generally available. Security Lake centralizes security data from AWS environments, SaaS providers, on-premises systems, and other cloud sources into a purpose-built data lake stored in your AWS account. With it, customers can monitor and audit their security posture, detect threats, and respond to security incidents.

At KeyCore, we have extensive experience with AWS services and can help you to develop and implement strategies and solutions for secure enterprise messaging and ransomware defense. Our team of AWS certified professionals can provide guidance in architecting for PCI DSS segmentation and scoping, setting up Amazon Security Lake, and more. Contact us today to learn more about how we can help you achieve your security and compliance goals.

Read the full blog posts from AWS

AWS Startups Blog

Transforming Cancer Care with C2i Genomics on AWS

Healthcare and life sciences (HCLS) startups are leveraging the power of technology to make a big impact on human health. C2i Genomics is one of these startups, founded in 2019, and they are paving the way for improved cancer monitoring and detection. Using AI and ML solutions built with AWS, C2i Genomics’ platform is able to analyze sequenced genome data with a simple blood test to detect the tumor burden of cancer patients.

Learning the Keys to Startup Success with Guild CFO Chris Garber

The role of the CFO in a startup is a critical one, as they are responsible for navigating the relationship between technical leaders, CTOs, and engineering teams. To help CFOs better understand and enable this, Chris Garber, CFO of Guild, has provided his perspectives on the role of a startup CFO in the series “The Evolving Role of the Startup CFO”. Chris believes that lifelong learning is the key to success, and encourages CFOs to remain flexible and engaged in order to support and contribute to the progress of the startup.

KeyCore Can Help Your Business Take Advantage of AWS

At KeyCore, the leading Danish AWS consultancy, we specialize in helping businesses take advantage of AWS and the powerful solutions it offers. Whether you are a healthcare and life sciences startup looking to leverage AI and ML solutions or a startup CFO looking for guidance on the evolving role, our team of experienced professionals can provide the professional and managed services you need. Get in touch with us today to learn more about what we can do for your business!

Read the full blog posts from AWS

Front-End Web & Mobile

Access Data with AWS AppSync and Amazon Timestream

AWS AppSync is a fully managed, serverless GraphQL API service that simplifies application development by providing a single endpoint to securely query or update data from multiple databases, microservices, and APIs. Many organizations across different industry verticals, such as health care, manufacturing, energy generation, and transportation, use a stream of data to improve efficiencies of business and create better customer experiences.

Enabling Access to Amazon Timestream

By using AppSync to access Amazon Timestream, organizations can benefit from a scalable, queryable time-series database to make more informed decisions and provide a more modern data experience for their end users. AppSync enables developers to quickly build queries for data in Timestream, as well as perform updates and mutations in the database. AppSync also allows developers to build real-time data applications that can be connected to Amazon Kinesis, AWS Lambda, or other services.

Benefits of Amazon Timestream

Amazon Timestream is a fast, scalable, serverless, and cost-effective time-series database service for IoT, DevOps, and other applications that store and analyze data over time. It is designed to handle trillions of events per day and automatically manage the storage and retrieval of data based on user-defined retention policies. With Timestream, developers can easily query and analyze their data using SQL.

By leveraging AppSync, organizations can use Timestream to quickly store, query, and analyze data from a variety of sources, such as IoT sensors, web applications, or mobile devices. This helps organizations to make more informed decisions and create customer experiences that are tailored to individual customer needs.
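To make the query path concrete, here is a minimal sketch of the kind of SQL a resolver (for example, an AppSync Lambda resolver) might run against Timestream using boto3, with a small helper that flattens the response into row dicts. The database, table, and column names are hypothetical placeholders, not from the original post.

```python
def flatten_rows(result):
    """Convert a Timestream query response into a list of dicts by
    pairing each column name with its scalar value per row."""
    columns = [col["Name"] for col in result["ColumnInfo"]]
    return [
        dict(zip(columns, (datum.get("ScalarValue") for datum in row["Data"])))
        for row in result["Rows"]
    ]

def query_recent_readings(database="iot_db", table="sensor_readings"):
    """Run a time-bounded SQL query against Timestream; the database
    and table names are placeholders for illustration."""
    import boto3  # deferred so flatten_rows works without the SDK
    client = boto3.client("timestream-query")
    result = client.query(
        QueryString=(
            f'SELECT device_id, measure_value::double AS reading, time '
            f'FROM "{database}"."{table}" WHERE time > ago(1h) LIMIT 100'
        )
    )
    return flatten_rows(result)
```

In an AppSync setup, a GraphQL query would invoke logic like this through a data source, and the flattened rows would map directly onto the GraphQL response shape.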

How KeyCore Can Help

KeyCore is a leading Danish AWS consultancy that provides both professional services and managed services. We specialize in helping organizations set up and use AWS AppSync and Amazon Timestream. Our team of experienced AWS professionals can help you get the most out of your data, by integrating AppSync with Timestream and providing guidance on query optimization and other best practices.

Read the full blog posts from AWS

Innovating in the Public Sector

AWS Summit Washington, DC 2023 offers the public sector a chance to come together and explore the potential of cloud computing. Attendees can learn how cloud technology can drive culture change, digital transformation, and infrastructure modernization. To make the most of the event, attendees can familiarize themselves with topics such as Amazon IVS, which can be used to host town hall events, and AWS Disaster Response, which provides solutions for hurricane response efforts.

Amazon IVS for Turnkey Town Halls

Amazon IVS is a great solution for nonprofit organizations that need to provide their members and beneficiaries with information that they can access from anywhere. Live town hall events can help organizations get their message out and engage with their members. This walkthrough shows how Amazon IVS can be used to build a turnkey live streaming platform that integrates into an existing website.
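A sketch of the starting point for such a platform, assuming the boto3 IVS client: create a channel and retrieve the playback URL to embed in the organization's website player. The channel name and parameter choices below are illustrative, not from the walkthrough itself.

```python
def channel_config(name, low_latency=True):
    """Assemble parameters for an IVS channel; 'LOW' latency suits
    interactive town halls, 'NORMAL' trades latency for stability."""
    return {
        "name": name,
        "latencyMode": "LOW" if low_latency else "NORMAL",
        "type": "STANDARD",
    }

def create_town_hall_channel(name):
    """Create the channel and return the playback URL that the
    organization's website player would embed."""
    import boto3  # deferred so channel_config works without the SDK
    response = boto3.client("ivs").create_channel(**channel_config(name))
    return response["channel"]["playbackUrl"]
```

The full walkthrough also covers stream keys for broadcasters and integrating the IVS player into an existing site, which this sketch leaves out.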

Allyson Fryhoff and the Power of Technology for Good

Allyson Fryhoff is the managing director of nonprofit and nonprofit healthcare business at AWS. In this conversation, Allyson discusses her life’s purpose, who inspires her, and how the power of technology can be used for good. She also talks about her favorite Amazon Leadership Principle and how AWS supports public health professionals with cloud training pathways.

Hurricane Season 2023 and AWS Disaster Response

AWS Disaster Response is already working to help organizations and communities respond to hurricanes before the season begins. Throughout the year, AWS Disaster Response develops and tests new innovations that use cloud technology to enable more efficient disaster response capabilities.

Share Our Strength Overcomes Data Management Challenges with AWS

Share Our Strength, a national nonprofit organization dedicated to ending hunger and poverty in the US and abroad, used AWS to overcome data management challenges and improve its strategic planning outcomes. That work supports its campaign to end childhood hunger in America through partnerships with community organizations, funding, technical assistance, and resources.

Innovating to Future-Proof Water Resources

Islands offer a proving ground for how digital technologies can address issues such as water scarcity. This article highlights islands that are applying these technologies to secure a clean and steady supply of water.

Raising the Bar on Accessibility for Open-Source Public Sector Solutions

Performance Dashboard on AWS is an open-source solution designed by AWS experts to help organizations build, deploy, and maintain custom dashboards. To ensure the application is accessible, the Government Transformation Team enlisted the UK’s Digital Accessibility Centre and Level Access to conduct accessibility audits.

KeyCore Can Help

KeyCore is the leading Danish AWS Consultancy, providing both professional services and managed services. Our team of AWS experts can help with any of the topics discussed in this blog post. We can help organizations build, deploy, and maintain custom dashboards, leverage digital technologies to address water scarcity, provide cloud training pathways, and more. Contact us today to learn more.

Read the full blog posts from AWS

The Internet of Things on AWS – Official Blog

FleetWise and Kinesis: Leverage Object Storage and Enriched Data with AWS

Today, AWS IoT FleetWise makes it easier and more cost-effective for automotive customers to create and manage vehicle data pipelines with its support for object storage in Amazon Simple Storage Service (Amazon S3). Customers can choose where vehicle data is persisted in the cloud, for example for further data processing or visualization.

Meanwhile, Amazon Kinesis Data Firehose helps ingest enriched Internet of Things (IoT) data into Amazon S3. Enriched attributes may be absent from the device payload, for example to keep the payload small, and Firehose provides the functionality to supplement the available data on its way to storage.

Advantages of Leveraging Object Storage and Enriched Data with AWS

FleetWise and Kinesis provide a powerful combination of object storage and enriched data usage. This combination can provide many advantages, such as quickly and easily obtaining insights from data streams. By using S3 buckets, customers can store data in a secure, cost-effective way and then access it anytime for further analytics purposes.

In addition, Firehose can stream data into S3 buckets directly from the source, allowing customers to quickly and easily ingest enriched data and analyze it in an organized manner. This is especially helpful when dealing with large amounts of data since Firehose can handle large throughputs without any manual intervention.
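The enrichment step described above can be sketched as follows: merge server-side metadata into a slim device payload, then put the combined record onto a Firehose delivery stream that targets S3. The stream name, payload fields, and metadata fields are hypothetical placeholders.

```python
import json

def enrich_payload(device_payload, metadata):
    """Merge the device payload with server-side metadata the device
    omitted (e.g. to keep its payload small), and serialize one
    newline-delimited JSON record for Firehose."""
    record = {**device_payload, **metadata}
    return (json.dumps(record) + "\n").encode("utf-8")

def send_to_firehose(stream_name, device_payload, metadata):
    """Put one enriched record onto a Firehose delivery stream whose
    destination is an S3 bucket. The stream name is a placeholder."""
    import boto3  # deferred so enrich_payload works without the SDK
    client = boto3.client("firehose")
    return client.put_record(
        DeliveryStreamName=stream_name,
        Record={"Data": enrich_payload(device_payload, metadata)},
    )
```

Newline-delimited JSON keeps the S3 objects easy to query later with tools such as Amazon Athena.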

KeyCore and Leveraging Object Storage and Enriched Data with AWS

At KeyCore, we offer expert services and managed services to help customers get the most out of their AWS solutions. Our team of experts is versed in all aspects of AWS, including FleetWise and Kinesis, and can help customers deploy and manage their data pipelines in the most efficient manner.

Our team can also advise customers on the best ways to leverage object storage and enriched data with AWS. We can help customers optimize their architecture for performance, scalability, and security, as well as ensure that their data pipelines are running smoothly and efficiently.

If you’re looking for a way to leverage object storage and enriched data with AWS, KeyCore can help. Contact us today to learn more about how we can help you get the most out of your AWS solutions.

Read the full blog posts from AWS

AWS Open Source Blog

How Traveloka Uses Backstage as an API Developer Portal for Amazon API Gateway

Traveloka is one of the largest online travel companies in Southeast Asia, and they use open source Backstage as their developer portal for APIs hosted on Amazon API Gateway. This article will explain how Traveloka uses Backstage, what makes Backstage an ideal solution for Amazon API Gateway, and how KeyCore can help to implement a similar solution.

What is Backstage?

Backstage is an open source platform, created by Spotify and now a CNCF project, for building developer portals. It gives developers a unified experience for managing, deploying, and monitoring their software across Amazon Web Services, Microsoft Azure, and other cloud providers. Backstage also provides an API portal where developers can securely store and manage their APIs, including an API explorer for browsing and interacting with them.

How Does Traveloka Use Backstage?

Traveloka uses Backstage to manage their APIs hosted on Amazon API Gateway. They have created a custom portal, which provides an API explorer for their developers to interact with their APIs. The portal also provides a way to store and manage documentation, as well as access control for their APIs.

What Makes Backstage Ideal for Amazon API Gateway?

Backstage’s API portal is well suited to Amazon API Gateway because it offers an easy way to manage and monitor APIs. Developers can configure their APIs, monitor performance in real time, and set up access control so only authorized users can reach them. They can also quickly deploy their APIs to Amazon Web Services and other cloud providers.

How Can KeyCore Help?

KeyCore is a leading Danish AWS consultancy. We provide professional services and managed services to help clients get the most out of their AWS infrastructure. With our expertise in AWS, we can help clients implement a Backstage-based developer portal for their Amazon API Gateway APIs. Our team of experts can help clients configure their APIs, configure access control, deploy their APIs, and monitor their APIs in real-time.

Read the full blog posts from AWS
