Summary of AWS blogs for the week of Monday, October 30, 2023

In the week of Monday, October 30, 2023, AWS published 104 blog posts. Here is an overview of what happened.

Topics Covered

Desktop and Application Streaming

Optimize Remote and Hybrid Work with Amazon AppStream 2.0 and Amazon WorkSpaces Web

To support the modern workforce, organizations need to enable remote and hybrid work for the long term. With Amazon AppStream 2.0 and Amazon WorkSpaces Web, organizations can give end users access to their desktops and applications from anywhere, using any device, while keeping costs low. At re:Invent 2023 in Las Vegas, you can learn more about how to best optimize application streaming costs.

SaaS Security with Amazon WorkSpaces Web

Organizations today must be ready for the flexible, agile work environments that are becoming more popular. Business professionals expect instant productivity, regardless of time or location, and they need best-of-breed applications as well as secure access. Amazon WorkSpaces Web offers organizations the opportunity to extend their on-premises environment and provide secure access to applications and data over the internet.

Automating Management of Amazon WorkSpaces and Amazon AppStream 2.0

Automating key processes is essential for managing Amazon WorkSpaces and Amazon AppStream 2.0 effectively at scale. This builders’ session at re:Invent 2023 in Las Vegas will provide participants with the knowledge and skills to use scripts and other workflows to manage WorkSpaces and AppStream 2.0. You can also learn to overcome the common challenges associated with these solutions.

At KeyCore, we are dedicated to helping organizations in their journey with Amazon AppStream 2.0 and Amazon WorkSpaces Web. With our managed services, you can ensure that your organization can manage and automate Amazon WorkSpaces and AppStream 2.0 effectively. Our professional services give you the guidance and support to make the most of the scalability, security, and productivity that come with these solutions. Contact us to learn more.

Read the full blog posts from AWS

Official Machine Learning Blog of Amazon Web Services

Amazon Web Services and Machine Learning: A Summary

Bundesliga Match Facts Shot Speed – Who fires the hardest shots in the Bundesliga?

Soccer shots that leave spectators in awe are often characterized by incredible power and speed. Amazon Web Services (AWS) has been using its machine learning (ML) technology to measure just how powerful those shots are: AWS analyzed data generated by the Bundesliga and built models to predict shot speed. The model was then applied to data from both the prior and current seasons to track shot speed and other relevant metrics.

Deploy ML models built in Amazon SageMaker Canvas to Amazon SageMaker real-time endpoints

Amazon SageMaker Canvas now supports deploying machine learning (ML) models to real-time inference endpoints. This makes it easier for analysts and citizen data scientists to generate accurate ML predictions and take their models to production, allowing businesses to act on ML-powered insights. Using Amazon SageMaker, KeyCore can help businesses deploy their ML models to production.

Develop generative AI applications to improve teaching and learning experiences

Generative AI models, in particular large language models (LLMs), have been used to improve teaching and learning experiences. They have enabled AI to have a greater impact on education, speeding up the process of integrating AI into the classroom. With the help of Amazon SageMaker JumpStart, KeyCore can help businesses develop generative AI applications and incorporate them into their curriculum.

Dialogue-guided visual language processing with Amazon SageMaker JumpStart

Visual language processing (VLP) is a key part of generative AI, driving advancements in multimodal learning. By using large language models (LLMs), contrastive language-image pre-training (CLIP), and multimodality data, visual language models (VLMs) are able to perform tasks such as image captioning and question answering with greater accuracy. Amazon SageMaker JumpStart can help businesses build VLP models and leverage them for their needs.

How Reveal’s Logikcull used Amazon Comprehend to detect and redact PII from legal documents at scale

Personally identifiable information (PII) is sensitive data present in emails, videos, PDFs, and more. To protect this data, companies need to detect and redact PII from documents. Reveal’s Logikcull used Amazon Comprehend to accomplish this process at scale, automatically detecting and redacting PII from documents. With the help of Amazon Comprehend, KeyCore can help businesses detect and redact PII from their documents.

Schneider Electric leverages Retrieval Augmented LLMs on SageMaker to ensure real-time updates in their ERP systems

ERP systems store and manage data from all aspects of a business, and any change to that data must be reflected in the system. By leveraging Retrieval Augmented LLMs on SageMaker, Schneider Electric ensured that their ERP systems stayed up to date in real time. KeyCore can provide services to help businesses similarly leverage Retrieval Augmented LLMs on SageMaker.

Use AWS PrivateLink to set up private access to Amazon Bedrock

Amazon Bedrock is a fully managed service provided by AWS, providing developers with access to foundation models (FMs) and the tools to customize them for their specific applications. With the help of AWS PrivateLink, developers can set up private access to Amazon Bedrock. KeyCore can help businesses configure AWS PrivateLink to set up private access to Amazon Bedrock.

Deploy and fine-tune foundation models in Amazon SageMaker JumpStart with two lines of code

The simplified Amazon SageMaker JumpStart SDK allows developers to deploy and fine-tune foundation models in just two lines of code. This makes it easier to get started with using foundation models in production. With the help of the simplified Amazon SageMaker JumpStart SDK, KeyCore can help businesses get started with using foundation models and deploying them to production.
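As a rough illustration of the simplified SDK flow, the sketch below shows the two deploy lines inside a helper function. The model ID and instance type are illustrative assumptions, not values from the post; check the JumpStart model catalog for real identifiers, and note that actually deploying requires the `sagemaker` SDK and AWS credentials.

```python
# Hedged sketch of the simplified SageMaker JumpStart SDK flow.
# Model ID and instance type below are assumptions for illustration only.

def deploy_jumpstart_model(model_id: str = "huggingface-llm-falcon-7b-instruct-bf16",
                           instance_type: str = "ml.g5.2xlarge"):
    """Deploy a JumpStart foundation model to a real-time endpoint."""
    from sagemaker.jumpstart.model import JumpStartModel  # lazy import; needs the sagemaker SDK

    model = JumpStartModel(model_id=model_id)              # line 1: pick the model
    predictor = model.deploy(instance_type=instance_type)  # line 2: deploy it
    return predictor

def deploy_args(model_id: str, instance_type: str) -> dict:
    """Pure helper: the arguments the deploy call above would receive."""
    return {"model_id": model_id, "instance_type": instance_type}
```

Fine-tuning follows the same shape with `JumpStartEstimator` in place of `JumpStartModel`.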

Read the full blog posts from AWS

Announcements, Updates, and Launches

Amazon EC2 Capacity Blocks for ML and AWS Weekly Roundup – October 30, 2023

Recent advancements in machine learning (ML) have enabled customers of all sizes and industries to innovate and transform their businesses. However, the demand for GPU capacity to train, fine-tune, and run inference on ML models has outpaced industry-wide supply, making GPUs a scarce resource. Amazon EC2 Capacity Blocks for ML now offers customers the ability to reserve GPU capacity for ML workloads in advance.
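A minimal sketch of how such a reservation might look with boto3 is below. The instance type, count, and duration are illustrative assumptions; the AWS calls are isolated in a function because they require credentials and a live account.

```python
# Hedged sketch: searching for and purchasing an EC2 Capacity Block for ML.
# Instance type, count, and duration are illustrative assumptions.
from datetime import datetime, timedelta, timezone

def offering_search_params(instance_type: str = "p5.48xlarge",
                           instance_count: int = 1,
                           duration_hours: int = 24) -> dict:
    """Parameters for ec2.describe_capacity_block_offerings."""
    start = datetime.now(timezone.utc) + timedelta(days=1)
    return {
        "InstanceType": instance_type,
        "InstanceCount": instance_count,
        "CapacityDurationHours": duration_hours,
        "StartDateRange": start,
        "EndDateRange": start + timedelta(days=14),
    }

def purchase_cheapest_offering(params: dict) -> str:
    """Find and purchase the lowest-priced matching offering (needs AWS credentials)."""
    import boto3  # lazy import so the sketch can be read without boto3 installed
    ec2 = boto3.client("ec2")
    offerings = ec2.describe_capacity_block_offerings(**params)["CapacityBlockOfferings"]
    cheapest = min(offerings, key=lambda o: float(o["UpfrontFee"]))
    purchase = ec2.purchase_capacity_block(
        CapacityBlockOfferingId=cheapest["CapacityBlockOfferingId"],
        InstancePlatform="Linux/UNIX",
    )
    return purchase["CapacityReservation"]["CapacityReservationId"]
```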

In addition, this week’s AWS Weekly Roundup brings news of re:Post Selections, SNS and SQS FIFO improvements, multi-VPC ENI attachments, and more. AWS re:Post provides access to a community of professionals discussing the latest AWS news and product releases. SNS FIFO topics and SQS FIFO queues let customers order and deduplicate message delivery, making it easier to design distributed systems. Multi-VPC ENI attachments enable customers to attach an Elastic Network Interface (ENI) from one VPC to an EC2 instance in another, allowing a single instance to span multiple VPCs.

At KeyCore, we help companies of all sizes take advantage of AWS services and solutions. Our experienced team of AWS-certified professionals is knowledgeable in all aspects of AWS, from design and implementation to cost optimization and automation. We provide professional services, including design and architecture consultation, migration and onboarding, and managed services for operating AWS resources. If you need help taking advantage of Amazon EC2 Capacity Blocks for ML or any of the other new AWS releases, contact KeyCore and let us help.

Read the full blog posts from AWS

Containers

Containers: An Overview of Amazon ECS, EKS and Karpenter

Amazon Elastic Container Service (ECS) is a container orchestration service that simplifies managing application containers on Amazon Web Services (AWS). ECS provides round-the-clock visibility into and management of your containers, with built-in task health checks and automatic replacement of failed tasks.

When used in tandem with Amazon’s Elastic Kubernetes Service (EKS), users can build multi-tenant JupyterHub platforms on AWS for data analytics and machine learning workloads that can scale reliably and securely.

Finally, the recently released Karpenter is an open-source Kubernetes node lifecycle manager that provisions right-sized nodes in response to workload demand, minimizing manual cluster node configuration. Since its initial release, Karpenter has seen tremendous growth, with over 4,900 stars on GitHub and code contributions from over 200 developers.

KeyCore is well-positioned to provide professional and managed services for businesses looking to leverage Amazon ECS, EKS and Karpenter. Our team has a deep understanding of AWS, and our experienced consultants can help you get up and running quickly and efficiently. We can also help you optimize your existing infrastructure to get the most out of these powerful tools. Contact us today to learn more.

Read the full blog posts from AWS

AWS Quantum Technologies Blog

Exploring the Ground State of the Anti-Ferromagnetic Ising Spin-Chain on a 1D Lattice with PennyLane

Quantum simulation of spin systems is one of the major applications of quantum computing. With the PennyLane-Braket SDK plugin, users can study the ground state of the anti-ferromagnetic Ising spin-chain on a one-dimensional (1D) lattice on the Aquila quantum processor, a neutral-atom quantum computer available on-demand via the AWS Cloud.

What Is PennyLane?

PennyLane is an open-source Python library for quantum computing and quantum machine learning. It provides an interface to build, simulate, and optimize quantum circuits, enabling users to explore quantum computing and quantum machine learning. The library is supported by Amazon Braket, an AWS service that makes it easy to get started with quantum computing.

What Is the Ising Spin-Chain?

The Ising spin-chain is a physical system of coupled spins, which can be thought of as particles with two possible orientations. The orientation of the spins can be thought of as representing binary values – “up” or “down” – which can interact and influence one another in a chain. This system has been studied for decades by researchers in physics due to its strong connection to many-body physics and the statistical mechanics of phase transitions.

Using PennyLane to Simulate the Ising Spin-Chain on the Aquila Quantum Computer

Using the PennyLane-Braket SDK, users can simulate the Ising spin-chain on the Aquila quantum computer. This allows them to study the ground state of the anti-ferromagnetic Ising spin-chain on a 1D lattice. The project includes an example notebook that explores the ground state of a spin-chain with different lengths and anti-ferromagnetic interactions. This enables users to explore the properties of the system as well as its various phases.
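As a purely classical sanity check (not the Aquila device or the PennyLane-Braket plugin), the ground state of a small antiferromagnetic Ising chain can be found by exact diagonalization with numpy. This is the kind of small-N reference result one would compare quantum-simulation output against; the chain length and coupling are illustrative.

```python
# Classical reference: exact diagonalization of the 1D antiferromagnetic Ising
# chain H = J * sum_i Z_i Z_{i+1} (open boundary). For J > 0 the Neel states
# |0101...> minimize every bond, giving ground energy -J * (n - 1).
import numpy as np

def ising_ground_energy(n: int, J: float = 1.0) -> float:
    """Ground-state energy of the open-boundary Ising chain on n spins."""
    Z = np.diag([1.0, -1.0])  # Pauli-Z in the computational basis
    I = np.eye(2)
    dim = 2 ** n
    H = np.zeros((dim, dim))
    for i in range(n - 1):
        # Build the n-site operator Z_i Z_{i+1} as a Kronecker product.
        ops = [Z if k in (i, i + 1) else I for k in range(n)]
        term = ops[0]
        for op in ops[1:]:
            term = np.kron(term, op)
        H += J * term
    # eigvalsh returns eigenvalues in ascending order; [0] is the ground energy.
    return float(np.linalg.eigvalsh(H)[0])
```

Comparing this exact value against the ground-state energy estimated on hardware is a common way to validate a small quantum simulation before scaling up.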

How KeyCore Can Help

At KeyCore, we are experts in Amazon Web Services (AWS) and are well-versed in the many services provided by the cloud platform. Our team can help you understand the many services available on AWS and develop custom solutions to fit your specific needs. With our expertise in quantum computing, we can help you explore and understand the possibilities offered by the Aquila quantum processor and the PennyLane-Braket SDK. Contact us today for more information.

Read the full blog posts from AWS

Official Database Blog of Amazon Web Services

Set up Amazon CloudWatch Alarms on Amazon RDS Enhanced Monitoring OS Metrics

Database administrators, application teams, and architects need the right level of visibility into database health indicators to proactively address performance issues before they affect users or cause an outage. Amazon Relational Database Service (Amazon RDS) offers monitoring tools, such as Enhanced Monitoring, to get such insights. Enhanced Monitoring provides users with operating system (OS) metrics for instances running MySQL, MariaDB, and PostgreSQL. With Amazon CloudWatch alarms, you can receive notifications when specified conditions are met, so proactive action can be taken when needed.
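A minimal boto3 sketch of such an alarm is below. Note that Enhanced Monitoring OS metrics land in the RDSOSMetrics CloudWatch Logs group, so alarming on them typically means first surfacing them as custom metrics (e.g. via a metric filter); for brevity this sketch alarms on the standard CPUUtilization metric instead. The instance identifier and SNS topic ARN are illustrative assumptions.

```python
# Hedged sketch: a CloudWatch alarm on an RDS metric via boto3.
# DB instance ID and SNS topic ARN below are illustrative assumptions.

def cpu_alarm_params(db_instance_id: str, topic_arn: str,
                     threshold_pct: float = 80.0) -> dict:
    """Parameters for cloudwatch.put_metric_alarm."""
    return {
        "AlarmName": f"{db_instance_id}-high-cpu",
        "Namespace": "AWS/RDS",
        "MetricName": "CPUUtilization",
        "Dimensions": [{"Name": "DBInstanceIdentifier", "Value": db_instance_id}],
        "Statistic": "Average",
        "Period": 300,                 # 5-minute periods
        "EvaluationPeriods": 3,        # breach for 15 minutes before alarming
        "Threshold": threshold_pct,
        "ComparisonOperator": "GreaterThanThreshold",
        "AlarmActions": [topic_arn],   # notify an SNS topic
    }

def create_alarm(params: dict) -> None:
    import boto3  # lazy import; requires AWS credentials to actually run
    boto3.client("cloudwatch").put_metric_alarm(**params)
```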

Implementing Run-Time Write Synchronization in Amazon Aurora Global Database

Amazon Aurora is a MySQL- and PostgreSQL-compatible relational database built for the cloud. Aurora’s Global Database feature allows a single Amazon Aurora database to span multiple AWS Regions, providing low-latency global reads with data replicated across Regions. This blog post shows how to implement run-time write synchronization for a Global Database, as well as how to monitor replication lag.

Migrate from Oracle to Amazon RDS for MySQL, MariaDB or Amazon Aurora MySQL Using Oracle GoldenGate

Modernizing an Oracle Database to an open-source database on AWS involves various factors, such as database conversion or refactoring efforts, enterprise-wide decisions, and target database selection. AWS provides several options for target databases, such as Amazon Relational Database Service (Amazon RDS) for MySQL, MariaDB, and PostgreSQL, or Amazon Aurora. Oracle GoldenGate is a tool that can be used to migrate from Oracle to any of these AWS databases. This blog post will explain how to use Oracle GoldenGate to migrate from Oracle to Amazon RDS for MySQL, MariaDB, or Amazon Aurora MySQL.

Build a Generative AI-Powered Agent Assistance Application Using Amazon Aurora and Amazon SageMaker JumpStart

Generative AI is a form of artificial intelligence designed to generate content, such as text, images, video, and music. With the Amazon Aurora and Amazon SageMaker JumpStart, businesses can harness the power of generative AI to remain competitive. Foundation models, a form of generative AI, generate output from one or more inputs (prompts). This blog post will explain how to build a generative AI-powered agent assistance application using Amazon Aurora and Amazon SageMaker JumpStart.

Mask PII Data Using AWS DMS and Amazon Macie During Migration

Migrating sensitive data from an Amazon Relational Database Service (Amazon RDS) for Oracle production source database to an RDS for Oracle development target database requires identifying and masking PII data. This blog post will show you how to identify PII data using Amazon Macie, mask it using AWS Database Migration Service (AWS DMS), and migrate it before releasing the environment to users.

Amazon Timestream for Amazon Connect Real-Time Monitoring

Amazon Connect is an easy-to-use cloud contact center solution that helps companies deliver superior customer service at a lower cost. Connect has many real-time monitoring capabilities. Amazon Timestream can be used to create a custom solution that meets your real-time monitoring requirements. This blog post will explain how to use Amazon Timestream for Amazon Connect real-time monitoring.

Powering Amazon RDS with AWS Graviton3: Benchmarks

In April 2023, AWS announced the ability to power Amazon Relational Database Service (Amazon RDS) instances with the AWS Graviton3 processor. This blog post will explain how to power Amazon RDS with the AWS Graviton3 processor and provide benchmarks.

Use Amazon DynamoDB Incremental Export to Update Apache Iceberg Tables

Amazon DynamoDB is a fully managed, serverless, key-value NoSQL database designed to run high-performance applications. DynamoDB recently launched a new feature: Incremental export to Amazon Simple Storage Service (Amazon S3). This blog post will explain how to use incremental exports to update your downstream systems regularly using only the changed data.
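A sketch of requesting such an incremental export with boto3 follows. The table ARN, bucket name, and export window are illustrative assumptions; the table must have point-in-time recovery enabled for exports to work.

```python
# Hedged sketch: a DynamoDB incremental export to S3 via boto3.
# Table ARN, bucket, and time window below are illustrative assumptions.
from datetime import datetime, timedelta, timezone

def incremental_export_params(table_arn: str, bucket: str,
                              window_hours: int = 24) -> dict:
    """Parameters for dynamodb.export_table_to_point_in_time."""
    now = datetime.now(timezone.utc)
    return {
        "TableArn": table_arn,
        "S3Bucket": bucket,
        "ExportFormat": "DYNAMODB_JSON",
        "ExportType": "INCREMENTAL_EXPORT",
        "IncrementalExportSpecification": {
            # Export only the items that changed inside this window.
            "ExportFromTime": now - timedelta(hours=window_hours),
            "ExportToTime": now,
            "ExportViewType": "NEW_AND_OLD_IMAGES",
        },
    }

def start_export(params: dict) -> str:
    import boto3  # lazy import; needs AWS credentials
    resp = boto3.client("dynamodb").export_table_to_point_in_time(**params)
    return resp["ExportDescription"]["ExportArn"]
```

Downstream jobs can then merge each exported change set into the Iceberg table instead of re-ingesting a full snapshot.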

Techniques to Improve the State-of-the-Art in Cloud FinOps Using Amazon Neptune

Cloud computing has changed almost every business and industry. With the Cloud, businesses no longer need to plan for and procure servers and other IT infrastructure weeks or months in advance. This blog post will explain how to use Amazon Neptune to employ techniques to improve the state-of-the-art in Cloud FinOps.

How Power Utilities Analyze and Detect Harmonics Issues Using Power Quality and Customer Usage Data with Amazon Timestream: Part 2

This blog post series will show you how to use an Amazon Timestream database and its built-in time series functionalities to interpolate data and calculate the correlation between customer energy usage and power quality issues. This blog post will explain how to build a power quality analysis Proof of Concept (PoC) using Amazon Timestream.

Impactful Features in PostgreSQL 15

PostgreSQL is one of the most popular open-source relational database systems. This blog post will explain the impactful features in PostgreSQL 15.

Detect and Fix Low Cardinality Indexes in Amazon DocumentDB

Amazon DocumentDB (with MongoDB compatibility) is a fully managed native JSON document database. It is important to create indexes to improve query performance. This blog post will explain how to detect and fix low cardinality indexes in Amazon DocumentDB.

KeyCore’s Expertise

At KeyCore, we have the expertise to help you set up, manage, and monitor your databases in AWS. From setting up Amazon CloudWatch alarms to detect database health indicators to helping you migrate from Oracle to Amazon RDS, we have you covered. We can also help you build a generative AI-powered agent assistance application using Amazon Aurora and Amazon SageMaker JumpStart and detect and fix low cardinality indexes in Amazon DocumentDB. Contact us today to learn more about how we can help.

Read the full blog posts from AWS

AWS for Games Blog

Leveraging Pixel Streaming in AWS

Pixel streaming is a method for encoding and streaming graphical content from the cloud for interactive applications. With the help of AWS, developers can take advantage of the latest computing technology with an on-demand cost structure and a robust hosting environment.

Benefits of Using AWS for Pixel Streaming

Using AWS for pixel streaming has many benefits for developers. AWS offers a range of compute options, including Intel, AMD, and ARM-based processors as well as NVIDIA GPUs, with a cost structure that requires minimal upfront investment. This allows developers to access computing power when they need it at minimal cost.

Additionally, AWS provides a secure development and hosting environment for developers. This secure environment provides protection from threats such as malicious code, cyberattacks, and data loss. Furthermore, AWS offers a wide range of services and tools to help developers build and deploy their applications with ease.

Pixel Streaming Considerations

When using AWS for pixel streaming, there are a few considerations to take into account. The first is network latency. It is important to choose an appropriate network for streaming content, as the latency of the connection can affect the quality of the streaming experience. Additionally, it is important to ensure the bandwidth of the connection is sufficient for streaming content.

It is also important to ensure the encoding settings for streaming content are optimized for the target platform. Encoding settings can have a major impact on the quality of the streaming experience. The developer should also consider the type of streaming protocol to use, as different protocols can have different tradeoffs in terms of latency, bandwidth, and quality.

KeyCore’s Professional Services

At KeyCore, we provide a range of professional services to help developers optimize their pixel streaming applications on AWS. Our experienced professionals can help with network selection, encoding settings optimization, protocol selection, and more. We can also provide custom solutions for any project, as well as provide ongoing support for existing applications.

To learn more about our professional services, please visit our website at https://www.keycore.dk. Our experienced professionals are here to help you take advantage of the latest computing technology with the help of AWS.

Read the full blog posts from AWS

AWS Training and Certification Blog

Get Ready for the AWS Certified Data Engineer – Associate beta exam!

AWS recently announced the release of the AWS Certified Data Engineer – Associate beta exam. This exam is available to take between November 27, 2023 and January 12, 2024, at a discounted rate of USD 75.

Become an AWS Certified Data Engineer – Associate

A professional who holds the AWS Certified Data Engineer – Associate certification has the knowledge and skills to design and build data engineering solutions on the AWS platform. This certification helps prove that the individual is knowledgeable about cloud data engineering principles, and can use them to build and maintain cloud data engineering solutions.

Retiring AWS Certified Data Analytics – Specialty Exam

Along with the launch of the Associate-level certification, AWS will retire the AWS Certified Data Analytics – Specialty exam on April 9, 2024. If you want to earn that certification, you must take the exam before then.

Get Certified with KeyCore

At KeyCore, we offer both professional services and managed services for AWS certification. We have a team of AWS experts who are highly experienced in the AWS platform and can help you achieve your certification goals. With our knowledgeable and experienced team, you can be sure that you have the best chance of passing your certification exam and becoming an AWS Certified Data Engineer – Associate.

Read the full blog posts from AWS

Microsoft Workloads on AWS

Configuring Amazon Time Sync for Microsoft Active Directory

Configuring Microsoft Active Directory (AD) to use the Amazon Time Sync Service for time synchronization can be accomplished with the help of Group Policy Objects (GPOs). To ensure the time synchronization health of the domain is always up to date, users can utilize Amazon CloudWatch and Amazon Simple Notification Service.

By configuring Amazon Time Sync, users can keep their Amazon EC2 instances and on-premises AD Domain Controllers synchronized to the same reliable time source. AD Domain Controllers use the Windows Time service (W32Time) to synchronize time, and it can be configured to use the Amazon Time Sync Service as its time source.

Optimizing Protocol Selection for Microsoft SQL Server and Amazon FSx for NetApp ONTAP

Amazon FSx for NetApp ONTAP (FSx for ONTAP) offers two storage access protocols, iSCSI and SMB, for users to choose from. It’s important to understand the advantages and disadvantages of both protocols when using Microsoft SQL Server on Windows.

The iSCSI protocol can offer high performance for applications that require high throughput and low latency. This is the best choice for Microsoft SQL Server. On the other hand, the SMB protocol provides more flexibility when configuring network security settings, with many supported options such as Kerberos, NTLM, and SMB signing.

However, most of the limitations of the SMB protocol come down to performance. SMB is not recommended for applications that require high throughput and low latency.

KeyCore’s Assistance with Microsoft Workloads on AWS

KeyCore can help customers get the most out of their Microsoft workloads on AWS. Our experienced engineers have the expertise to provide both professional services and managed services to ensure that customers are able to configure their Amazon Time Sync Service, monitor and alert on the time synchronization health of the domain with Amazon CloudWatch and Amazon Simple Notification Service, and select the optimal storage access protocol for their specific scenario. Contact KeyCore today to learn more about how we can help you with Microsoft workloads on AWS.

Read the full blog posts from AWS

Official Big Data Blog of Amazon Web Services

Overview of AWS Big Data Solutions for Data Pipelines and Model Versioning

As companies grow, they require better visibility into their data operations and analytics to make informed decisions and stay competitive. Amazon Web Services (AWS) provides a range of solutions to help businesses improve their data pipelines, monitor their usage, and implement model versioning to gain powerful insights into their data.

AWS Glue ETL Job Monitoring and Alarm Setting

AWS Glue is a fully managed ETL (extract, transform, and load) service that makes it easy to prepare and load data for analytics. To help customers monitor their AWS Glue usage and gain visibility into operational efficiency, AWS provides a way to deploy Amazon QuickSight dashboards to set alarms. This helps customers track and analyze Glue job metrics in real-time and make informed decisions.
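The kind of metric query such a dashboard would be built on can be sketched with boto3 as below. The job name is an illustrative assumption; `glue.driver.aggregate.numFailedTasks` is one of the standard Glue job metrics in the `Glue` CloudWatch namespace.

```python
# Hedged sketch: pulling an AWS Glue job metric from CloudWatch via boto3.
# The job name below is an illustrative assumption.
from datetime import datetime, timedelta, timezone

def glue_failed_tasks_query(job_name: str, hours: int = 24) -> dict:
    """Parameters for cloudwatch.get_metric_statistics on a Glue job metric."""
    now = datetime.now(timezone.utc)
    return {
        "Namespace": "Glue",
        "MetricName": "glue.driver.aggregate.numFailedTasks",
        "Dimensions": [
            {"Name": "JobName", "Value": job_name},
            {"Name": "JobRunId", "Value": "ALL"},   # aggregate across runs
            {"Name": "Type", "Value": "count"},
        ],
        "StartTime": now - timedelta(hours=hours),
        "EndTime": now,
        "Period": 3600,          # hourly buckets
        "Statistics": ["Sum"],
    }

def fetch_datapoints(params: dict) -> list:
    import boto3  # lazy import; needs AWS credentials
    return boto3.client("cloudwatch").get_metric_statistics(**params)["Datapoints"]
```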

AWS Graviton2 on Amazon EMR Serverless

GoDaddy, an online platform that helps entrepreneurs succeed, leveraged AWS Graviton2 on Amazon EMR Serverless to get up to 24% better price-performance for their Spark workloads. With Amazon EMR Serverless, users pay only for the resources they consume, and capacity scales automatically with no clusters to provision or manage.

Amazon Redshift ML Model Versioning

Using Amazon Redshift ML, users can train machine learning models using SQL. Redshift ML also provides features like automatic model training and model versioning, which allows users to create multiple versions of their model and rollback to previous versions. This is especially useful in scenarios where users want to experiment and compare different models without compromising the original model.

Multi-AZ Deployments for Amazon Redshift Data Warehouse

Multi-AZ deployments for Amazon Redshift are now generally available. This feature provides data redundancy and improves availability for mission-critical workloads. In the event of a failure, Amazon Redshift automatically fails over to compute resources in the standby Availability Zone, ensuring that the data warehouse continues to function and remains available to users.

Snowflake with Amazon MWAA to Orchestrate Data Pipelines

Using Snowflake with Amazon Managed Workflows for Apache Airflow (MWAA), users can orchestrate data pipelines across different applications and services. This is especially useful for companies that are dealing with data from different sources. Snowflake with MWAA also allows users to set up data pipelines that require complex logic.

Spark on AWS Lambda

Spark on AWS Lambda (SoAL) is a framework that runs Apache Spark workloads on AWS Lambda. It is designed to handle data payloads of all sizes, from 10 KB to 400 MB, and is suitable for both batch and event-based workloads. Using SoAL, users can set up their own Spark workloads on AWS Lambda and benefit from the scalability and cost-efficiency of serverless computing.

KeyCore – Helping Customers Take Advantage of AWS Big Data Solutions

At KeyCore, our mission is to help customers get the most out of their data. Our team of AWS experts can help you utilize the advanced big data solutions provided by AWS, such as AWS Glue, Redshift ML, and Spark on AWS Lambda, to build robust, secure, and cost-effective data pipelines. We also provide managed services to ensure your data pipelines are running optimally.

Read the full blog posts from AWS

Networking & Content Delivery

Networking & Content Delivery with AWS Global Accelerator, AWS Gateway Load Balancer, and AWS Transit Gateway

IPv6 Support for Network Load Balancer (NLB) Endpoints

AWS Global Accelerator now offers support for routing IPv6 traffic directly to dual-stack Network Load Balancer (NLB) endpoints. This feature enables customers to use dual-stack NLB endpoints behind dual-stack accelerators and achieve end-to-end IPv6 connectivity. Setup instructions are provided, along with considerations customers should keep in mind when using this feature.

Cross-Account Support

AWS Global Accelerator now offers cross-account support, allowing customers to use a single accelerator to route traffic across multiple AWS accounts. This feature provides customers with the flexibility to select application endpoints located in different AWS accounts and route traffic to them through a single accelerator. Benefits of this feature are discussed in detail.

Centralized Internet Ingress with Experian

Experian, a global technology company offering credit risk, fraud, targeted marketing, and automated decisioning solutions, has used AWS Gateway Load Balancer and AWS Transit Gateway to establish centralized Internet ingress. This solution provides Experian with the ability to control ingress and egress traffic in a secure, cost-effective, and scalable manner.

QoS Observability for OTT Streaming using Amazon CloudFront

This blog discusses Quality of Service (QoS) in the OTT world and why it is so important. With the widespread availability of high-speed internet and an expanding range of streaming devices, maintaining a good QoS for OTT content is a challenge for content providers. Amazon CloudFront can help content providers meet these challenges.

Hybrid Cloud Architectures using AWS Direct Connect

AWS Direct Connect has increased several quota limits to enable customers to create hybrid cloud architectures. These changes allow customers to create up to four transit virtual interfaces (VIFs) per AWS Direct Connect dedicated connection and increase the maximum number of prefixes to 200 for each VIF.

KeyCore’s Offering

At KeyCore, we offer both professional services and managed services to help customers with networking and content delivery on AWS. Our team of experienced engineers and AWS-certified professionals can assist you in setting up and managing AWS Global Accelerator, AWS Gateway Load Balancer, and AWS Transit Gateway, as well as other AWS services. Contact us today to learn more.

Read the full blog posts from AWS

AWS Compute Blog

Leveraging Webhooks on AWS To Innovate

Webhooks are a popular way for applications to communicate, and for businesses to collaborate and integrate with customers and partners. On AWS, event-driven services such as Amazon S3 Event Notifications and Amazon EventBridge make it straightforward to build workloads that send and receive webhooks.

Orchestrating Dependent File Uploads with AWS Step Functions

Amazon S3 is an object storage service that many customers use for file storage. With Amazon S3 Event Notifications or Amazon EventBridge, customers can build workloads with event-driven architecture (EDA) that respond to events in real time and take action on data. EDA is the foundation for leveraging webhooks on AWS, as it enables applications that both send and receive webhooks.

When receiving webhooks, customers can use AWS Step Functions to orchestrate dependent uploads. Step Functions is a serverless orchestration service that enables customers to build complex state machines to coordinate multiple steps in a single workflow. This allows customers to build applications that take action on the webhook data received, such as triggering an Amazon S3 file upload, and to handle errors and retries of failed tasks.
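A minimal Amazon States Language definition for that pattern might look like the sketch below. The state names and Lambda ARNs are illustrative assumptions; the definition is serialized to JSON, which is the form `create_state_machine` expects.

```python
# Hedged sketch: a minimal ASL state machine that validates a webhook, then
# runs a dependent upload step with retries. State names and Lambda ARNs
# below are illustrative assumptions.
import json

definition = {
    "Comment": "Process a webhook, then run a dependent upload step",
    "StartAt": "ValidateWebhook",
    "States": {
        "ValidateWebhook": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:eu-west-1:123456789012:function:validate",
            "Next": "UploadToS3",
        },
        "UploadToS3": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:eu-west-1:123456789012:function:upload",
            "Retry": [{
                # Retry failed uploads with exponential backoff.
                "ErrorEquals": ["States.TaskFailed"],
                "IntervalSeconds": 5,
                "MaxAttempts": 3,
                "BackoffRate": 2.0,
            }],
            "End": True,
        },
    },
}

# Passed as a JSON string to stepfunctions.create_state_machine(definition=...).
definition_json = json.dumps(definition)
```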

For sending webhooks, customers can use Amazon EventBridge. EventBridge is a serverless event bus that lets customers connect applications to send and receive events, making it easy to build webhooks on AWS and integrate applications quickly and securely.
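Emitting a webhook-style event to an EventBridge bus can be sketched as below. The bus name, event source, and payload fields are illustrative assumptions; a rule on the bus would then route matching events to an API destination or other target.

```python
# Hedged sketch: publishing a webhook-style event to EventBridge via boto3.
# Bus name, source, and payload below are illustrative assumptions.
import json

def webhook_event_entry(bus: str, order_id: str) -> dict:
    """One entry for events.put_events(Entries=[...])."""
    return {
        "EventBusName": bus,
        "Source": "com.example.webhooks",        # assumed event source
        "DetailType": "order.created",           # assumed event type
        "Detail": json.dumps({"orderId": order_id}),
    }

def send(entry: dict) -> None:
    import boto3  # lazy import; needs AWS credentials
    boto3.client("events").put_events(Entries=[entry])
```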

The Benefits of Leveraging Webhooks on AWS

Using webhooks on AWS can provide businesses with many advantages. By leveraging webhooks on AWS, businesses can easily integrate their applications with external services, allowing them to quickly build and deploy applications that are responsive to external events. This can help businesses to innovate faster and to better respond to customer needs.

In addition, using webhooks on AWS can also help businesses to save time and money. AWS services are designed to be highly scalable and cost-effective, so businesses can quickly build and deploy applications without having to worry about infrastructure and costs. By leveraging webhooks on AWS, businesses can focus on their core applications and take advantage of the power of the cloud without having to worry about scaling or maintaining infrastructure.

KeyCore Can Help

At KeyCore, we are experts in leveraging webhooks on AWS. Our experienced team of AWS consultants can help customers to architect and build their applications that send and receive webhooks. We can also help customers to get the most out of AWS by providing advice and guidance on best practices for leveraging webhooks on AWS.

Read the full blog posts from AWS

AWS for M&E Blog

Using Autodesk VRED with AWS Thinkbox Deadline 10.3 and optimizing Ticketmaster services with AWS

Autodesk VRED is a powerful tool that offers high-quality rendering and streaming of complex digital assets. With the release of AWS Thinkbox Deadline 10.3, AWS worked with Nissan Design Europe and Rivian to improve Deadline's support for Autodesk VRED. These updates allow Deadline to offer a wide range of rendering capabilities, including support for render job submission with Autodesk VRED, support for multiple render nodes, and more.

Unified, Serverless Data Stream

For avid fans of sports teams, celebrities, and performers, the journey often starts with an online ticket purchase. Ticketmaster, a global ticket marketplace, offers a personalized experience to connect fans with the events and services they love. Recently, Ticketmaster has leveraged AWS technologies to enhance its services with a unified, serverless data stream. This includes using AWS Lambda, Amazon API Gateway, and Amazon DynamoDB to process ticket orders and deliver personalized offers to customers in real time.
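As an illustration of that serverless pattern (not Ticketmaster's actual schema), the sketch below maps an incoming ticket order to a DynamoDB item with a deterministic idempotency key, so a retried API Gateway request does not create a duplicate order. All table and attribute names are assumptions for the example.

```python
import hashlib
import json
from decimal import Decimal


def build_order_item(order):
    """Map an incoming ticket order to a DynamoDB item."""
    # Deterministic idempotency key: the same order payload always
    # yields the same sort key, so retries collapse into one row.
    payload = json.dumps(order, sort_keys=True).encode()
    dedup_id = hashlib.sha256(payload).hexdigest()[:16]
    return {
        "pk": f"CUSTOMER#{order['customer_id']}",
        "sk": f"ORDER#{dedup_id}",
        "event_id": order["event_id"],
        "quantity": order["quantity"],
        "total": Decimal(str(order["total"])),  # DynamoDB numbers are Decimal
    }
# A Lambda handler behind API Gateway would then call (boto3):
#   table.put_item(Item=item,
#                  ConditionExpression="attribute_not_exists(sk)")
```

The conditional put makes the write idempotent: a second attempt with the same payload fails the condition instead of overwriting the order.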

How KeyCore Can Help

At KeyCore, we have deep AWS expertise and provide both professional and managed services. We use the latest AWS technologies to streamline businesses, ranging from media and entertainment to ticketing and retail. Our team of experts can help you implement Autodesk VRED with AWS Thinkbox Deadline 10.3 and help you optimize your ticketing services with AWS Lambda, Amazon API Gateway, and Amazon DynamoDB. To learn more about KeyCore and our offerings, please visit our website at https://www.keycore.dk.

Read the full blog posts from AWS

AWS Storage Blog

Optimizing and Monitoring AWS Storage Costs with Amazon EFS and S3 Event Notification

Organizations of all sizes should always be seeking to optimize their resource utilization, striving for the greatest efficiency at the lowest possible cost. To make effective decisions to support this goal, it’s essential to have the relevant data, the best tools for generating reports, and clear documentation on how to do so.

Amazon Elastic File System (Amazon EFS)

Amazon Elastic File System (Amazon EFS) is a cloud-based file storage service that offers serverless scalability and improved cost efficiency for enterprise-grade applications. With features like encryption at rest and in transit and standard NFS access, Amazon EFS provides a secure, reliable, and cost-effective way to manage files and data.

To help organizations better monitor and control their Amazon EFS costs, AWS provides a set of tools and services. Amazon CloudWatch and Amazon CloudWatch Logs collect metrics about Amazon EFS storage usage. These metrics can then be used to set up alarms, create detailed reports, and analyze usage patterns. Additionally, the AWS Pricing Calculator can be used to estimate storage costs, and AWS Cost Explorer can be used to visualize and analyze costs over time.
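As a minimal sketch of reading those metrics, the helper below builds the kwargs for a CloudWatch get_metric_statistics call against the AWS/EFS namespace's StorageBytes metric. The dimension values follow the AWS/EFS documentation; verify them against the current metric reference before relying on this.

```python
from datetime import datetime, timedelta, timezone


def efs_storage_metric_query(file_system_id, hours=24):
    """Build kwargs for cloudwatch.get_metric_statistics to read an EFS
    file system's metered size over the last `hours` hours."""
    now = datetime.now(timezone.utc)
    return {
        "Namespace": "AWS/EFS",
        "MetricName": "StorageBytes",
        "Dimensions": [
            {"Name": "FileSystemId", "Value": file_system_id},
            {"Name": "StorageClass", "Value": "Total"},
        ],
        "StartTime": now - timedelta(hours=hours),
        "EndTime": now,
        "Period": 3600,          # one datapoint per hour
        "Statistics": ["Average"],
    }
# Pass these kwargs to boto3.client("cloudwatch").get_metric_statistics(...)
```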

Amazon S3 Event Notification

Amazon S3 Event Notifications can be used to build applications that communicate and trigger actions between decoupled services. Data events are changes in the state of, or updates to, data. For certain use cases, such as batch order processing or content management, customers may need to implement application logic to handle duplicate and out-of-order events.

Amazon S3 delivers event notifications at least once, and they are not guaranteed to arrive in the order the actions occurred. Each event record does, however, include a sequencer value that applications can use to determine the order of events for the same object key and to discard duplicate notifications they have already processed. This keeps applications from acting twice on the same event, or acting on stale data because a newer event arrived first.
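A minimal sketch of using the sequencer value from S3 event records to drop duplicate and stale notifications for the same object key. Sequencers are hexadecimal strings of varying length; the right-padding convention here follows the S3 event notification documentation and should be verified against the current docs.

```python
def newer_event(seq_a, seq_b):
    """Compare two S3 'sequencer' values for the same object key.
    Pad the shorter string with zeros on the right, then compare
    lexicographically. Returns True if seq_a is the later event."""
    width = max(len(seq_a), len(seq_b))
    return seq_a.ljust(width, "0") > seq_b.ljust(width, "0")


def apply_if_newer(state, key, sequencer, value):
    """Order-safe, idempotent apply: keep an event only if it is newer
    than the last one seen for this key (duplicates compare equal and
    are dropped). `state` maps object key -> (sequencer, value)."""
    last = state.get(key)
    if last is None or newer_event(sequencer, last[0]):
        state[key] = (sequencer, value)
        return True
    return False
```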

KeyCore: Helping With AWS Storage Costs Optimization and Monitoring

At KeyCore, we specialize in providing professional and managed services for AWS customers. We offer expertise and guidance to help you utilize AWS storage solutions, like Amazon EFS and S3, to optimize your storage costs. Our team of AWS-certified experts can help you get the most out of your storage resources, so you can save time and money. Contact us today to learn more about how we can help you with your AWS storage optimization and monitoring needs.

Read the full blog posts from AWS

AWS Developer Tools Blog

Ruby runtimes 2.3 and 2.4 have reached end of life (EOL) in the Ruby community: EOL began on 2019-03-31 for Ruby 2.3 and on 2020-03-31 for Ruby 2.4. Starting November 24, 2023, version 3 of the AWS SDK for Ruby will no longer support these runtime versions.

What this Means for Developers

For developers, this means that any applications relying on Ruby runtimes 2.3 or 2.4 will need to be migrated to newer versions of Ruby before November 24, 2023. This change increases the importance of regularly checking for updates to Ruby and other programming languages, as outdated versions can quickly become unsupported.

KeyCore’s Support

At KeyCore, the leading Danish AWS consultancy, we provide professional and managed services to help developers with this issue. Our extensive expertise in AWS allows us to help our clients migrate applications to newer versions of Ruby and other programming languages. For more information about KeyCore and our services, please visit our website at https://www.keycore.dk.

Read the full blog posts from AWS

AWS Architecture Blog

Using Containers and Cell-Based Design for Higher Resiliency and Efficiency

In our journey towards cloud-native architectures, we’ve discussed how to make them more scalable, secure, and cost-effective. In this post, we will take a deeper dive into two strategies to further improve the efficiency and resiliency of our architectures: containerizing our applications and using cell-based design.

Containers

Containers are isolated environments that package an application along with all its dependencies, libraries, and configurations. This offers several benefits, from reproducibility and scalability to cost savings. For example, each container instance runs in isolation on a shared OS kernel, so it can be moved between hosts with a compatible container runtime without additional configuration. Each container image is also tailored to the specific application it runs, reducing wasted resources.

Cell-Based Design

Cell-based design is a method of building systems and applications with a focus on self-healing, auto-scaling, and resiliency. It is based on the idea of creating distinct, isolated “cells”, or microservices, that can be deployed independently, and that can be scaled up or down as needed. This allows for improved scalability, since each cell can be scaled independently, and improved resiliency, since each cell can be isolated and updated separately.
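A core mechanic of cell-based design is routing each tenant or customer deterministically to one cell, so a bad deployment or failure affects only that cell's share of traffic. The sketch below shows stable hashing into a fixed number of cells; the function and cell count are illustrative.

```python
import hashlib


def assign_cell(tenant_id, num_cells=8):
    """Deterministically map a tenant to one of num_cells isolated
    cells. A stable hash keeps the tenant in the same cell across
    requests, bounding the blast radius of any one cell's failure."""
    digest = hashlib.sha256(tenant_id.encode()).digest()
    return int.from_bytes(digest[:4], "big") % num_cells
```

Note that plain modulo reshuffles tenants when num_cells changes; production routers typically use consistent hashing or a mapping table to avoid that.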

KeyCore Services

At KeyCore, we have experts in both containerization and cell-based design. We can help you plan, design, and implement a cloud-native architecture tailored to your business needs. Our team of AWS certified professionals can develop custom solutions and offer managed services to help you get the most out of your cloud platform. Contact us today to learn more about how we can help.

Read the full blog posts from AWS

AWS Partner Network (APN) Blog

The Benefits of Using AWS Partner Network (APN) for Cloud Infrastructure

Amazon Web Services (AWS) has been a leader in cloud computing for many years. As a result, AWS has created an extensive partner network (APN) that provides customers with a wide range of solutions to help meet their infrastructure needs. This post will look at some of the key benefits of using APN for cloud infrastructure.

Cross-Account Deployment with GitLab Pipelines

AWS Cloud Development Kit (CDK) is a powerful tool that allows customers to develop infrastructure as code in their preferred language, such as JavaScript/TypeScript, or Python. This post provides a reference framework for customers that can save them time when implementing GitLab Pipelines using AWS CDK for a secure and reliable deployment experience across their teams. Many organizations use GitLab as a CI/CD platform for their cloud infrastructure and application deployments on AWS.

Simplify, Accelerate, and Automate SAP Deployment with AWS Launch Wizard

The installation of highly available, scalable, and reliable SAP systems is a critical consideration for customers launching greenfield, brownfield, or bluefield SAP projects. AWS Launch Wizard for SAP provides a guided wizard-based experience that simplifies and accelerates the deployment of SAP systems based on the SAP HANA database. Deloitte is a key partner in testing beta features for AWS Launch Wizard for SAP and has provided significant feedback to enhance the service features.

Using AWS Service Catalog with HashiCorp Terraform Cloud

Customers use AWS Service Catalog to create and manage a catalog of IT services and products approved for use on AWS. This post shows how customers can use AWS Service Catalog Engine for Terraform Cloud to provision their products and benefit from a self-service provisioning model. End users have access to a pre-validated catalog of infrastructure with governance enforced through Terraform Cloud features such as team permissions, run tasks, and policy sets.

Enriching Snowflake Data with Amazon Location Service and AWS Lambda

Location intelligence involves integrating geospatial data into the broader business intelligence and decision-making process. On AWS, customers can use the Snowflake Data Cloud to integrate fragmented data, discover and securely share data, and execute diverse analytic workloads. This post shows how customers can enrich their existing Snowflake data with location-based insights using Amazon Location Service for location intelligence workloads.

Filter and Stream Logs from Amazon S3 Logging Buckets with AWS Lambda

This post showcases a way to filter and stream logs from centralized Amazon S3 logging buckets to Splunk using a push mechanism leveraging AWS Lambda. The push mechanism offers numerous benefits such as lower operational overhead, lower costs, and automated scaling. A sample Lambda code is provided that filters VPC flow logs with “action” flag set to “REJECT” and pushes it to Splunk via a Splunk HTTP Event Collector (HEC) endpoint.
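A minimal sketch of the filtering step described above, assuming the default VPC Flow Log format (where "action" is the thirteenth field). The HEC envelope shape follows Splunk's event collector convention; the sourcetype value is an assumption.

```python
def filter_rejects(flow_log_lines):
    """Keep only VPC Flow Log records whose action field is REJECT.
    Default format: version account-id interface-id srcaddr dstaddr
    srcport dstport protocol packets bytes start end action log-status,
    so action is at index 12."""
    rejected = []
    for line in flow_log_lines:
        fields = line.split()
        if len(fields) >= 13 and fields[12] == "REJECT":
            rejected.append(line)
    return rejected


def to_hec_events(lines, sourcetype="aws:vpcflowlog"):
    """Wrap records in the JSON envelope a Splunk HEC endpoint expects.
    The Lambda would POST each envelope with the HEC token header."""
    return [{"event": line, "sourcetype": sourcetype} for line in lines]
```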

Ensuring Safe and Sustainable Autonomous Vehicle Validation with AWS

TCS Mobility Suite Smart Validation powered by AWS enables sustainable, end-to-end simulation-based validation for automated driving systems. It allows for testing of autonomous vehicles in a low-cost, risk-free manner while helping improve the quality of the software as well as accelerating time to market. Smart Validation is able to reduce billions of redundant test cases to hundreds, saving computational resources and prioritizing the testing of edge cases.

Integrating Amazon Cognito with 1Kosmos BlockID for Enhanced Security and User Experience

Multi-factor authentication (MFA) enhances security for web and mobile applications by requiring additional identification methods other than a password. To provide a frictionless user experience with MFA, a variety of authentication options are required that support a range of users and devices. This post shows how customers can use 1Kosmos BlockID and Amazon Cognito to balance security with usability when building customer facing applications.

How KeyCore Can Help

KeyCore provides professional and managed services that help customers get the most out of the AWS APN. Our experts are highly skilled in cloud infrastructure and can help with the setup and implementation of AWS services like AWS CDK and AWS Launch Wizard. We also provide advanced support for AWS Service Catalog and AWS Lambda, and can help customers ensure secure and sustainable autonomous vehicle validation with AWS. Finally, KeyCore can help customers implement location-based insights with Amazon Location Service and integrate Amazon Cognito with 1Kosmos BlockID for enhanced security and user experience.

Read the full blog posts from AWS

AWS HPC Blog

How Urban Planners Benefit from AWS Batch and Digital Twins

Urban planners need to explore the impact of green infrastructure on the urban environment using simulations. With AWS Batch, they can now quickly scale their simulations with Green Urban Scenarios simulator (GUS).

Accelerating Simulations with GUS and AWS Batch

GUS helps urban planners analyze the impact of green infrastructure on urban systems, with digital twins and simulations. By leveraging AWS Batch, they can quickly scale their simulations to explore different scenarios and plan their projects.

AWS Batch provides a powerful set of features that make it easy for urban planners to create batch jobs and monitor their progress. With support for Amazon EC2 Spot Instances and AWS Fargate, urban planners can get the most cost-effective compute resources for their simulations, without having to manage the underlying infrastructure.
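As a sketch of creating one such batch job per scenario, the helper below builds the kwargs for AWS Batch's submit_job API. The queue and job definition names are illustrative, not the actual GUS setup.

```python
def build_simulation_job(scenario_id, queue="gus-spot-queue",
                         job_def="gus-simulation:1"):
    """Build kwargs for batch.submit_job for one simulation scenario,
    passing the scenario ID to the container via an environment
    variable override."""
    return {
        "jobName": f"gus-scenario-{scenario_id}",
        "jobQueue": queue,
        "jobDefinition": job_def,
        "containerOverrides": {
            "environment": [
                {"name": "SCENARIO_ID", "value": str(scenario_id)},
            ],
        },
    }
# Submit with boto3.client("batch").submit_job(**build_simulation_job(7))
```

Pointing the job queue at a Spot or Fargate compute environment is what makes the scaling cost-effective without infrastructure management.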

Deploying Self-Calibrating Digital Twins on AWS

Digital twins can be difficult to maintain, as real-world systems degrade and change over time. With TwinFlow on AWS, urban planners can now deploy self-calibrating digital twins that can detect and respond to changes in the environment.

TwinFlow combines analytics, machine learning, and cloud technology to create digital twins that are continuously monitored and calibrated using operational data. By automating the calibration process, TwinFlow eliminates the need for manual intervention and enables urban planners to deploy digital twins quickly and efficiently.
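To make the idea of continuous calibration concrete, here is an illustrative sketch (not TwinFlow's actual algorithm): an exponentially weighted update of a bias-correction term, so the twin tracks sensor drift without manual retuning.

```python
def update_bias(bias, observed, predicted, alpha=0.2):
    """One calibration step: move the bias correction a fraction alpha
    of the way toward closing the gap between the twin's corrected
    prediction and the real-world observation."""
    residual = observed - (predicted + bias)
    return bias + alpha * residual


def run_calibration(start_bias, pairs, alpha=0.2):
    """Fold a stream of (observed, predicted) pairs through the update,
    as an automated calibration loop would on operational data."""
    bias = start_bias
    for observed, predicted in pairs:
        bias = update_bias(bias, observed, predicted, alpha)
    return bias
```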

How KeyCore Can Help

At KeyCore, we specialize in providing professional and managed services to help urban planners leverage the power of AWS Batch and TwinFlow to create and deploy digital twins and simulations for green-urban planning. Our team of AWS Certified Solutions Architects and DevOps Engineers have extensive experience in helping urban planners use AWS to create digital twins and simulations that can help them plan projects more efficiently.

Contact us today to learn more about how KeyCore can help you leverage the power of AWS Batch and TwinFlow to accelerate green-urban planning simulations and deploy self-calibrating digital twins.

Read the full blog posts from AWS

AWS Cloud Operations & Migrations Blog

AWS re:Invent 2023: Monitoring and Observability, and Centralized Operations Management

AWS re:Invent 2023 is the annual AWS cloud computing conference, held in Las Vegas from November 27 to December 1, 2023. 96 sessions are available across all solution areas, from monitoring and observability to centralized operations management. These topics help customers get the most out of the AWS cloud.

Amazon CloudWatch Dashboards

Amazon CloudWatch enables customers to collect monitoring and operational data in the form of logs, metrics, alarms, and events. With this data, customers can set up CloudWatch dashboards to get a unified view of their workloads and access operational health data more easily.

Amazon EC2 Auto Scaling

As customers migrate legacy workloads to AWS Cloud, they may need to rehost or replatform applications to Amazon EC2 servers. To take advantage of the scalability of the cloud, customers can use Amazon EC2 Auto Scaling Groups to scale EC2 servers up or down on demand and on schedule.
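As a sketch of the "on schedule" part, the helper below builds kwargs for two Auto Scaling put_scheduled_update_group_action calls: scale out for business hours, scale in overnight. The cron expressions, sizes, and time zone are illustrative assumptions.

```python
def build_scheduled_actions(asg_name, tz="Europe/Copenhagen"):
    """Build kwargs for two autoscaling.put_scheduled_update_group_action
    calls on one Auto Scaling group."""
    scale_out = {
        "AutoScalingGroupName": asg_name,
        "ScheduledActionName": "business-hours-scale-out",
        "Recurrence": "0 7 * * MON-FRI",   # 07:00 on weekdays
        "MinSize": 4, "MaxSize": 12, "DesiredCapacity": 6,
        "TimeZone": tz,
    }
    scale_in = {
        "AutoScalingGroupName": asg_name,
        "ScheduledActionName": "overnight-scale-in",
        "Recurrence": "0 19 * * MON-FRI",  # 19:00 on weekdays
        "MinSize": 1, "MaxSize": 4, "DesiredCapacity": 1,
        "TimeZone": tz,
    }
    return [scale_out, scale_in]
```

On-demand scaling would be layered on top with target-tracking or step scaling policies rather than scheduled actions.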

Amazon Connect Real-Time Monitoring

Amazon Connect is a cloud contact center solution that helps companies of any size provide superior customer service at a lower cost. It has many real-time monitoring capabilities. For requirements that go beyond these, Amazon Connect provides data and APIs that customers can use to implement their own monitoring solutions.

AWS Mainframe Modernization

Attendees of AWS re:Invent 2023 can join the exciting lineup of mainframe-related sessions. These sessions will teach attendees about how to modernize mainframes with AWS, how to automate migration and operations, and how to use advanced analytics to get the most out of mainframes.

Scaling GitHub with AWS

Customers that migrate on-premises enterprise applications to AWS can migrate GitHub to AWS as well. By running GitHub on AWS, teams can collaborate more efficiently by taking advantage of the scalability of the cloud.

Reducing MTTR with Amazon CloudWatch and AWS X-Ray

When running microservice-based workloads in a serverless environment, customers frequently have issues with troubleshooting incidents since the necessary data can be distributed across multiple components. To reduce the mean time to resolution (MTTR), Amazon CloudWatch and AWS X-Ray can be used.

Successfully Delivering Cloud Migrations

Despite the many benefits of moving to the Cloud, large enterprises frequently struggle to successfully deliver their migrations. This blog post will discuss the key factors that ensure a successful migration, and how KeyCore can help customers achieve their goals.

At KeyCore, our experts are highly knowledgeable in AWS and will help customers achieve all their business transformation goals. With our professional services and managed services, we can help customers deliver successful Cloud Migrations – no matter how complex the migration may be. Contact us today to learn how we can help you.

Read the full blog posts from AWS

AWS for Industries

Reinventing Energy with Energy Data Insights™ on AWS for the OSDU® Data Platform:

Today’s world relies heavily on a secure and reliable energy supply. Oil and gas companies are facing an array of challenges across operations and business, from transitioning into renewable energy sources to volatile commodity prices and environmental regulations, plus the ever-present pressure to find, produce, and refine hydrocarbons in a sustainable manner. To help meet these demands, Amazon Web Services (AWS) brings the power of the cloud to the oil and gas industry with the Open Subsurface Data Universe (OSDU) Data Platform and Energy Data Insights™.

The OSDU Data Platform is a cloud-native, data-lake architecture designed to manage the complexities of energy data and provide a compliant and secure approach to optimization and analysis. Using the advanced analytics capabilities of AWS, OSDU Data Platform helps drive improved collaboration and productivity, resulting in better insights and faster decision-making. Energy Data Insights™ is a suite of applications and services specifically designed for the energy industry, built on top of the OSDU Data Platform. This includes applications for production optimization, asset performance, and modelling tools.

With the OSDU Data Platform and Energy Data Insights™, AWS can help oil and gas companies to reduce IT expense, develop new insights from previously untapped data, and accelerate towards digital transformation. Through the expertise of KeyCore, an AWS consulting partner, oil and gas companies can leverage the full suite of AWS services to develop a secure, reliable energy supply.

Onebeat reduces retail waste through AI-powered inventory optimization:

Retail stores can have a significant environmental impact if they do not properly manage their inventory. Amazon Web Services (AWS) Retail Competency Partner Onebeat has developed an AI-powered inventory optimization platform that tackles the sustainability challenge by taking a holistic approach to inventory management. The platform uses advanced analytics to enable retailers to reduce waste and costs while increasing sales.

Onebeat’s platform is powered by AWS with Machine Learning and an AI engine to provide valuable insights that help retailers optimize their inventory management. It also allows retailers to set up alerts and notifications to quickly identify opportunities for improving their inventory strategies. This helps them increase sales while reducing waste and costs.

AWS for Telecom programming highlights at re:Invent 2023:

AWS re:Invent 2023 is the largest event in the Amazon Web Services (AWS) calendar, bringing the cloud computing community together to connect, collaborate, and learn. With hundreds of sessions, workshops, and labs, it is the perfect place for telco professionals to explore how AWS can help transform their business. Participants can attend sessions on a range of topics such as network automation, edge computing, customer experience, and more.

KeyCore, as an AWS consulting partner, can provide telco professionals with the expertise they need to take advantage of the AWS platform. KeyCore can provide telco-specific services, such as DevOps, migration, and application development, to help telco companies leverage the power of AWS.

6 takeaways from GroceryShop 2023:

GroceryShop 2023 was an event that brought together various members of the Grocery ecosystem to learn from each other. Several interesting takeaways from the event have emerged, such as the need for mobile-first checkout solutions and the importance of providing a personalized shopping experience. Additionally, it was highlighted that the customer experience needs to be optimized, and that there is an opportunity to leverage data and AI technology to identify customer needs.

KeyCore’s AWS consulting services can help Grocery companies develop and deploy mobile-first checkout solutions, as well as provide personalized shopping experiences. Through expert migration and application development, KeyCore can help Grocery companies leverage the power of the AWS platform to meet customer needs and optimize their customer experience.

How SGN extended isolated networks to AWS using AWS Transit Gateway:

SGN is a UK gas distribution network operator that manages the distribution network for natural and green gas to almost six million homes and businesses across Northern Ireland, Scotland, and southern England. In order to efficiently manage their network, SGN needed to extend their isolated networks into the cloud, and for this they turned to Amazon Web Services (AWS) Transit Gateway.

AWS Transit Gateway acts as a hub for connecting VPCs, on-premises networks, and remote networks. It simplifies the process of connecting isolated networks to the cloud, and provides a secure and efficient way to manage traffic flows. With AWS Transit Gateway, SGN was able to connect their isolated networks to AWS, allowing them to take advantage of cloud services and provide secure access to their applications.

Kasada beats bots at their own game: How to identify and eliminate bot attacks:

Bots can be used for helpful activities, such as interacting with customer service, but they can also have a much more sinister side. To combat malicious bots, Kasada has developed an automated anti-bot solution that uses machine learning and AI to identify and block bot attacks. The solution uses a variety of techniques, such as frequency filtering, behavioral analysis, and machine learning, to accurately detect malicious bots.

KeyCore can help organizations protect themselves against bot attacks by leveraging the power of the AWS platform. Our expert DevOps, migration, and application development services can help organizations to develop and deploy automated anti-bot solutions that use AWS services such as Machine Learning and AI.

Identity and Access Management solution on AWS:

Retail websites need to ensure that only authorized customers have access to their websites and applications. To achieve this, they need an Identity and Access Management (IAM) solution. Amazon Web Services (AWS) provides a comprehensive Identity and Access Management solution that can help retailers set themselves apart from competitors by providing secure and compliant access to their website.

The solution leverages AWS capabilities such as multi-factor authentication, single sign-on, and access control policies to verify users and manage access. It also helps retailers meet regulatory and legal requirements, such as GDPR and PCI DSS, while keeping customer data secure.

KeyCore can help retailers leverage the power of AWS to develop a secure Identity and Access Management solution. Our expert DevOps, migration, and application development services can help retailers to develop, deploy, and maintain an IAM solution on the AWS platform.

AWS Service Spotlight: Amazon Forecast for accurate demand forecasting:

Accurate demand forecasting is key to maintaining an effective balance between meeting customer demand and efficiently using resources. Amazon Web Services (AWS) provides Amazon Forecast, a fully managed service that uses machine learning to produce highly accurate forecasts. Amazon Forecast can be customized to meet specific forecasting requirements, such as sales forecasting, demand forecasting, and resource planning.

The Amazon Forecast service uses advanced machine learning algorithms to automatically generate forecasts and can be integrated with other AWS services, such as Amazon QuickSight. This allows businesses to gain greater insights into their forecasts and take advantage of the full range of AWS services.

KeyCore can help businesses with their demand forecasting by leveraging the power of AWS. Our expert DevOps, migration, and application development services can help businesses to integrate Amazon Forecast with other AWS services and gain greater insights into their forecasts.

Ecosystems can help generate new revenue streams for banks:

The COVID pandemic has increased customer demand for hyper-connected digital experiences, and banks are speeding up the development of their strategies and solutions to deliver Banking as a Service (BaaS) and ecosystem banking capabilities to their customers and partners. Banks can make use of APIs and microservices to develop ecosystems that can generate new revenue streams.

KeyCore can help banks to leverage the power of the AWS platform to develop their Banking as a Service and ecosystem banking capabilities. Our expert DevOps, migration, and application development services can help banks to develop and deploy APIs and microservices on the AWS platform to create ecosystems that generate new revenue streams.

How digital twins can optimize Travel and Hospitality operations:

Travel and Hospitality (T&H) businesses need to use technology to optimize their operations. Digital twins can help with this by providing a virtual representation of physical objects and processes, which can be used to identify potential issues and plan for future operations.

The AWS platform provides a range of services to support the development and deployment of digital twins. Services such as Amazon SageMaker and Amazon Augmented AI can be used to build and refine the models behind digital twins, while AWS CloudFormation and the AWS Command Line Interface can be used to deploy and manage them.

KeyCore can help organizations leverage the power of AWS to develop and deploy digital twins. Our expert DevOps, migration, and application development services can help organizations to create and manage digital twins on the AWS platform.

Read the full blog posts from AWS

AWS Marketplace

Accelerate Self-Service Analytics Integrating On-Premises and Third-Party Data with AWS Data Exchange and Dremio

Businesses are increasingly aware of the power of data, and many have invested heavily in data-driven initiatives. However, in order to truly unlock the potential of data, businesses must be able to access, integrate, and analyze data from multiple sources quickly and accurately. AWS Data Exchange provides a solution to this challenge, allowing businesses to access third-party data quickly and easily without the need to build and maintain complex data pipelines.

In this blog post, we explore how businesses can use AWS Data Exchange with their on-premises Hive-compliant data source using Dremio to integrate third-party and on-premises data without moving or copying data. We also demonstrate how customers can use the consolidated data for business intelligence (BI) and exploratory analytics.

Join Third-Party Data in Amazon Redshift with Snowflake Using Amazon Athena

Integrating data from multiple sources is one of the most challenging aspects of data-driven initiatives. With AWS Data Exchange, businesses can quickly and easily access and integrate data from third-party sources into their own data pipelines. In this blog post, we demonstrate joining data from Snowflake with data shared from a third party provider via AWS Data Exchange in Amazon Redshift. This solution lets you access and combine data from all these resources without needing to build and maintain complex data pipelines.

Quant Research at Scale Using AWS and Refinitiv Data

Effective data-driven decision-making relies on the ability to process large datasets quickly and accurately. This blog post shows how to install and use the infrastructure the authors built to perform quant research at scale. The stack and examples are available in a public repository so readers can use them in their own investment research. The solution uses Apache Spark, Amazon EMR on EKS, Docker, Karpenter, EMR Studio Notebooks, and AWS Data Exchange for Amazon S3.

AWS Data Exchange and the other services discussed in this blog post can help businesses take advantage of the benefits of data-driven initiatives. With AWS Data Exchange, customers can access and integrate data from third-party sources quickly and easily. Apache Spark, Amazon EMR on EKS, Docker, Karpenter, and EMR Studio Notebooks provide a powerful infrastructure for processing and analyzing large datasets. KeyCore can help customers make the most of these services and more, offering professional consulting and managed services to accelerate their data-driven initiatives.

Read the full blog posts from AWS

The latest AWS security, identity, and compliance launches, announcements, and how-to posts.

At KeyCore, we understand the importance of being up-to-date on the latest AWS security, identity, and compliance launches, announcements, and how-to posts to ensure our clients are properly protected. In the following article, we will summarize the most-up-to-date security information from various sources so that our readers can stay informed.

How to Create an AMI Hardening Pipeline and Automate Updates to Your ECS Instance Fleet

Amazon Elastic Container Service (Amazon ECS) is a fully managed container orchestrator that simplifies the deployment, maintenance, and scaling of container-based applications. Customers can use Amazon ECS to deploy their containerized application as a standalone task or as part of a service in their cluster, with Amazon ECS managing the underlying task infrastructure. To create an AMI hardening pipeline, customers work with Amazon Machine Images (AMIs). An AMI is a template that contains the information required to launch an EC2 instance, including the operating system, installed applications, and instance configuration such as network settings.
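At the end of a hardening run, the pipeline captures the patched instance as a new AMI. The sketch below builds the kwargs for EC2's create_image call; the naming scheme, baseline name, and tag keys are illustrative assumptions.

```python
from datetime import datetime, timezone


def build_create_image_request(instance_id, baseline="cis-level1"):
    """Build kwargs for ec2.create_image, capturing a hardened instance
    as a timestamped, tagged AMI so the ECS fleet can be rolled onto
    the newest image."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M")
    return {
        "InstanceId": instance_id,
        "Name": f"ecs-hardened-{baseline}-{stamp}",
        "Description": "Hardened ECS-optimized AMI produced by pipeline",
        "TagSpecifications": [{
            "ResourceType": "image",
            "Tags": [{"Key": "baseline", "Value": baseline}],
        }],
    }
# Call boto3.client("ec2").create_image(**build_create_image_request(...))
```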

How to Use Chaos Engineering in Incident Response

Simulations, tests, and game days are critical parts of preparing and verifying incident response processes. To build an effective incident response function, customers can use chaos engineering to accelerate their process. Chaos engineering is the practice of running experiments to test how systems respond to failures. It allows customers to identify weaknesses in their system and develop strategies to minimize the risk and impact of unexpected incidents.

Approaches for Migrating Users to Amazon Cognito User Pools

Amazon Cognito user pools offer a fully managed OpenID Connect (OIDC) identity provider to quickly add authentication and access control to mobile and web applications. User pools scale to millions of users and provide advanced security features such as multi-factor authentication, account recovery, and account takeover protection. To migrate users to an Amazon Cognito user pool, customers can either bulk-import users or migrate them just in time at sign-in using the user migration Lambda trigger, pulling users from identity stores such as Microsoft Active Directory (AD) and in-house databases.
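A minimal sketch of the just-in-time approach is shown below. The event shape follows the Cognito user migration trigger; the lookupLegacyUser function is a hypothetical stand-in for a call to your real legacy identity store:

```typescript
// Sketch of a Cognito user migration Lambda trigger. On the user's first
// sign-in, the trigger validates credentials against the legacy store and
// creates the user in the user pool transparently.
interface MigrationEvent {
  triggerSource: string;
  userName: string;
  request: { password?: string };
  response: {
    userAttributes?: Record<string, string>;
    finalUserStatus?: string;
    messageAction?: string;
  };
}

// Hypothetical legacy lookup; replace with a query against your real
// identity store (Active Directory, a database, etc.).
async function lookupLegacyUser(userName: string, _password?: string) {
  return userName === "alice" ? { email: "alice@example.com" } : null;
}

async function handler(event: MigrationEvent): Promise<MigrationEvent> {
  if (event.triggerSource === "UserMigration_Authentication") {
    const user = await lookupLegacyUser(event.userName, event.request.password);
    if (!user) throw new Error("Bad credentials");
    event.response.userAttributes = { email: user.email, email_verified: "true" };
    event.response.finalUserStatus = "CONFIRMED"; // no forced password reset
    event.response.messageAction = "SUPPRESS";    // no welcome email
  }
  return event;
}
```

Because the user signs in with their existing password, the migration is invisible to them: the trigger confirms the user and suppresses the welcome message so no reset flow is required.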

How to Share Security Telemetry per OU Using Amazon Security Lake and AWS Lake Formation

Amazon Security Lake and AWS Lake Formation are powerful tools to help customers ingest, store, analyze, and visualize security data from multiple sources. Customers can use Amazon Security Lake to aggregate data from multiple sources, apply security analytics to find potential security threats, and visualize findings in Amazon QuickSight. To share security telemetry per Organizational Unit (OU) using Amazon Security Lake and AWS Lake Formation, customers can use AWS Lake Formation to create a data lake and define data access policies.

Aggregating, Searching, and Visualizing Log Data from Distributed Sources with Amazon Athena and Amazon QuickSight

Customers using Amazon Web Services (AWS) can use a range of native and third-party tools to build, secure, and manage workloads. To aggregate, search, and visualize log data from distributed sources, customers can use Amazon Athena and Amazon QuickSight. Amazon Athena is an interactive query service that can query data stored in Amazon Simple Storage Service (Amazon S3), and Amazon QuickSight is an analytic service to quickly build visualizations and dashboards from data.
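As an illustration of the kind of aggregation Athena runs over log data in S3 (the table and column names below are invented for the example, not taken from the original post):

```typescript
// Hypothetical sketch: build an Athena SQL statement that aggregates error
// counts per source from a date-partitioned log table. Filtering on the
// partition column (here "dt") keeps the amount of S3 data scanned small.
function errorCountQuery(table: string, day: string): string {
  return [
    `SELECT source, count(*) AS errors`,
    `FROM ${table}`,
    `WHERE dt = '${day}' AND level = 'ERROR'`,
    `GROUP BY source`,
    `ORDER BY errors DESC`,
  ].join("\n");
}

console.log(errorCountQuery("logs.app_events", "2023-10-30"));
```

The result set of such a query can then be used directly as a QuickSight dataset to build the dashboard.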

How to Visualize Amazon Security Lake Findings with Amazon QuickSight

Amazon Security Lake and AWS Lake Formation can be used to ingest, store, analyze, and visualize security data from multiple sources. To visualize Amazon Security Lake findings with Amazon QuickSight, customers must use Amazon Athena and Amazon QuickSight together. Customers can use Amazon Athena to query data stored in Amazon S3 and Amazon QuickSight to build visualizations and dashboards from the data quickly.

Refine Permissions for Externally Accessible Roles Using IAM Access Analyzer and IAM Action Last Accessed

When building on Amazon Web Services (AWS) across accounts, customers might use an AWS Identity and Access Management (IAM) role to allow an authenticated identity from outside their account to access the resources in their account. To refine permissions for externally accessible roles, customers can use IAM Access Analyzer and IAM Action Last Accessed. IAM Access Analyzer examines resource policies to ensure they are not overly permissive, and IAM Action Last Accessed examines the access patterns of IAM entities to identify unused permissions.
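The core of the refinement step can be sketched as pure logic: given the actions a role is granted and last-accessed data for each, keep only the actions that were actually used. The data shape below is simplified for illustration; in IAM this information comes from the service and action last accessed APIs:

```typescript
// Illustrative sketch: narrow a policy's action list to the actions a role
// has actually used during the tracking period.
interface LastAccessed {
  action: string;
  lastUsed?: Date; // undefined = never used in the tracking period
}

function refineActions(granted: string[], usage: LastAccessed[]): string[] {
  const used = new Set(
    usage.filter((u) => u.lastUsed !== undefined).map((u) => u.action),
  );
  return granted.filter((a) => used.has(a));
}

const refined = refineActions(
  ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
  [
    { action: "s3:GetObject", lastUsed: new Date("2023-10-01") },
    { action: "s3:PutObject" }, // never used: candidate for removal
    { action: "s3:DeleteObject", lastUsed: new Date("2023-09-15") },
  ],
);

console.log(refined); // logs [ 's3:GetObject', 's3:DeleteObject' ]
```

Unused actions should of course be reviewed before removal, since the tracking window may not cover rare but legitimate operations.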

Security Considerations for Running Containers on Amazon ECS

When running containers on Amazon Elastic Container Service (Amazon ECS), customers can use the six tips provided by Amazon Web Services (AWS) container and security subject matter experts to help raise their container security posture. These include selecting a minimal, trusted base image, managing credentials with AWS Secrets Manager, scoping permissions with Amazon ECS task IAM roles, restricting network traffic with security groups, and scanning container images for vulnerabilities.
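Two of these tips show up directly in the task definition. The fragment below sketches a scoped task role and a secret injected from AWS Secrets Manager instead of a plaintext environment variable; all ARNs and names are placeholders, not values from the original post:

```typescript
// Sketch of an ECS task definition fragment applying a scoped task IAM role
// and a Secrets Manager-backed secret.
const taskDefinition = {
  family: "payments-api",
  // Task role: grant the containers only the AWS permissions they need.
  taskRoleArn: "arn:aws:iam::123456789012:role/payments-api-task-role",
  containerDefinitions: [
    {
      name: "app",
      image: "123456789012.dkr.ecr.eu-west-1.amazonaws.com/payments-api:1.2.3",
      // The password is resolved at container start and never stored in the
      // task definition in plaintext.
      secrets: [
        {
          name: "DB_PASSWORD",
          valueFrom:
            "arn:aws:secretsmanager:eu-west-1:123456789012:secret:payments/db-abc123",
        },
      ],
    },
  ],
};

console.log(taskDefinition.containerDefinitions[0].secrets[0].name); // logs DB_PASSWORD
```

Keeping the task role separate from the task execution role means application code and the ECS agent each get only the permissions they need.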

Transforming Transactions: Streamlining PCI Compliance Using AWS Serverless Architecture

To comply with the Payment Card Industry Data Security Standard (PCI DSS), organizations must establish rules for secure payment processing. To streamline PCI compliance, customers can use serverless technology, which offers agility, performance, cost, and security. With the help of AWS serverless architecture, customers can reduce their PCI compliance scope by using serverless services such as Amazon API Gateway, AWS Lambda, Amazon Kinesis, and Amazon DynamoDB.

Prepare Your AWS Workloads for the “Operational Risks and Resilience – Banks” FINMA Circular

The Swiss Financial Market Supervisory Authority (FINMA) announced a revised circular called Operational risks and resilience – banks, which will take effect on January 1, 2024. To prepare for this circular, organizations must make sure their AWS workloads are compliant with FINMA’s requirements. KeyCore can help with this by providing advice and guidance on the best practices for preparing and maintaining AWS workloads in order to meet FINMA’s requirements.

Scaling National Identity Schemes with itsme and Amazon Cognito

To quickly consume and build digital services for citizens on Amazon Web Services (AWS) using available national digital identities, customers can use identity federation and integration between the identity provider itsme® and Amazon Cognito. KeyCore can help customers get started quickly by providing code examples and integration proofs of concept. Additionally, KeyCore can assist customers in designing their identity federation and integration, as well as deploying, managing, and optimizing their digital services.

Evolving Cyber Threats Demand New Security Approaches – The Benefits of a Unified and Global IT/OT SOC

Evolving cyber threats require organizations to implement a unified and global information technology and operational technology (IT/OT) security operations center (SOC). KeyCore can provide advice and guidance on the best practices for building, managing, and operating a unified and global IT/OT SOC. We can also help customers with assessing their security needs, designing and deploying their SOC, and implementing security solutions.

Read the full blog posts from AWS

Business Productivity

Understanding and Monitoring Anomalous Behavior Across Multiple SaaS Applications with AWS AppFabric

As organizations increasingly adopt more and more SaaS applications, the need for cybersecurity teams to monitor and identify vulnerabilities in their security systems increases in step. With AWS AppFabric, these professionals have access to a service that can quickly and securely connect multiple applications together, allowing them to monitor for anomalous behavior and gain insight into user access to sensitive data.

Cross-Application Audit Log Analysis

Securing access to company data requires careful monitoring of user access. With more and more applications in use, the audit logs for each application can differ, each with its own schema. AppFabric helps security professionals connect multiple applications and analyze audit logs across them, providing a comprehensive view of user access to sensitive data.
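The normalization problem can be sketched as follows: two SaaS apps emit audit events with different schemas, and both are mapped into one common record so they can be analyzed together. The vendor payload shapes and field names below are invented for the example (AppFabric itself normalizes into the Open Cybersecurity Schema Framework, OCSF):

```typescript
// Common record every vendor event is mapped into.
interface CommonAuditEvent {
  user: string;
  action: string;
  occurredAt: string; // ISO 8601
}

// Hypothetical vendor payloads with differing schemas.
type AppAEvent = { actor: string; verb: string; ts: number }; // epoch ms
type AppBEvent = { user_email: string; event_name: string; time: string };

function fromAppA(e: AppAEvent): CommonAuditEvent {
  return { user: e.actor, action: e.verb, occurredAt: new Date(e.ts).toISOString() };
}

function fromAppB(e: AppBEvent): CommonAuditEvent {
  return { user: e.user_email, action: e.event_name, occurredAt: e.time };
}

// With both feeds in one shape, cross-application queries become trivial.
const events: CommonAuditEvent[] = [
  fromAppA({ actor: "alice@example.com", verb: "file.download", ts: 1698624000000 }),
  fromAppB({ user_email: "bob@example.com", event_name: "login", time: "2023-10-30T08:00:00Z" }),
];

console.log(events.map((e) => `${e.user} ${e.action}`));
```

Once events share a schema, a single query or detection rule covers every connected application instead of one per vendor.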

How AWS AppFabric Can Help

AWS AppFabric is a service that helps security professionals to quickly connect multiple applications together and analyze audit logs across those applications. This in turn allows teams to quickly identify and fix vulnerabilities, as well as monitor for anomalous behavior, helping to keep company data secure.

How KeyCore Can Help

At KeyCore, we can help your organization get the most out of AWS AppFabric. Our experienced team of AWS-certified professionals can help you to set up AppFabric and configure it to ensure optimal performance. We can also help you to identify any vulnerabilities that need to be addressed and help you to develop a secure monitoring system for your organization.

Read the full blog posts from AWS

Front-End Web & Mobile

Build an API for Amazon Bedrock with WebSockets and AppSync

Generative AI is transforming the way applications interact with data and is creating new challenges for developers who are building applications with the generative AI service, Amazon Bedrock. Bedrock is a fully managed service that allows developers to easily build and scale generative AI applications with Foundation Models (FMs).

Leverage AppSync and GraphQL APIs

AWS AppSync and GraphQL APIs can be used to easily connect Amazon Bedrock FMs and Agents to both public and private APIs and databases. By leveraging AppSync and GraphQL, developers are able to build a real-time, WebSockets API for Amazon Bedrock that simplifies the development and maintenance of generative AI applications.

Modern Tooling for Websites

In this tutorial, we’ll build a blog website from start to finish with a high Lighthouse performance score and low maintenance friction. Content is stored in a Contentful model and accessed by a Next.js app, which is deployed to AWS and served by an AWS Lambda function. By combining AWS, Contentful, and Next.js, developers are able to create a fast-loading website with modern tooling.

KeyCore – Your AWS Consultancy Partner

At KeyCore, we understand the importance of leveraging the latest technology to develop websites that are robust, reliable, and high-performing. We provide both professional services and managed services to ensure our customers have the best experience possible when building applications with generative AI. To learn more about KeyCore and our offerings, visit our website at https://www.keycore.dk.

Read the full blog posts from AWS

AWS Contact Center

Using Generative AI for Contact Center Customer Experience

Generative AI is an area of increasing interest for businesses, with Gartner estimating that by 2024, 40% of enterprise applications will have embedded conversational AI, a number that was less than 5% in 2020. AWS customers are often interested in how they can use generative AI for contact centers, which is why the Amazon Connect team is gearing up to welcome attendees from around the globe to AWS re:Invent 2023 in Las Vegas from November 27 – December 1.

Evaluating Generative AI for Customer Experience

Generative AI can be used to help contact centers automate processes, enhance customer experience, and improve customer service. Before investing in generative AI, contact center leaders must evaluate their customer service needs against the capabilities of the AI. It is important to identify opportunities to leverage generative AI to improve customer experience.

First, contact center leaders should review the customer journeys that matter most to their customers. This will help identify areas of improvement and opportunities to use generative AI to provide a better experience. Generative AI can be used in a variety of customer service scenarios, such as providing automated answers to frequently asked questions or providing personalized recommendations. It can also be used to automate mundane tasks, such as collecting customer data and creating customer profiles.

Once the customer experience needs have been identified, contact center leaders should evaluate the available generative AI solutions. Generative AI solutions range from conversational AI to natural language processing (NLP). It is important to understand the capabilities of each solution and how they can be used to improve customer experience. Contact center leaders should also evaluate the cost of the solutions and the ease of integration with existing systems.

Amazon Connect and Generative AI

Amazon Connect provides a range of features that can be used to improve customer experience. These include automated self-service, proactive customer outreach, and the ability to quickly create customer profiles. Amazon Connect also integrates with AWS services, such as Amazon Lex and Amazon Comprehend, enabling customers to use the latest generative AI technology.

At AWS re:Invent 2023, attendees can learn how to use Amazon Connect and generative AI to improve customer experience. Presentations, hands-on learning experiences, and more, curated by the Amazon Connect team, will be available. Attendees can also gain insight from experienced professionals on how to use generative AI to optimize customer service.

KeyCore and Generative AI

At KeyCore, we provide professional and managed services for AWS. Our team of experts can help contact center leaders evaluate and deploy generative AI solutions for customer service. With our deep AWS expertise, we can help you integrate generative AI into your contact center using CloudFormation YAML and AWS API calls with TypeScript and the AWS SDK for JavaScript v3.

If you are interested in utilizing generative AI for contact center customer experience, contact KeyCore today. Our team of experts can help you evaluate your customer service needs and identify areas of improvement. We can also help you evaluate the available generative AI solutions and integrate them into your existing systems.

Read the full blog posts from AWS

Innovating in the Public Sector

Innovating in the Public Sector with Kiip and AWS Partners

Unhoused individuals often lack the ability to prove their identity, which can lead to great difficulty in finding resources and stability. However, thanks to AWS-powered solutions like Kiip, unhoused individuals are gaining access to and control over vital documents that can help them prove their identity. Additionally, AWS partners are providing public sector organizations with the tools they need to make the most of their data, driving innovation and helping them accomplish their organizational goals.

Kiip Empowers Unhoused Individuals

Kiip is an innovative solution that is powered by Amazon Web Services (AWS). It provides unhoused individuals with access to and control over their own personal, vital documents. By doing so, it helps individuals to prove their identity, which can open the door to necessary resources and stability. This is a unique approach to addressing the problem posed by lack of proper documentation.

AWS Partners Help Public Sector Organizations Harness the Power of Data

Organizations are dealing with ever-growing data volumes, so they must put data at the heart of every application, process, and decision. AWS partners are helping public sector organizations make the most of their data. This means they can accelerate their innovation and accomplish their organizational goals more easily. Having access to the right data can be the key to making progress.

How KeyCore Can Help

At KeyCore, we specialize in providing both professional services and managed services related to AWS. Our advanced knowledge of AWS means we can help public sector organizations make the most of their data and benefit from cutting-edge solutions such as Kiip. Whether you need assistance setting up a new system or ongoing maintenance and support, we can provide the services you need. To learn more about our offerings, please visit our website at https://www.keycore.dk.

Read the full blog posts from AWS

AWS Open Source Blog

Introducing Finch 1.0 – The Open Source Container Developer Tool for macOS

Finch, an open source, command line developer tool for building, running, and publishing Linux containers, is now ready for production use as a container developer’s daily tool on macOS. The project was built to make it easier for developers to work with containers efficiently, and it is available on GitHub.

What is Finch?

Finch is an open source command line tool written in Go that lets developers build, run, and publish Linux containers on macOS. Under the hood it bundles established open source components such as Lima, nerdctl, containerd, and BuildKit, so developers get a working container environment without having to assemble and learn multiple complex tools themselves.

What Does Finch Offer?

Finch offers a number of features to make it easier for developers to work with containers efficiently. It provides familiar command-line options to start and stop containers, view container logs, build and push images, and manage the virtual machine that containers run in. It also supports Compose files for running multi-container applications.

How Does Finch Compare to Other Container Tools?

Finch aims to be simpler and more efficient than other container tools, getting developers up and running with containers quickly without a steep learning curve. Because it is open source, it can also be extended with additional features.

How Can KeyCore Help?

At KeyCore, we are experts in working with AWS. We provide professional services and managed services to help customers maximize the value of their AWS investments. We can work with you to assess your existing container architecture and make recommendations to ensure your container deployments are performing optimally. We can also help you extend Finch with custom plugins and features to meet your specific needs. Contact us today to learn more about how we can help you get the most out of Finch and your container deployments.

Read the full blog posts from AWS
