Summary of AWS blogs for the week of Monday, June 12, 2023

In the week of Monday, June 12, 2023, AWS published 105 blog posts – here is an overview of what happened.

Topics Covered

Desktop and Application Streaming

AWS End User Computing at Customer Contact Week 2023

AWS End User Computing (EUC) provides an innovative way for organizations to communicate and connect with their customers. AWS EUC is a customer-oriented technology designed for contact centers and customer service teams. It enables them to use and evolve their existing contact center infrastructure while providing customers with an integrated experience.

At Customer Contact Week 2023, AWS EUC will be on full display. This event serves as a platform for industry executives to connect, exchange ideas, and build relationships with customers. It showcases innovative ideas, best practices, and the latest trends to shape the future of customer communication. Customer Contact Week is taking place on June 19 – 22, in Las Vegas.

AWS EUC Benefits

AWS EUC provides several advantages to contact center operators looking to enhance their customer experience. It offers a cost-effective solution, reducing the need for additional hardware and software. Additionally, it provides improved scalability, allowing contact centers to expand their customer base with ease. With its ease of installation and integration, AWS EUC is a straightforward and convenient way to improve customer contact.

With AWS EUC, customer contact centers can benefit from advanced features such as voicemail automation, interactive voice response (IVR), call routing, and real-time analytics. AWS EUC also supports high-quality customer service and data security. It offers built-in compliance solutions to ensure customer data remains secure and is handled responsibly. This is especially important for contact centers that work with sensitive customer data.

KeyCore’s Expertise in AWS End User Computing

For contact centers looking to use AWS EUC to its fullest capabilities, KeyCore can help. We provide professional and managed services to develop and maintain modern contact center solutions. Our team has deep AWS expertise and can back its advice with highly technical detail, references to AWS documentation, and code snippets in CloudFormation YAML or AWS API calls using TypeScript and the AWS SDK for JavaScript v3.

KeyCore can help contact centers develop an AWS EUC solution that meets their unique business needs and objectives. We work closely with our clients to ensure their customer data remains secure, while providing them with a scalable and cost-effective solution. By leveraging our expertise, contact centers can build a secure and effective customer contact solution.

At KeyCore, our mission is to provide our clients with the best AWS-based solutions. We are committed to helping contact centers build a customer contact system that meets their needs and delivers the best customer experience.

Summary

At Customer Contact Week 2023, AWS End User Computing (EUC) will be showcased. This innovative technology offers a cost-effective and scalable solution for contact centers to manage customer communications. It provides advanced features like voicemail automation, interactive voice response (IVR), call routing, and real-time analytics. In addition, it ensures data security and compliance.

For contact centers looking to take advantage of AWS EUC, KeyCore provides professional and managed services to develop and maintain modern contact center solutions. Our team is highly experienced in AWS and can help contact centers build a secure and effective customer contact system. We are committed to helping our clients build an AWS EUC solution that meets their unique business needs and objectives.

Read the full blog posts from AWS

Official Machine Learning Blog of Amazon Web Services

SambaSafety Automates Custom R Workload, Improving Driver Safety with Amazon SageMaker and AWS Step Functions

SambaSafety’s mission is to promote safer communities by reducing risk through data insights. To do this, they rely on Amazon SageMaker and AWS Step Functions to automate custom R workloads. SambaSafety serves more than 15,000 employers and insurance carriers around the world, providing driver risk and compliance monitoring, online reporting, and corporate analytics. With the help of Amazon SageMaker and AWS Step Functions, SambaSafety has been able to reduce their compute time by up to 95%.

Build a Multilingual Automatic Translation Pipeline with Amazon Translate Active Custom Translation

Dive into Deep Learning (D2L.ai) is an open-source textbook that makes deep learning accessible to everyone. To make this content accessible to a wider audience, D2L wanted to provide the textbook in multiple languages. With Amazon Translate Active Custom Translation, D2L can build a fully automated, multilingual translation pipeline to quickly translate its content into multiple languages. This pipeline uses Amazon Translate to automatically translate content and then Amazon Rekognition to review the translated content.

Bring SageMaker Autopilot into Your MLOps Processes Using a Custom SageMaker Project

Amazon SageMaker provides a set of templates and tools to help organizations quickly build, train, and deploy machine learning (ML) models. To meet specific organizational needs, a custom SageMaker project lets customers bring SageMaker Autopilot into their own MLOps processes and integrate it with their SageMaker pipelines, providing a more tailored and secure MLOps experience.

How Forethought Saves Over 66% in Costs for Generative AI Models Using Amazon SageMaker

Forethought Technologies Inc. provides a leading generative AI suite for customer service. To maximize the value of their solution, Forethought leverages Amazon SageMaker to transform the customer support lifecycle. By taking advantage of SageMaker’s cost-saving features, Forethought has been able to save over 66% in costs for their generative AI models.

Reinventing the Data Experience: Use Generative AI and Modern Data Architecture to Unlock Insights

Organizations can leverage generative AI solutions to maximize the value of their modern data architecture. Generative AI solutions allow organizations to automate data processing and analysis tasks, providing a more efficient way to access insights from data. By innovating continuously with generative AI, organizations can unlock insights faster and more efficiently.

How BrainPad Fosters Internal Knowledge Sharing with Amazon Kendra

Many companies struggle with internal knowledge sharing. BrainPad, a consulting company that helps organizations develop solutions using Amazon Web Services (AWS), recognized this challenge and developed an innovative solution. Using AWS Lambda and Amazon Kendra, BrainPad was able to structure internal knowledge sharing and create a more collaborative work environment.

AWS Inferentia2 Builds on AWS Inferentia1 by Delivering 4x Higher Throughput and 10x Lower Latency

AWS Inferentia2 was designed to deliver higher performance and lower cost for large language models (LLMs) and generative AI inference. It provides 4x higher throughput and 10x lower latency than AWS Inferentia1, making it an ideal choice for LLMs and generative AI inference.

Deploy Falcon-40B with Large Model Inference DLCs on Amazon SageMaker

TII Falcon LLM is an open-source foundational large language model (LLM) trained on 1 trillion tokens using Amazon SageMaker. It can support a wide range of use cases, including text classification, token classification, text generation, question answering, entity extraction, summarization, and sentiment analysis. It is also lightweight and less expensive to host than other LLMs.

Build Custom Chatbot Applications Using OpenChatkit Models on Amazon SageMaker

Open-source large language models (LLMs) are an ideal choice for building custom chatbot applications. These models provide transparency in the model architecture, training process, and training data, making it easier to understand how chatbots work and allowing organizations to develop custom applications. Amazon SageMaker makes it easy to deploy open-source models and build custom chatbot applications.

Fine-tune GPT-J Using an Amazon SageMaker Hugging Face Estimator and the Model Parallel Library

GPT-J is an open-source 6-billion-parameter model trained on the Pile. It can support a wide range of use cases, including text classification, token classification, text generation, and more. To fine-tune GPT-J, organizations can use an Amazon SageMaker Hugging Face estimator and the Model Parallel Library. This will help them quickly get up and running with GPT-J in a cost-effective and efficient manner.

KeyCore Helps with Official Machine Learning Blog of Amazon Web Services

At KeyCore, we provide both professional services and managed services for AWS. Our team of experienced AWS professionals can help you get the most out of your ML projects on AWS. We can provide guidance on how to best structure your MLOps processes using SageMaker Autopilot and help you build custom chatbot applications using open-source models. We can also help you optimize your costs for generative AI models and deploy large model inference DLCs on SageMaker. Contact us today to learn more about how we can help.

Read the full blog posts from AWS

Announcements, Updates, and Launches

GoDaddy Implemented a Multi-Region Event-Driven Platform at Scale

GoDaddy, established in 1997, is a leading provider of domain registration and web hosting services with over 84 million domains and 22 million customers. To improve customer outcomes, the company built its own Customer Signal Platform, which captures, analyzes, and acts on customer and product data.

AWS Silicon Innovation Day

AWS Silicon Innovation Day is a one-day virtual event on June 21, 2023, that will allow attendees to better understand how to use silicon to power customer outcomes and innovations. It will be streamed on multiple platforms, including LinkedIn Live, Twitter, YouTube, and Twitch.

Amazon S3 Dual-Layer Server-Side Encryption with Keys Stored in AWS Key Management Service (DSSE-KMS)

Amazon recently launched Amazon S3 dual-layer server-side encryption with keys stored in AWS Key Management Service (DSSE-KMS). This encryption option applies two layers of encryption to objects when they are uploaded to an Amazon Simple Storage Service (Amazon S3) bucket. DSSE-KMS is designed to meet National Security Agency CNSSP standards.

Simplify How You Manage Authorization in Your Applications with Amazon Verified Permissions – Now Generally Available

When developing an application or integrating an existing one into a new environment, user authentication and authorization can be difficult to implement correctly. Now, developers can use an external identity provider like Amazon Cognito for authentication, and Amazon Verified Permissions for authorization. Amazon Verified Permissions makes authorization logic easier to manage.
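
To illustrate what an authorization check might look like in practice, here is a minimal TypeScript sketch using the AWS SDK for JavaScript v3 and the Verified Permissions IsAuthorized API. It assumes a policy store and Cedar policies already exist; the policy store ID and the entity type names are placeholders.

```typescript
import {
  VerifiedPermissionsClient,
  IsAuthorizedCommand,
} from "@aws-sdk/client-verifiedpermissions";

const client = new VerifiedPermissionsClient({ region: "eu-west-1" });

// Ask Verified Permissions whether a given user may perform an action on a resource.
// The policy store ID and the MyApp::* entity types below are placeholders.
async function canViewOrder(userId: string, orderId: string): Promise<boolean> {
  const response = await client.send(
    new IsAuthorizedCommand({
      policyStoreId: "REPLACE_WITH_POLICY_STORE_ID",
      principal: { entityType: "MyApp::User", entityId: userId },
      action: { actionType: "MyApp::Action", actionId: "ViewOrder" },
      resource: { entityType: "MyApp::Order", entityId: orderId },
    })
  );
  return response.decision === "ALLOW";
}

canViewOrder("alice", "order-123").then((allowed) =>
  console.log(allowed ? "Access granted" : "Access denied")
);
```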

AWS Week in Review – Automate DLQ Redrive for SQS, Lambda Supports Ruby 3.2, and More – June 12, 2023

The latest news and updates from AWS includes the ability to Automate DLQ Redrive for SQS, Lambda now supports Ruby 3.2, and more. AWS Summits are also taking place around the world, giving attendees the opportunity to learn more about event-driven architectures with serverless.

Move Payment Processing to the Cloud with AWS Payment Cryptography

Cryptography is used in many aspects of daily life, and AWS provides multiple services and capabilities to help manage keys and encryption. With AWS Payment Cryptography, you can move payment processing to the cloud and improve the security and reliability of your payments.

At KeyCore, we understand the challenge of developing applications or integrating existing ones into new environments. Our team of AWS experts can help you manage your authentication and authorization logic, as well as help you take advantage of the latest AWS services and capabilities. Contact us today to learn more.

Read the full blog posts from AWS

Containers

Container Cost & Compatibility Monitoring with Amazon EKS

Amazon Elastic Kubernetes Service (Amazon EKS) is a great platform for running containerized applications on the AWS cloud. And with the Amazon Managed Service for Prometheus (AMP) you can easily monitor and provide alerts on containerized applications and infrastructure at scale.

Integrating Kubecost with Amazon Managed Service for Prometheus

Kubecost is a third-party tool that helps with multi-cluster cost monitoring for Amazon EKS. By integrating Kubecost with Amazon Managed Service for Prometheus (AMP) you can gain granular visibility into your Amazon EKS clusters, giving you a deeper understanding of your costs.

Validate Third-Party Software with Conformitron

Conformitron is a tool that helps validate third-party software for compatibility with Amazon EKS and Amazon EKS Anywhere. This allows customers to ensure applications are compatible with their chosen container orchestrator upfront, before investing time and money. Continuous compatibility testing can be run across all relevant dimensions, such as Amazon EKS version, operating system, and Kubernetes version.

Container Insights Cost Optimizations with Amazon EKS

Amazon CloudWatch Container Insights helps to collect, aggregate, and analyze metrics and logs for container-based applications and infrastructure on AWS. It captures metrics for CPU, memory, disk, and network, along with diagnostic data like container restart failures. This allows you to efficiently isolate and resolve problems.

KeyCore Can Help

At KeyCore, we provide professional and managed services for Amazon EKS. Our team of experienced AWS consultants can help you get the most out of Amazon EKS, from cost optimization and monitoring to compatibility testing and troubleshooting. Contact us today to learn more about our services and see how we can help.

Read the full blog posts from AWS

AWS Quantum Technologies Blog

Amazon Braket Gets SOC-2 Security Certification

Amazon Braket now has a SOC-2 report attesting to the security, availability, confidentiality, and privacy of its services. This latest report is a testament to Amazon Braket’s commitment to security, which is paramount in the development of quantum computing.

What is SOC-2?

Service Organization Control (SOC) 2 is an auditing procedure from the American Institute of Certified Public Accountants (AICPA). It measures how well a service organization’s system is designed to meet the security and availability requirements set by the AICPA.

Why is SOC-2 Important?

The SOC-2 report is an assurance that Amazon Braket is taking the security, confidentiality, and privacy of its services seriously. It also provides customers and other stakeholders with additional assurance that their data is secure.

KeyCore and Amazon Braket

At KeyCore, we specialize in Amazon Web Services (AWS) and provide professional services and managed services to our clients. We understand the importance of security and believe that quantum computing should be as secure as any other technology. We can help you get the most from Amazon Braket with our comprehensive services.

To learn more about KeyCore and our services, visit our website at https://www.keycore.dk.

Read the full blog posts from AWS

AWS Smart Business Blog

Small and Medium Businesses Developing a Modern Data Strategy with AWS

In today’s era of big data, small and medium-sized businesses (SMBs) are often inundated with data from a wide array of sources. With this comes the challenge of managing and making use of this data in the most effective way. According to Gartner, 60 percent of organizations fail to measure the costs of poor data quality, which results in reactive responses to data quality issues and missed opportunities due to low data usability.

Developing a Data Strategy

Having a robust data strategy is essential for SMBs to capitalize on the data they are collecting. The key to success is to establish a comprehensive data strategy and operational plan that outlines the decisions, processes, and investments needed to transform raw data into valuable insights. To get started, SMBs should consider the following steps:

1. Establish Data Governance and Data Quality

Organizations should ensure that they have an established data governance and data quality framework in place. This should include best practices for data collection, storage, and usage, as well as data quality metrics and processes for ensuring data accuracy.

2. Invest in Data Infrastructure

SMBs should invest in the right data infrastructure to ensure that their data can be collected, stored, and managed reliably. AWS Cloud services such as Amazon Redshift, Amazon S3, Amazon EMR, and Amazon Athena can be used to build a reliable and cost-effective data infrastructure.

3. Leverage Automation

To maximize efficiency, SMBs should leverage automation to streamline their data management processes. This can include automating data collection, cleansing, mining, and storage, as well as automating data analysis and insights generation.

4. Invest in Analytics

Having the right analytics tools in place is essential for SMBs to make sense of their data and turn it into valuable insights. Investing in analytics solutions such as Amazon QuickSight can help SMBs make sense of their data and uncover hidden trends and opportunities.

KeyCore Can Help

At KeyCore, we understand the importance of having a modern data strategy for SMBs. We provide comprehensive AWS services to help SMBs design and implement a data strategy that can unlock the true potential of their data. Our team of AWS certified professionals can help SMBs create a comprehensive data strategy and develop the necessary data infrastructure and analytics solutions to drive data-driven insights and decisions. Contact us today to learn more about how we can help you develop a modern data strategy.

Read the full blog posts from AWS

Official Database Blog of Amazon Web Services

Managing Partition Keys and Bulk Operations with DynamoDB and Amazon RDS

Amazon DynamoDB is a serverless NoSQL database service that provides fast and predictable performance with seamless scalability. Every table in DynamoDB has a schema which specifies if it has a simple partition key (for pure key-value lookups), or a partition key and sort key both (for more complex query patterns). Generating distinct partition keys efficiently is critical for DynamoDB performance and scalability.

Efficiently Generating Distinct Partition Keys

There are several methods for efficiently generating distinct partition keys. One approach is to apply a hash function, such as MD5 or SHA1, to a natural identifier and use the digest as the partition key, which spreads items evenly across partitions. Another is to combine multiple attributes, such as a primary identifier and a sort key, into the partition key; the resulting keys are distinct whenever the combination itself is unique.
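
To make the first approach concrete, here is a minimal TypeScript sketch using the AWS SDK for JavaScript v3. It hashes a customer identifier to derive a well-distributed partition key and combines it with an order identifier as the sort key. The Orders table and attribute names are hypothetical.

```typescript
import { createHash } from "node:crypto";
import { DynamoDBClient, PutItemCommand } from "@aws-sdk/client-dynamodb";

const client = new DynamoDBClient({ region: "eu-west-1" });

// Derive a partition key by hashing a natural identifier so items spread
// evenly across partitions, while keeping the original attributes on the item.
function partitionKeyFor(customerId: string): string {
  return createHash("sha1").update(customerId).digest("hex");
}

async function putOrder(customerId: string, orderId: string): Promise<void> {
  await client.send(
    new PutItemCommand({
      TableName: "Orders", // hypothetical table with pk (partition key) and sk (sort key)
      Item: {
        pk: { S: partitionKeyFor(customerId) },
        sk: { S: `ORDER#${orderId}` }, // customerId + orderId combination keeps keys distinct
        customerId: { S: customerId },
        orderId: { S: orderId },
      },
    })
  );
}

putOrder("customer-42", "order-123").catch(console.error);
```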

Rate Limiting Bulk Operations in DynamoDB Shell

DynamoDB Shell (ddbsh) is an open-source command line interface for Amazon DynamoDB. It provides a simple and intuitive environment for bulk operations, such as batch insert, delete, query, and update. To ensure that these operations are performed efficiently, ddbsh includes a rate limiting mechanism. This allows the user to specify a maximum number of operations to be performed in one batch, and it ensures that the operations are completed in a controlled manner.
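
ddbsh handles rate limiting for you from the command line. For readers scripting bulk writes directly against the DynamoDB API instead, the sketch below shows the same idea implemented client-side in TypeScript with the AWS SDK for JavaScript v3: items are written in batches of 25 with a fixed pause between batches. The table name and item shapes are placeholders, and retry handling for unprocessed items is omitted for brevity.

```typescript
import {
  DynamoDBClient,
  BatchWriteItemCommand,
  AttributeValue,
} from "@aws-sdk/client-dynamodb";
import { setTimeout as sleep } from "node:timers/promises";

const client = new DynamoDBClient({ region: "eu-west-1" });

// Write items in batches of 25 (the BatchWriteItem limit) and pause between
// batches so the table is not driven past the chosen rate.
async function rateLimitedBulkInsert(
  tableName: string,
  items: Record<string, AttributeValue>[],
  batchesPerSecond = 2
): Promise<void> {
  for (let i = 0; i < items.length; i += 25) {
    const batch = items
      .slice(i, i + 25)
      .map((item) => ({ PutRequest: { Item: item } }));

    // UnprocessedItems retry handling is omitted for brevity.
    await client.send(
      new BatchWriteItemCommand({ RequestItems: { [tableName]: batch } })
    );

    await sleep(1000 / batchesPerSecond); // simple fixed delay between batches
  }
}
```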

Snapshot, Restore, and Recovery with Amazon RDS

Amazon Relational Database Service (Amazon RDS) is a managed relational database service. It automates installation, storage provisioning, storage management, OS and database patching, and snapshot and restore of database instances. Offloading this undifferentiated heavy lifting of database infrastructure management to AWS lets teams focus on business logic, data, and other application-specific tasks.

Migrating SQL Server Database to Babelfish for Aurora PostgreSQL

Babelfish for Aurora PostgreSQL includes support for the SQL Server wire-protocol and T-SQL, which is the query language used in Microsoft SQL Server. This means that developers can use Babelfish to run their existing SQL Server applications on Amazon Aurora PostgreSQL-Compatible Edition without having to switch database drivers or completely rewrite their queries. To facilitate the migration process, the Bulk Copy Program (BCP) utility is used to transfer data from SQL Server databases to Babelfish for Aurora PostgreSQL.

KeyCore – Your Trusted Partner in AWS

At KeyCore, we provide both professional services and managed services for Amazon Web Services. Our team has deep AWS expertise and can help you with any of the topics discussed above. Our experts can assist you in efficiently generating partition keys, rate limiting bulk operations, snapshotting, restoring, and recovering databases, as well as migrating SQL Server databases to Babelfish for Aurora PostgreSQL.

Read the full blog posts from AWS

AWS Cloud Financial Management

Optimizing and saving on “other” services is a key part of cost control in AWS. One of the benefits of using AWS is greater visibility into costs, which can lead to cost optimization and cost savings. In this blog, we’ll dive deeper into the cost elements of “other” services and what can be done to optimize the spend.

Understanding Cost Elements of Other Services

It’s important to understand the cost elements of “other” services when optimizing and saving on them. By looking into the sources of these costs, you can identify opportunities for cost and performance optimization.

One of the advantages of using AWS for cost optimization is that it provides greater visibility into costs. This visibility can help to identify trends and patterns that might not be visible otherwise. For example, if the cost of a particular service is significantly higher than expected, it may be worthwhile to investigate the reasons why and look for potential solutions.

Cvent’s Cost Optimization Experiences

As an example, Cvent saved over $3M in less than two years by creating a cost-aware culture. By leveraging CUDOS and the CID Framework, Cvent was able to provide its centralized governance teams with greater visibility across all organizational spend. This allowed budget managers and engineers to decentralize the ownership of the budget details and shift from reactive spend investigation to proactive cost optimization during planning and deployment.

KeyCore’s Services

KeyCore can help organizations adopt an AWS cost optimization strategy and develop cost-aware cultures. Our team of AWS experts can provide guidance on the best practices for reducing costs and improving performance. We also offer managed services that can help to automate and optimize resource utilization for cost savings. In addition, our professional services team can provide tailored recommendations to ensure your organization’s AWS budget is optimized. Contact us today to learn more.

Read the full blog posts from AWS

AWS for Games Blog

Compiling Unreal Engine 5 Dedicated Servers on AWS Graviton EC2 Instances

Epic Games & AWS Collaboration

Epic Games is a leading interactive entertainment company and provider of Unreal Engine (UE), one of the most open and advanced real-time 3D creation tools. UE is responsible for powering some of the world’s leading games including Fortnite, and is also utilized by creators to deliver cutting-edge experiences across industries. Epic Games and AWS recently collaborated to build dedicated servers for UE5 on AWS Graviton EC2 instances.

Benefits of Dedicated Servers

Dedicated servers provide a more resilient and reliable host for game sessions. This method of hosting is often used for multiplayer games as it provides a stable in-game environment. With AWS Graviton EC2 instances, Epic Games can now easily deploy and manage a fleet of game servers and offer players seamless, low-latency gameplay. This enables game developers to build larger online communities and launch new features with less downtime.

Access to Advanced Features

With the combination of UE5 and AWS Graviton EC2 Instances, game developers gain access to a range of advanced features. This includes the ability to quickly deploy dedicated servers, scale up or down based on usage, and quickly access game features with minimal latency. Developers can also take advantage of features such as auto-scaling, which allows game servers to automatically scale up or down based on the number of players.

QXR Studios Leverages Amazon GameLift Anywhere

Game industry veteran QXR Studios is creating the cyberpunk, blockchain collectible card game “Metropolis Origins”. Amazon GameLift Anywhere, which allows developers to deploy dedicated game servers in just minutes, is being used by QXR Studios to accelerate game testing and development. This technology enables QXR Studios to deploy servers across AWS Regions and benefit from the low latency and cost-effectiveness of AWS.

KeyCore Can Help

KeyCore is an advanced AWS consultancy with professional and managed services. Our team of AWS experts can help game developers leverage the power of Amazon GameLift Anywhere and AWS Graviton EC2 Instances to quickly deploy and manage dedicated servers for their games. We can provide guidance on cost optimization, scaling, and other best practices to ensure your game is able to take full advantage of the technologies offered by AWS.

Read the full blog posts from AWS

AWS Training and Certification Blog

Resilience and Determination Lead to AWS Certified Data Analytics – Specialty Certification

ITSkills4U Learner Story: Resilience in Action

Anna Prorok is an IT professional with a master’s degree in applied mathematics, and the mother of four girls ranging in age from four to 13. In 2015, Anna put her career on pause, but in 2022, as her youngest daughter prepared for kindergarten, she began thinking about re-entering the workforce.

Unfortunately, the war in her home country of Ukraine interrupted her plans. Despite the challenges, Anna was determined to make progress towards her ambitions. She found ITSkills4U, an AWS training program, which enabled her to revitalize her skills and pursue a full-time IT role.

Gain Credibility and Advance Your Career in Analytics with AWS Certified Data Analytics – Specialty Certification

In the digital age, organizations have become increasingly reliant on data to make informed decisions. This creates a demand for professionals with expertise in data analysis and interpretation. Obtaining the AWS Certified Data Analytics – Specialty certification is an excellent way to gain credibility and progress in the field of analytics.

This blog outlines the steps to prepare for and earn the AWS Certified Data Analytics – Specialty certification. It also discusses the various career opportunities this certification can open up, even for individuals who are new to analytics.

How the AWS Certified Data Analytics – Specialty Certification Can Advance Your Career

The AWS Certified Data Analytics – Specialty certification validates an individual’s skills in designing and implementing AWS services to derive insights from data. Earning this certification demonstrates your expertise in Big Data and machine learning, as well as your ability to develop and maintain data solutions.

The AWS Certified Data Analytics – Specialty certification can open up a variety of career opportunities. You may be hired as a cloud data engineer, a data scientist, or a business intelligence analyst. The certification can also be used to advance your current role and take on more responsibilities.

Preparing for the AWS Certified Data Analytics – Specialty Certification Exam

The AWS Certified Data Analytics – Specialty certification exam includes a wide range of topics such as architecting infrastructure, data lake implementation, and machine learning. To prepare for the exam, you can access the AWS Certified Data Analytics – Specialty Exam Guide, which outlines the exam objectives and provides links to study materials such as whitepapers, sample questions, and online training.

You may also choose to take a course to prepare for the exam. ITSkills4U is an AWS training program that offers a comprehensive course on AWS Certified Data Analytics – Specialty. The course is designed to provide a comprehensive understanding of the topics covered on the exam and to prepare students to pass the exam.

KeyCore: Helping You Gain the Skills You Need for AWS Certification

At KeyCore, we provide both professional services and managed services that enable you to understand and leverage the power of the AWS Cloud. With our experts, you can gain the knowledge and experience you need to successfully pass the AWS Certified Data Analytics – Specialty exam. Our consultants can provide support in areas such as data architecture, data modeling, big data, and machine learning. Contact our team of experts to learn more.

Read the full blog posts from AWS

Microsoft Workloads on AWS

How to Upgrade and Optimize Microsoft Workloads on AWS

The End-of-Support Dilemma

Windows Server 2012 and 2012 R2 will reach end-of-support in October 2023. For organizations using these operating systems, this poses a serious problem. If you continue running them in production, you risk security vulnerabilities, poor performance, and lack of compliance.

Fortunately, you have two options when it comes to upgrading your Microsoft workloads on AWS. You can either perform an in-place, manual upgrade, or automate the process with AWS Systems Manager. These options are discussed in more detail below.

Manual Upgrade

The first step in performing a manual upgrade is to deploy Windows Server 2016 or 2019 on AWS. Since the upgrade process can be time-consuming, you can use AWS Launch Wizard to deploy the latest Windows Server to get up and running quickly.

Once the deployment is complete, the manual upgrade process itself involves setting up the destination server environment, setting up the source server environment, creating a backup of the source server, transferring the data from the source server to the destination server, and then finally completing the transition and performing post-migration cleanup.

To help organizations with this process, AWS provides step-by-step guidance in our documentation.

Automated Upgrade with AWS Systems Manager

While manual upgrades can be time-consuming, AWS Systems Manager allows you to automatically upgrade your Windows Server 2012 and 2012 R2 workloads to Windows Server 2016 or 2019. This can significantly reduce the effort and complexity of the upgrade process.

AWS Systems Manager automates the entire upgrade process, from locating and installing the latest server images, to backing up and restoring data, to verifying the upgrade of the server. It allows you to perform these tasks in an automated, repeatable fashion, thereby ensuring consistency and accuracy.

Optimizing Costs

AWS offers significant opportunities to optimize costs when running Microsoft workloads. With AWS’ unique hardware capabilities and cloud architecture, organizations can optimize their cloud costs and save money.

AWS also provides a number of cost optimization benefits, such as Reserved Instances, Spot Instances, and Auto Scaling. These features allow organizations to save money on compute power while ensuring their workloads remain available and performant.

KeyCore Can Help

At KeyCore, we specialize in helping organizations deploy, manage, and optimize their Microsoft workloads on AWS. Our team of experienced AWS experts can help you upgrade your Windows Server 2012 and 2012 R2 workloads, as well as leverage the cost optimization benefits offered by AWS.

Whether you’re looking to upgrade your existing workloads or take advantage of the cost optimization opportunities available through AWS, KeyCore can help. Contact us today to learn more.

Read the full blog posts from AWS

Official Big Data Blog of Amazon Web Services

Best Practices for Exploring Data with Natural Language in Amazon QuickSight

Enable Business Users to Ask Questions about Data Using Everyday Language
Amazon QuickSight is a unified BI service that provides modern, interactive dashboards, natural language querying, paginated reports, machine learning insights, and more. The natural language query function of Amazon QuickSight, Amazon QuickSight Q, enables business users to ask and answer questions about data using their everyday business language.

How to Use Amazon QuickSight Q
To use Amazon QuickSight Q, users must enable it in their QuickSight settings. After doing so, they can create natural language queries for their datasets using plain language instead of SQL or other query languages. QuickSight Q supports most of the standard data types and operators, as well as common data calculations.

Visualize Results
Once a query is made, QuickSight Q builds visualizations and dashboards that provide insight into the query results. Users can edit the visuals and add additional elements to their dashboards, such as charts, tables, and text boxes. They can also save the results and interact with them in real time.

KeyCore Can Help
At KeyCore, we help our clients leverage Amazon QuickSight to unlock the power of their data. Our consultants and managed services experts can help you set up and use QuickSight Q to make the most of your data. We can also help you build custom visualizations and dashboards that provide actionable insights into your data. Contact us today to learn more about how we can help you with Amazon QuickSight.

Read the full blog posts from AWS

AWS Compute Blog

Deploying an automated Amazon CloudWatch dashboard for AWS Outposts and Securing Connectivity from Public to Private

Introduction

AWS Outposts is a fully managed service that brings the same AWS infrastructure, services, APIs, and tools to virtually any data center, colocation space, manufacturing floor, or on-premises facility. Securing connectivity from public to private services often requires the use of a bastion host with a public IP address, which can be a security risk. EC2 Instance Connect Endpoint provides a secure connection from the public internet to private networks within an Amazon Virtual Private Cloud (Amazon VPC).

Deploying an automated Amazon CloudWatch dashboard for AWS Outposts with AWS CDK

AWS Cloud Development Kit (AWS CDK) is an open source software development framework to model and provision your cloud application resources using familiar programming languages. This post will discuss how to deploy an automated Amazon CloudWatch dashboard for your Outposts using AWS CDK.
Firstly, we need to import the necessary libraries to be used in the program. These libraries will allow us to interact with the AWS services to deploy and configure the dashboard. After the libraries have been imported, the AWS CDK is initialized and the resources will be deployed using the AWS CDK.
The dashboard for AWS Outposts consists of three main components. The first is an Amazon CloudWatch Dashboard that will be created with the AWS CDK and configured to display the metrics that you want to track for your Outposts. The second is an Amazon SNS topic which will be used to send notifications when critical metrics are breached. Finally, an Amazon CloudWatch Alarm will be created and configured to trigger the SNS topic when the metrics breach the configured thresholds.
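
As a rough sketch of how these three components could be wired together with the AWS CDK in TypeScript, the example below creates a dashboard, an SNS topic, and an alarm. The Outposts metric name, dimension value, and thresholds are illustrative placeholders and should be replaced with the metrics you actually want to track.

```typescript
import { Stack, StackProps, Duration } from "aws-cdk-lib";
import { Construct } from "constructs";
import * as cloudwatch from "aws-cdk-lib/aws-cloudwatch";
import * as cw_actions from "aws-cdk-lib/aws-cloudwatch-actions";
import * as sns from "aws-cdk-lib/aws-sns";

export class OutpostsDashboardStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // Metric from the AWS/Outposts namespace; the metric name and dimension
    // below are illustrative and should match the metrics you want to track.
    const connectedStatus = new cloudwatch.Metric({
      namespace: "AWS/Outposts",
      metricName: "ConnectedStatus",
      dimensionsMap: { OutpostId: "op-0123456789abcdef0" }, // placeholder Outpost ID
      statistic: "Average",
      period: Duration.minutes(5),
    });

    // Dashboard that displays the Outposts metric.
    const dashboard = new cloudwatch.Dashboard(this, "OutpostsDashboard");
    dashboard.addWidgets(
      new cloudwatch.GraphWidget({ title: "Outpost connectivity", left: [connectedStatus] })
    );

    // SNS topic that receives notifications when the alarm fires.
    const topic = new sns.Topic(this, "OutpostsAlarmTopic");

    // Alarm that triggers the topic when the metric breaches the threshold.
    const alarm = new cloudwatch.Alarm(this, "OutpostsConnectedAlarm", {
      metric: connectedStatus,
      threshold: 1,
      comparisonOperator: cloudwatch.ComparisonOperator.LESS_THAN_THRESHOLD,
      evaluationPeriods: 3,
    });
    alarm.addAlarmAction(new cw_actions.SnsAction(topic));
  }
}
```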

Secure Connectivity from Public to Private with EC2 Instance Connect Endpoint

EC2 Instance Connect Endpoint is a secure connection from the public internet to private networks within an Amazon Virtual Private Cloud (Amazon VPC). EC2 Instance Connect Endpoint makes it easier and more secure to access your Amazon EC2 instances from the public internet by providing a secure connection without needing to use a bastion host with a public IP address. EC2 Instance Connect Endpoint also provides an additional layer of security by requiring authentication for all requests and encryption for all responses.

To enable EC2 Instance Connect Endpoint, you can use the AWS Command Line Interface (CLI) or the AWS SDK for JavaScript. With the AWS CLI, you can create a new endpoint that will be associated with the VPC that your EC2 instance is launched in. The endpoint will have a private IP address and a DNS name that you can use to securely connect to your EC2 instances. With the AWS SDK for JavaScript, you can create an instance connect client and use that to securely connect to your EC2 instances.
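
For example, a hedged sketch of creating an endpoint with the AWS SDK for JavaScript v3 might look like the following. It assumes your SDK version includes the EC2 Instance Connect Endpoint APIs released with the feature; the subnet and security group IDs are placeholders.

```typescript
import { EC2Client, CreateInstanceConnectEndpointCommand } from "@aws-sdk/client-ec2";

const client = new EC2Client({ region: "eu-west-1" });

// Create an EC2 Instance Connect Endpoint in a private subnet so instances in
// the VPC can be reached without a bastion host or public IP addresses.
async function createEndpoint(subnetId: string, securityGroupId: string) {
  const response = await client.send(
    new CreateInstanceConnectEndpointCommand({
      SubnetId: subnetId,                  // private subnet in the target VPC
      SecurityGroupIds: [securityGroupId], // controls which instances the endpoint can reach
      PreserveClientIp: false,
    })
  );
  console.log("Endpoint ID:", response.InstanceConnectEndpoint?.InstanceConnectEndpointId);
}

createEndpoint("subnet-0123456789abcdef0", "sg-0123456789abcdef0").catch(console.error);
```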

Conclusion

AWS Outposts provides a fully managed service that brings the same AWS infrastructure, services, APIs, and tools to virtually any data center, colocation space, manufacturing floor, or on-premises facility. EC2 Instance Connect Endpoint provides a secure connection from the public internet to private networks within an Amazon Virtual Private Cloud (Amazon VPC). Amazon CloudWatch Dashboard, SNS topic, and CloudWatch Alarm can be deployed with AWS CDK to monitor and manage your Outposts.

At KeyCore, our AWS-Certified professionals can help you get the most out of AWS Outposts and EC2 Instance Connect Endpoint. Our team can help you develop the right solutions for your business and provide the necessary guidance to ensure that your cloud environment is secure and running smoothly.

Read the full blog posts from AWS

AWS for M&E Blog

Delay Live Streaming with AWS Elemental MediaPackage

Delaying a live stream broadcast is a key function for many content providers, for various reasons. These could include accommodating different timezones, or even implementing a delay to prevent spoilers. In this article, we’ll explore how to use AWS Elemental MediaPackage to delay a live streaming broadcast.

What is AWS Elemental MediaPackage?

AWS Elemental MediaPackage is a video origination and just-in-time packaging service. It allows users to ingest live streams, protect them with DRM, and package them into multiple formats for delivery to consumer devices. It can also be used to introduce a delay into a live video stream.

Delay Live Streams with AWS Elemental MediaPackage

Using AWS Elemental MediaPackage, you can delay live streams by creating two pipelines and two input endpoints. The first pipeline takes the output from the first input endpoint and passes it to the second endpoint. This pipeline is set to have a delay. The second pipeline takes the output of the second input endpoint and passes it along for delivery.

Getting Started with AWS Elemental MediaPackage

To get started with AWS Elemental MediaPackage, you create two input endpoints and two pipelines. The first pipeline takes your live stream input and outputs a stream, with a delay that can be configured up to 90 seconds, to the second input endpoint. The second pipeline takes the output of the second input endpoint and passes it along for delivery.
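
As a complement to the pipeline setup described in the post, MediaPackage origin endpoints also expose a time delay setting. The TypeScript sketch below (AWS SDK for JavaScript v3) creates an HLS origin endpoint with a delay applied via the TimeDelaySeconds parameter; the channel ID, endpoint ID, and packaging settings are placeholders, and you should confirm the parameter against the current MediaPackage API.

```typescript
import {
  MediaPackageClient,
  CreateOriginEndpointCommand,
} from "@aws-sdk/client-mediapackage";

const client = new MediaPackageClient({ region: "eu-west-1" });

// Create an HLS origin endpoint on an existing MediaPackage channel with a
// playback delay. Channel and endpoint IDs are placeholders.
async function createDelayedEndpoint(channelId: string, delaySeconds: number) {
  const response = await client.send(
    new CreateOriginEndpointCommand({
      ChannelId: channelId,
      Id: `${channelId}-delayed`,
      TimeDelaySeconds: delaySeconds, // enforce a delay on live playback
      HlsPackage: {
        SegmentDurationSeconds: 6,
        PlaylistWindowSeconds: 60,
      },
    })
  );
  console.log("Endpoint URL:", response.Url);
}

createDelayedEndpoint("live-channel-1", 90).catch(console.error);
```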

Conclusion and How KeyCore can Help

Introducing a delay into a live or real-time streaming video is a critical function for many content providers, and AWS Elemental MediaPackage is the perfect tool for the job. With it, you can easily delay your streams to accommodate different timezones or prevent spoilers.

If you need help setting up a live streaming system with AWS Elemental MediaPackage, KeyCore is here to help. Our experienced team of AWS certified professionals can provide you with the expertise and guidance you need to get everything set up and running smoothly. We offer both managed and professional services, and can help you get the most out of your streaming setup. Contact us today to learn more about what we can do for you.

Read the full blog posts from AWS

AWS Storage Blog

AWS Storage Blog

When transferring mission-critical data, businesses need secure, scalable solutions that won’t delay operations. Healthcare and life sciences organizations have a particular need for these solutions since they must comply with certain regulations. Amazon Web Services (AWS) offers a variety of storage solutions that help businesses manage and protect their data, deliver it securely, and ensure maximum availability and scalability. In this blog, we explore some of these solutions and how KeyCore can help your business implement them.

Regeneron’s Secure and Scalable File Transfer Service

Regeneron, a healthcare and life sciences organization, was able to build a secure and scalable file transfer service with AWS Transfer Family. AWS Transfer Family provides a variety of protocols that securely and efficiently move data between the cloud and on-premises, and is fully compliant with regulations. Additionally, it has high scalability, allowing for the movement of large files quickly and securely.

Private DNS Support for Amazon S3 with AWS PrivateLink

Compliance requirements often mandate private connections between on-premises applications and cloud storage. To meet these requirements, customers can use AWS PrivateLink over either AWS Direct Connect or AWS Site-to-Site VPN. This ensures that data is transmitted directly to and from AWS without traversing the public internet. AWS PrivateLink also offers private DNS support for Amazon S3.
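
As an illustration, the TypeScript sketch below (AWS SDK for JavaScript v3) creates an interface VPC endpoint for Amazon S3 with private DNS enabled. The VPC, subnet, and Region values are placeholders, and depending on your setup additional DNS options and an existing S3 gateway endpoint may be required for private DNS names to resolve correctly.

```typescript
import { EC2Client, CreateVpcEndpointCommand } from "@aws-sdk/client-ec2";

const client = new EC2Client({ region: "eu-west-1" });

// Create an interface VPC endpoint for Amazon S3 with private DNS enabled, so
// on-premises clients connected via Direct Connect or VPN resolve S3 names privately.
async function createS3InterfaceEndpoint(vpcId: string, subnetIds: string[]) {
  const response = await client.send(
    new CreateVpcEndpointCommand({
      VpcId: vpcId,
      ServiceName: "com.amazonaws.eu-west-1.s3", // adjust the Region as needed
      VpcEndpointType: "Interface",
      SubnetIds: subnetIds,
      PrivateDnsEnabled: true,
    })
  );
  console.log("VPC endpoint ID:", response.VpcEndpoint?.VpcEndpointId);
}

createS3InterfaceEndpoint("vpc-0123456789abcdef0", ["subnet-0123456789abcdef0"]).catch(
  console.error
);
```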

Veritas Alta Application Resiliency and Amazon EBS

The increased adoption of cloud computing has transformed the way enterprises manage and protect their mission critical applications and data. Veritas Alta Application Resiliency with Amazon Elastic Block Store (EBS) helps enhance data availability and scalability. This solution ensures that critical data remains accessible in case of an interruption, and it also facilitates scalability of applications while maintaining performance and compliance with regulations.

Globe Telecom’s Migration of 7.2 PB of Hadoop Data

Globe Telecom used AWS DataSync at scale to migrate 7.2 PB of Hadoop data. AWS DataSync supports the migration of large datasets with minimal impact on transfer speeds. It also ensures secure and timely data migration, resulting in increased operational efficiency. Additionally, it simplifies the migration process by providing a unified interface for source and destination endpoints.

Finding Public S3 Buckets in Your AWS Account

Data security is a critical business activity. With cloud usage increasing, it’s important to verify that you aren’t inadvertently exposing or sharing data publicly. Under the Shared Responsibility Model, AWS is responsible for protecting the infrastructure that runs AWS services, while customers are responsible for ensuring the security of their data in the cloud. KeyCore can help you find any public S3 buckets in your AWS account.
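
As a starting point, the TypeScript sketch below (AWS SDK for JavaScript v3) lists the buckets in an account and flags those whose bucket policy makes them public. It only checks the bucket policy status; ACLs and account-level public access block settings would also need to be reviewed for a complete picture.

```typescript
import {
  S3Client,
  ListBucketsCommand,
  GetBucketPolicyStatusCommand,
} from "@aws-sdk/client-s3";

const client = new S3Client({ region: "eu-west-1" });

// List every bucket in the account and flag those whose bucket policy makes them public.
async function findPublicBuckets(): Promise<string[]> {
  const publicBuckets: string[] = [];
  const { Buckets = [] } = await client.send(new ListBucketsCommand({}));

  for (const bucket of Buckets) {
    if (!bucket.Name) continue;
    try {
      const status = await client.send(
        new GetBucketPolicyStatusCommand({ Bucket: bucket.Name })
      );
      if (status.PolicyStatus?.IsPublic) {
        publicBuckets.push(bucket.Name);
      }
    } catch {
      // No bucket policy, or the bucket lives in another Region – skip it here.
    }
  }
  return publicBuckets;
}

findPublicBuckets().then((buckets) => console.log("Public buckets:", buckets));
```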

AWS offers a variety of storage solutions designed to help businesses manage and protect their data, deliver it securely, and ensure maximum availability and scalability. KeyCore can help you implement the AWS storage solutions that best suit your business needs. Contact us today to learn more.

Read the full blog posts from AWS

AWS Architecture Blog

Disaster Recovery for Oracle Database on Amazon EC2 with Fast-Start Failover

High availability is essential for modern organizations to avoid disruptions to business-critical applications. Enterprises must prioritize scalability and availability of their database, network, servers, and storage environments to prevent downtime. For organizations who want to avoid major application changes, Oracle Real Application Clusters (RAC) is an option for providing high availability and scalability to the Oracle database.

How Does Oracle RAC Provide High Availability?

Oracle RAC is an add-on to the Oracle Database that provides high availability, scalability, and performance to your Oracle environment. Oracle RAC runs multiple database instances on separate nodes against a single shared database, so that if one node fails, the surviving instances take over and continue processing requests. This allows Oracle RAC to keep the Oracle database available and scalable in case of node failure.

Benefits of Disaster Recovery on Amazon EC2

By using Amazon EC2 for disaster recovery, organizations can leverage the cloud’s scalability and reliability to provide a resilient environment for their Oracle RAC databases. With Amazon EC2, organizations can benefit from the ability to quickly scale up or down their compute resources on demand, as well as the ability to replicate their Oracle RAC databases across multiple Availability Zones for enhanced disaster recovery capabilities. Additionally, Amazon EC2 provides organizations with the ability to deploy their Oracle RAC databases in a variety of different operating systems, making it easier to deploy the Oracle RAC databases in different environments.

Fast-Start Failover for Oracle RAC on Amazon EC2

Fast-Start Failover (FSFO) is an Oracle Data Guard feature that can be used alongside Oracle RAC to automatically fail over from the primary database to a synchronized standby when the primary becomes unavailable. With FSFO, organizations can quickly and automatically redirect their workload to the standby in the event of a failure, minimizing the impact on the Oracle environment. FSFO also makes it straightforward to reinstate the former primary database once it becomes available again.

KeyCore Can Help Implement Disaster Recovery for Oracle Database on Amazon EC2

KeyCore is a leading Danish AWS consultancy that provides professional and managed services for AWS customers. Our team of experienced AWS consultants can help you implement disaster recovery for Oracle Database on Amazon EC2 so that you can take advantage of the cloud’s scalability and reliability. We can help you set up Fast-Start Failover to ensure your Oracle RAC databases are resilient in the event of a node failure. Contact us today to learn more about how KeyCore can help you implement disaster recovery for your Oracle Database on Amazon EC2.

Read the full blog posts from AWS

AWS Partner Network (APN) Blog

Integrating SAML 2.0 Federation into AWS Organization with Azure AD

Many enterprises are eager to streamline identity management by introducing a single identity provider for their multi-cloud approach. This is where having a federated identity using SAML 2.0 authentication might come in handy. To understand the integration of single sign-on with Azure Active Directory (AD), let’s consider a migration project led by Devoteam A Cloud. They presented a client with two options for integrating SAML 2.0 federation into their AWS Organization using Azure AD.

Securing Sensitive Data with Collibra Protect and AWS Lake Formation

Ensuring the security and proper management of sensitive data is a priority for any organization. To meet this challenge, Collibra Protect and AWS Lake Formation offer a powerful combination. Collibra Protect, part of the Collibra Data Intelligence Cloud, provides a secure way of protecting sensitive data while making it available to specified groups of users. Meanwhile, AWS Lake Formation is a fully managed serverless service that simplifies the process of building secure data lakes.

Introducing Industry Blueprints for Data & AI

The introduction of industry blueprints is driven by the need to address various challenges in packaging solutions for specific industries. AWS Industry Blueprints for Data & AI (Preview) is an open-source initiative from AWS that offers a collection of building components, including code modules and solution accelerators, to facilitate the configuration and deployment of tailored components that help various industry verticals turn data into insights.

AWS Partner Analytics Dashboard: A 360-Degree View of Your AWS Business

In an effort to transform the partner experience, AWS has introduced the Partner Analytics Dashboard. This feature provides partners with a 360-degree view of their AWS business, including insight into their opportunity pipeline and funding benefits. The dashboard also provides transparency into the data needed to have high-level strategic conversations, both internally and with AWS teams.

Risk-Based, Fine-Grained Authorization with Transmit Security and Amazon Verified Permissions

When it comes to managing and securing account access, it can feel like navigating a complex landmine of risk. To simplify the process, organizations can take a risk score calculated by Transmit Security and use it as an input to an authorization decision made by Amazon Verified Permissions. This service delivers a pre-built system that simplifies policy-based access control and is flexible enough to address the most advanced authorization requirements for custom applications.

Controlling Access to Amazon API Gateway with CyberArk Identity and Amazon Verified Permissions

Organizations are often looking for ways to secure access to resources by adding logic to make decisions when handling a user request. Amazon Verified Permissions is a managed authorization service that provides a scalable, fine-grained permissions management and authorization service for custom apps. CyberArk Identity works seamlessly as the identity provider with Amazon Verified Permissions, allowing organizations to control access to Amazon API Gateway and ensure least privilege access.

Applying Fine-Grained Authorization to Legacy Apps with Strata Identity Orchestration and Amazon Verified Permissions

Amazon Verified Permissions is a fine-grained authorization service for developers building custom applications, and is a great tool for achieving a zero-trust architecture. Strata Identity’s Maverics Identity Orchestration platform allows customers to simplify cloud migration and modernization projects by augmenting datasets to ensure comprehensive policy enforcement. This makes it easy to apply fine-grained authorization to legacy apps and ensure the highest security standards.

Achieve Faster Growth and Scale with AWS Built-in Partner Solutions

AWS built-in partner solutions are a great way to help customers achieve faster growth and scale. These solutions integrate automatically with AWS foundational services, streamlining the deployment experience and simplifying configuration. What’s more, they also increase security posture by leveraging cloud foundational domains. All of this is available in AWS Marketplace.

Use Matillion Data Loader for Change Data Capture Loading to Amazon Redshift Serverless

Businesses often need to use change data capture (CDC) to quickly and easily load incremental data to data warehouses. Matillion Data Loader can make the process of loading data into Amazon Redshift Serverless easier. This post will walk through an example of CDC loading from PostgreSQL to Amazon Redshift Serverless as the destination.

Streamline Your HIPAA Security Program on AWS with Dash ComplyOps

Healthcare organizations and software providers that build and manage healthcare workloads must formulate the appropriate strategies to establish an effective security and compliance program. Dash ComplyOps and AWS-native services can help businesses automate compliance efforts and build, monitor, and maintain a robust HIPAA security program across AWS cloud environments.

At KeyCore, we have a wealth of experience in setting up and maintaining secure infrastructures and cloud environments. Our team of experts can help your organization design and implement secure, scalable solutions that adhere to industry best practices and HIPAA security standards. Get in touch with us today to learn more about how our team can help you streamline your HIPAA security program on AWS.

Read the full blog posts from AWS

AWS Cloud Enterprise Strategy Blog

Multicloud Strategies: Proven Practices for Your Enterprise

As an Enterprise Strategist, it’s common to find confusion, false certainty, and tentativeness when it comes to multicloud strategies. Companies are presented with conflicting messaging: never adopt a multicloud approach, or don’t miss out because “everyone is switching to multicloud.” While there are good reasons both for pursuing and for avoiding multicloud strategies, the question is: How can you determine which approach is best for your organization?

The key is to start by understanding the primary advantages and disadvantages of a multicloud environment. By examining the pros and cons of multicloud, you can make an informed decision about which approach is most suitable for your organization.

What Are the Benefits of Multicloud?

A multicloud environment offers several benefits, including the ability to leverage the best services from different cloud providers. This can be an attractive option for organizations that need services that are available on multiple cloud platforms. Additionally, multicloud strategies allow organizations to optimize their cloud costs by taking advantage of providers with different price points and services.

A multicloud approach also provides organizations with increased security and reliability. With this strategy, organizations can implement multiple layers of security while also mitigating the risk of service outages or downtime. Furthermore, a multicloud strategy ensures that organizations have more control over their data. This allows organizations to be more agile when making changes to their cloud infrastructure.

What Are the Drawbacks of Multicloud?

While multicloud strategies offer many advantages, there are also drawbacks that must be taken into consideration. First, a multicloud approach can be more complex and time-consuming to manage. Organizations must be prepared to invest in additional resources to ensure that the environment is properly configured and managed.

Second, a multicloud strategy requires organizations to maintain multiple cloud providers. This can increase the cost of cloud services and may require organizations to have multiple contracts with different providers. Additionally, it can be difficult to keep track of all the different services and providers in a multicloud environment.

Finally, a multicloud strategy can lead to fragmentation. This can limit the ability of an organization to standardize cloud services and take advantage of best practices across the organization.

How Can KeyCore Help?

KeyCore can help your organization develop and implement a successful multicloud strategy. Our team of experienced AWS experts is adept at helping organizations leverage the best services from different cloud providers. We can help you optimize your cloud costs and take advantage of providers with different price points and services. Our team can also help you maintain your multicloud environment, ensure that it is properly configured, and track all the different services and providers.

With our expert guidance, you can take advantage of the benefits of multicloud while avoiding the drawbacks. Let the experts at KeyCore help you develop a multicloud strategy that works for your organization. Contact us today to get started.

Read the full blog posts from AWS

AWS HPC Blog

Deploy Predictive Models and Simulations at Scale with AWS TwinFlow

AWS TwinFlow is an open source framework that enables users to build and deploy predictive models with heterogeneous compute pipelines on AWS. This framework offers a versatile solution for scenarios such as engineering design, scenario analysis, systems analysis, and digital twins. This post outlines the benefits and features of AWS TwinFlow, as well as how KeyCore can help you get the most out of using it.

What is AWS TwinFlow?

AWS TwinFlow is a cloud-native, open source framework for building and deploying predictive models with heterogeneous compute pipelines on AWS. It simplifies the process of deploying predictive models and simulations at scale by providing a quick, easy-to-use platform for developers to deploy their models.

Benefits of Using AWS TwinFlow

AWS TwinFlow offers a number of benefits for developers looking to deploy predictive models and simulations. It provides an easy-to-use platform for deploying predictive models, with no need for manual infrastructure setup. It also supports heterogeneous compute pipelines, allowing users to deploy their models on a variety of compute types. Finally, it is open source, meaning that developers can customize the framework to their needs.

Features of AWS TwinFlow

AWS TwinFlow offers a variety of features to help developers quickly deploy predictive models and simulations. It includes a suite of tools such as data preprocessing, scaling, and deployment. It also supports cloud-native services such as Amazon SageMaker, Amazon EC2, and Amazon EMR. Additionally, it supports a variety of compute types, such as GPUs, FPGAs, and CPUs.

KeyCore Can Help

KeyCore can help you get the most out of using AWS TwinFlow. Our AWS experts can help you build and deploy predictive models and simulations with AWS TwinFlow quickly and easily. We can also work with you to customize the framework to meet your specific needs. Contact us today to learn more about how we can help you get the most out of using AWS TwinFlow.

Read the full blog posts from AWS

AWS Cloud Operations & Migrations Blog

Announcing Live Tail, Customizations, Access Privilege Reports, Patch Baselines, Lake Dashboards, Profiles, and Recording Exclusions with AWS

Live Tail Feature for Amazon CloudWatch Logs

Amazon CloudWatch Logs has just released a new feature called Live Tail that makes it easier to monitor log streams. This feature allows customers to view log streams as they are generated and can help them quickly identify issues that need attention. Customers can view the log stream content for up to 10 minutes in the past and can also set up alerting when certain thresholds are exceeded.
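
For teams that want to consume Live Tail programmatically rather than through the console, the following is a minimal TypeScript sketch with the AWS SDK for JavaScript v3. It assumes the StartLiveTailCommand event-stream shape exposed by @aws-sdk/client-cloudwatch-logs; the log group ARN and filter pattern are placeholders.

```typescript
// A minimal sketch of tailing a log group with the Live Tail API, assuming the
// StartLiveTailCommand shape in @aws-sdk/client-cloudwatch-logs; verify names
// against the SDK version you use.
import {
  CloudWatchLogsClient,
  StartLiveTailCommand,
} from "@aws-sdk/client-cloudwatch-logs";

const client = new CloudWatchLogsClient({ region: "eu-west-1" });

async function tailLogGroup(logGroupArn: string): Promise<void> {
  const { responseStream } = await client.send(
    new StartLiveTailCommand({
      logGroupIdentifiers: [logGroupArn],  // ARN of the log group to tail (placeholder)
      logEventFilterPattern: "ERROR",      // only stream matching events
    })
  );

  // The response is an event stream; sessionUpdate events carry new log lines.
  for await (const event of responseStream ?? []) {
    if (event.sessionUpdate?.sessionResults) {
      for (const logEvent of event.sessionUpdate.sessionResults) {
        console.log(logEvent.timestamp, logEvent.message);
      }
    }
  }
}
```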

Simplify Infrastructure Deployments with Customizations for AWS Control Tower and AWS Serverless Application Model

Customers want more flexibility and simpler ways to manage their AWS accounts. AWS offers several options to customize their account deployments, such as the Account Factory Customization (AFC) native to AWS Control Tower, or Customizations for Control Tower (CfCT). This blog post looks at CfCT, which allows customers to customize their account deployments across multiple accounts with automated deployments.

Generate User Access Privilege Reports with AWS Audit Manager

Meeting compliance regulations is an important part of any cloud infrastructure. A key element of compliance is producing a user privilege and access report. Auditors use these reports to make sure permissions are locked down at a granular level. AWS Audit Manager helps customers assess and audit their AWS account and generate the necessary user privilege and access reports. This helps customers ensure their accounts meet their compliance requirements.
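
As a rough illustration of how report generation can be automated, the sketch below creates an assessment report from an existing Audit Manager assessment using the AWS SDK for JavaScript v3. The assessment ID is a placeholder, and the exact flow used in the post for user access privilege reports may differ.

```typescript
// A heavily hedged sketch of generating an assessment report from an existing
// AWS Audit Manager assessment; the assessment ID is a placeholder.
import {
  AuditManagerClient,
  CreateAssessmentReportCommand,
} from "@aws-sdk/client-auditmanager";

const auditManager = new AuditManagerClient({ region: "eu-west-1" });

const { assessmentReport } = await auditManager.send(
  new CreateAssessmentReportCommand({
    assessmentId: "YOUR_ASSESSMENT_ID",   // placeholder assessment ID
    name: "user-access-privilege-report-2023-06",
    description: "Evidence-backed user privilege and access report for auditors",
  })
);

console.log(assessmentReport?.id, assessmentReport?.status);
```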

Automate Updating Approval Cut Off Dates for Patch Manager Patch Baselines

AWS Systems Manager Patch Manager makes it easier for customers to patch their Linux and Windows managed nodes in AWS and hybrid environments. Patch baselines are a major part of the Patch Manager, and allow customers to control which patches are approved or rejected during installation. The approval rules in these patch baselines have a parameter called the Auto Approval Cut Off Date, which can be automated to update regularly.
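
The sketch below shows one way to roll that cutoff date forward with the AWS SDK for JavaScript v3. The baseline ID and patch filter values are placeholders, and because UpdatePatchBaseline replaces the supplied ApprovalRules, a real implementation would read the existing rules first and only change the date.

```typescript
// A hedged sketch of moving a patch baseline's auto-approval cutoff date.
import { SSMClient, UpdatePatchBaselineCommand } from "@aws-sdk/client-ssm";

const ssm = new SSMClient({ region: "eu-west-1" });

async function moveCutoffDate(baselineId: string, cutoff: string): Promise<void> {
  await ssm.send(
    new UpdatePatchBaselineCommand({
      BaselineId: baselineId,             // e.g. "pb-0123456789abcdef0" (placeholder)
      ApprovalRules: {
        PatchRules: [
          {
            PatchFilterGroup: {
              PatchFilters: [{ Key: "CLASSIFICATION", Values: ["SecurityUpdates"] }],
            },
            ApproveUntilDate: cutoff,     // e.g. "2023-07-01" – the cutoff to automate
            ComplianceLevel: "CRITICAL",
          },
        ],
      },
    })
  );
}
```

Scheduling a call like this from an EventBridge rule that triggers a Lambda function keeps the cutoff date moving forward without manual updates.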

Announcing AWS CloudTrail Lake Dashboards

In January 2022, AWS announced the general availability of CloudTrail Lake, which allows customers to store and query their activity logs for auditing, security investigation and operational troubleshooting. To make this easier, AWS has released CloudTrail Lake dashboards that give customers visual insights into their CloudTrail data and allow them to drill down into the data to identify any issues.
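
The dashboards are backed by the same SQL-style queries customers can run themselves. Below is a minimal sketch of that kind of query with the AWS SDK for JavaScript v3; the event data store ID is a placeholder, and the SQL can be adjusted to reproduce a specific dashboard widget.

```typescript
// A minimal sketch of a CloudTrail Lake query using @aws-sdk/client-cloudtrail.
import {
  CloudTrailClient,
  StartQueryCommand,
  GetQueryResultsCommand,
} from "@aws-sdk/client-cloudtrail";

const cloudtrail = new CloudTrailClient({ region: "eu-west-1" });

async function topApiCallers(eventDataStoreId: string) {
  const { QueryId } = await cloudtrail.send(
    new StartQueryCommand({
      QueryStatement: `
        SELECT userIdentity.arn, COUNT(*) AS calls
        FROM ${eventDataStoreId}
        WHERE eventTime > '2023-06-01 00:00:00'
        GROUP BY userIdentity.arn
        ORDER BY calls DESC
        LIMIT 10`,
    })
  );

  // Fetch results; a production version would poll the query status first.
  const results = await cloudtrail.send(new GetQueryResultsCommand({ QueryId }));
  return results.QueryResultRows;
}
```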

Prioritize Business-Critical Needs with the Profiles Feature in the AWS Well-Architected Tool

The AWS Well-Architected Framework is a collection of best practices that helps cloud architects build and operate secure, high-performing, resilient, and efficient infrastructure for their applications. The Well-Architected Framework Review (WAFR) allows organizations to measure their workloads against those best practices. The new Profiles feature in the AWS Well-Architected Tool lets customers declare their business-critical priorities so that reviews focus on the questions that matter most, making it a valuable tool for keeping infrastructure secure and optimized.

Announcing AWS Config Now Supports Recording Exclusions by Resource Type

AWS Config is a service that tracks configuration changes of AWS resources. It uses the configuration recorder to detect changes and capture them as configuration items. By default, the configuration recorder records all changes, but customers can now set it up to exclude certain resource types from being recorded. This makes it easier for customers to create more specific views of their AWS resources.
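
A hedged sketch of what this looks like with the AWS SDK for JavaScript v3 follows, assuming the exclusionByResourceTypes and recordingStrategy fields introduced with this launch; the role ARN and excluded resource types are placeholders.

```typescript
// A hedged sketch of excluding specific resource types from the AWS Config
// recorder; check the client-config-service typings for the exact shape.
import {
  ConfigServiceClient,
  PutConfigurationRecorderCommand,
} from "@aws-sdk/client-config-service";

const config = new ConfigServiceClient({ region: "eu-west-1" });

await config.send(
  new PutConfigurationRecorderCommand({
    ConfigurationRecorder: {
      name: "default",
      roleARN: "arn:aws:iam::111122223333:role/aws-config-role", // placeholder role
      recordingGroup: {
        // Record everything except the listed resource types.
        exclusionByResourceTypes: {
          resourceTypes: ["AWS::EC2::NetworkInterface", "AWS::CloudFormation::Stack"],
        },
        recordingStrategy: { useOnly: "EXCLUSION_BY_RESOURCE_TYPES" },
      },
    },
  })
);
```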

At KeyCore, our team of AWS certified experts help customers ensure their cloud deployments are secure, efficient, and compliant. We can assist customers with implementation of the AWS Well-Architected Framework to identify and resolve issues, and with setting up AWS Config to track their cloud infrastructure changes. Don’t hesitate to get in touch with us to find out more about how we can help.

Read the full blog posts from AWS

AWS for Industries

Track customer traffic in aisles and cash counters using Computer Vision

The retail industry has seen dramatic changes in the last two decades, with the rise of ecommerce, digital promotions, and targeted marketing. Technology has been central to keeping up with customer expectations, and Computer Vision is now being used in retail in a variety of ways to give customers personalized experiences, reduce costs, and increase sustainability.

Computer Vision technology can be used to track customer traffic in aisles and cash counters. This helps retailers understand customer behavior and preferences, and track sales patterns, giving retailers the insight to optimize their stores and maximize profits. It also helps them to create more efficient layouts, identify bottleneck areas, and increase safety and security by managing the number of customers in a store at any given time.

Computer Vision can also be used to monitor shelf stocks, detect out-of-stock items, and ensure that shelves are correctly stocked with the right items. This helps to reduce losses due to theft or misplacement of items, and improve customer satisfaction. Additionally, it can be used to track customer interactions with products, such as how long a customer spends looking at a product, whether they pick up a product, or what product combinations they are interested in.

Generative AI for Telcos: taking customer experience and productivity to the next level

According to a recent survey, 21% of CEOs have named Artificial Intelligence as the top disruptive technology that will impact their industry in the next three years. Telcos are no exception in recognizing AI's immense potential to take customer experience and productivity to the next level.

AI-driven technologies such as conversational AI, natural language processing, and machine learning can be used to provide faster and more efficient customer service. This, in turn, can help telcos increase customer satisfaction and loyalty, as well as reduce operational costs. AI-driven technologies can also be used to automate network operations, reducing manual errors and freeing up human resources for more strategic tasks.

AI can also be used to develop more powerful and innovative applications. For example, telcos can use AI to develop applications that use predictive analytics to anticipate customer needs and provide tailored offers. This can help telcos gain a competitive edge in the market and improve customer experience.

Telco Workload Placement and Configuration Updates with Lambda and inotify

To meet performance requirements, mobile service providers often need to deploy mobile Container Network Functions (CNFs) closer to the traffic location. Lambda and inotify can be used to easily and quickly place and configure these CNFs. With Lambda, mobile service providers can quickly and efficiently deploy CNFs in response to network demands.

Lambda functions can be triggered from inotify events, allowing for rapid response to changes in the network. This helps mobile service providers optimize their network performance and keep up with the growing demands of users and applications. Additionally, Lambda functions can be used to automate configuration updates, increasing efficiency and reducing manual errors.

The US Treasury Report on Cloud Adoption in Financial Services and how AWS is supporting customers

The US Department of the Treasury released a report earlier this year on the Financial Services Sector’s Adoption of Cloud Services, based on discussions held with AWS and other financial services and technology organizations. The report highlights the many benefits financial institutions (FIs) can experience when using cloud services, such as reduced costs, improved customer experience, and greater scalability.

AWS is committed to helping financial services organizations meet their cloud adoption goals. AWS provides a comprehensive suite of services, such as Amazon SageMaker, Amazon Rekognition, AWS Fargate, and AWS Lambda, that can help FIs develop and deliver innovative customer experiences. Additionally, AWS offers its customers a variety of compliance and security services, such as Amazon GuardDuty, AWS Config, and AWS Identity and Access Management.

How AWS can help you adapt to new regulatory draft guidance for use of learning AI in medical devices

The Food and Drug Administration (FDA) recently released draft guidance that opens the door for medical devices with AI/ML to learn and improve after the product comes to market. This draft guidance is designed to help manufacturers of medical devices get their products approved or cleared by the FDA.

AWS can help medical device manufacturers adapt to this new regulatory framework. AWS provides AI/ML services such as Amazon SageMaker, Amazon Rekognition, and Amazon Comprehend that can be used to develop, train, and deploy ML models. Additionally, AWS provides a range of services that can help medical device manufacturers comply with the FDA’s regulations, such as AWS Security Hub, Amazon Inspector, and AWS CloudTrail.

How DeviceAtlas optimized Real-Time Advertising Price/Performance on AWS Graviton3

Real-time bidding (RTB) is a unique advertising process in which ad auctions must complete in roughly the time it takes a web page to load, typically under 100 milliseconds. DeviceAtlas was able to optimize RTB price/performance with the help of AWS Graviton3. Graviton3 provides the compute power and memory needed to quickly process RTB ad requests, allowing DeviceAtlas to deliver ad impressions with less latency.

DeviceAtlas also used Amazon ElastiCache to reduce the cost of RTB by caching the ads in memory. This reduces the latency associated with retrieving the ads from an external data source and eliminates the need for costly database calls. Additionally, DeviceAtlas used Amazon CloudFront and AWS Lambda to quickly and securely deliver the ads.

CPG Partner Conversations: Peak helps harness AI to drive efficiency and growth

Consumer packaged goods (CPG) companies have the opportunity to use AI to stay ahead of today’s competitive market. AI can be used to dramatically impact the CPG industry in key areas such as demand forecasting, inventory planning and optimization, and pricing optimization.

AWS offers a range of services that can help CPG companies harness AI to drive efficiency and growth. These include Amazon SageMaker for building, training, and operationalizing ML models, Amazon Rekognition for image analysis, Amazon Comprehend for text analysis, Amazon Textract for document analysis, and Amazon Forecast for demand forecasting. Additionally, AWS provides Amazon Personalize for personalizing customer experiences and Amazon Fraud Detector for fraud detection.

Accelerating code to road with cloud workflows and Automotive OS

Original Equipment Manufacturers (OEMs) need to accelerate their in-vehicle software delivery utilizing newer approaches and solutions such as cloud-based workflows and automotive operating systems (OS). To make this possible, AWS provides a range of services such as Amazon SageMaker for developing ML models, Amazon S3 for storing data, Amazon EC2 for running applications, and Amazon SageMaker Neo for optimizing models. Additionally, AWS provides services such as Amazon Elastic Container Service (ECS) for managing containerized applications, Amazon Elastic Kubernetes Service (EKS) for running containerized applications on Kubernetes, and Amazon EventBridge for event-driven computing.

Read the full blog posts from AWS

AWS Messaging & Targeting Blog

Maximizing the Email Delivery of Your Business with Amazon SES and Geofencing

Why Monitor Amazon SES Bounces and Complaints?

Amazon Simple Email Service (Amazon SES) is a cloud email service provider that is cost-effective and flexible. SES allows businesses and individuals to send bulk emails to their customers and subscribers. While SES can be a powerful email tool, monitoring your bounces and complaints is essential for optimizing your email delivery performance.

Bounces are emails that can’t be delivered to the recipient’s email address, either due to an incorrect address or an issue with the recipient’s email server. Bounces can be either hard (permanent) or soft (temporary). Addresses that hard bounce should be removed from your mailing list to keep your bounce rate low and protect your sender reputation, so that your emails keep reaching as many recipients as possible.

Complaints are emails that have been marked as spam. By monitoring complaints, you can identify any issues with your content and address them to ensure that your emails are delivered to the inbox.
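
A simple way to start monitoring both is to route bounce and complaint notifications to an Amazon SNS topic. The sketch below uses the AWS SDK for JavaScript v3; the sending identity and topic ARN are placeholders.

```typescript
// A minimal sketch of routing bounce and complaint notifications for a
// verified identity to an SNS topic, using @aws-sdk/client-ses.
import { SESClient, SetIdentityNotificationTopicCommand } from "@aws-sdk/client-ses";

const ses = new SESClient({ region: "eu-west-1" });
const identity = "example.com";                                      // placeholder sending domain
const topicArn = "arn:aws:sns:eu-west-1:111122223333:ses-feedback";  // placeholder SNS topic

for (const notificationType of ["Bounce", "Complaint"] as const) {
  await ses.send(
    new SetIdentityNotificationTopicCommand({
      Identity: identity,
      NotificationType: notificationType,
      SnsTopic: topicArn,
    })
  );
}
```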

Using Geofencing to Send Targeted Marketing Messages with Amazon Pinpoint

Geofencing is a technology that enables businesses to target customers with marketing messages based on their location. Geofencing creates a virtual boundary that triggers marketing messages to a mobile device when a user enters or exits the boundary. By setting up geofenced marketing messages, businesses can increase conversions by driving more traffic to their store or website.

Amazon Pinpoint is a powerful multichannel communication tool that can be used to create mobile notifications with geofencing technology. Pinpoint allows businesses to create a geofence by specifying a latitude and longitude coordinate, and also to define the radius of the geofence. When a user enters the geofence, Amazon Pinpoint will trigger a notification to the user’s device.
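
One way to express such a location-based audience in Pinpoint is a segment built from a GPS point and radius. The sketch below is a hedged illustration using the AWS SDK for JavaScript v3; the blog post's exact geofencing setup may differ, and the project ID and coordinates are placeholders.

```typescript
// A hedged sketch of a radius-based ("geofence-style") segment in Amazon Pinpoint.
import { PinpointClient, CreateSegmentCommand } from "@aws-sdk/client-pinpoint";

const pinpoint = new PinpointClient({ region: "eu-west-1" });

await pinpoint.send(
  new CreateSegmentCommand({
    ApplicationId: "your-pinpoint-project-id",   // placeholder project ID
    WriteSegmentRequest: {
      Name: "store-copenhagen-2km",
      Dimensions: {
        Location: {
          GPSPoint: {
            Coordinates: { Latitude: 55.6761, Longitude: 12.5683 }, // store location
            RangeInKilometers: 2,                                   // radius of the geofence
          },
        },
      },
    },
  })
);
```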

KeyCore Can Help You Optimize Your AWS Messaging and Targeting

At KeyCore, we provide both professional and managed AWS services that can help businesses optimize their messaging and targeting using Amazon SES and Amazon Pinpoint. Our team of experienced AWS professionals can help you set up notifications for bounces and complaints in Amazon SES, and also create geofenced marketing messages using Amazon Pinpoint.

We are experts at using AWS messaging and targeting services to help businesses maximize their email delivery performance and reach their customers. Contact us today to learn more about how KeyCore can help you optimize your messaging and targeting.

Read the full blog posts from AWS

AWS Marketplace

Perform Quant Research at Scale Using AWS Marketplace Solutions

In this blog post, Alex, Pramod, and the author will discuss how to set up and use infrastructure for performing quant research at scale. The stack and examples for using the infrastructure are available in a public repository, and the solution uses Apache Spark, Amazon EMR on EKS, Docker, Karpenter, EMR Studio Notebooks, and AWS Data Exchange for Amazon S3.

Resource Sharing Across AWS Accounts

The author will then discuss how to use AWS Resource Access Manager and AWS Marketplace Catalog APIs to share the catalog resources across AWS accounts.

Accelerate Healthcare Transformation with Cloud Solutions

AWS Marketplace recently held a webinar titled Accelerate healthcare transformation with cloud solutions. Leaders from Hyland, Prognos Health, AdaptX, and the author discussed the research findings during this webinar. This post is a summary of that conversation.

Update Single-AMI Products in AWS Marketplace Using Self-Service

AWS Marketplace now offers sellers, independent software vendors (ISVs), and consulting partners (CPs) the ability to update single-AMI products using self-service. In this blog post, the author will show how to use the self-service feature to update the different features of single-AMI products listed in AWS Marketplace.

How KeyCore Can Help

KeyCore provides professional services and managed services to help businesses use AWS Marketplace solutions. Our experts understand the complexities of these solutions and can help businesses get the most out of them. We also provide advice on which solutions are the best fit for specific needs and provide the necessary technical support and maintenance. Contact KeyCore today to find out how we can help you make the most of AWS Marketplace solutions.

Read the full blog posts from AWS

The latest AWS security, identity, and compliance launches, announcements, and how-to posts.

The Latest AWS Security, Identity, and Compliance Launches

At Amazon Web Services (AWS), security, identity, and compliance are top priorities. To keep up with the demand for greater security and compliance, AWS is constantly releasing new features and updates.

Removing Header Remapping from Amazon API Gateway

AWS rarely makes disruptive changes or removes functionality from production services, because customers rely on the AWS Cloud to build solutions for their own customers. However, to keep customer APIs and service functionality secure, AWS has recently removed header remapping from Amazon API Gateway. Throughout this process, AWS worked with security researchers to ensure all customer data was kept safe.

Simplifying Authorization with Amazon Verified Permissions and Amazon Cognito

Amazon Cognito is already used by customers for quick and easy authentication. The launch of Amazon Verified Permissions allows customers to add authorization to their applications by using user attributes stored in Amazon Cognito. This combination of Amazon Cognito and Verified Permissions simplifies the process of authorizing user actions, helping to make applications more secure.
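
As a rough sketch of what an authorization check can look like, the TypeScript example below passes a Cognito-issued token to Verified Permissions and acts on the decision. The policy store ID, action, and resource names are placeholders, and the request shape should be verified against the client-verifiedpermissions typings.

```typescript
// A heavily hedged sketch of an IsAuthorizedWithToken check against a
// Verified Permissions policy store, using a Cognito ID token.
import {
  VerifiedPermissionsClient,
  IsAuthorizedWithTokenCommand,
} from "@aws-sdk/client-verifiedpermissions";

const avp = new VerifiedPermissionsClient({ region: "eu-west-1" });

async function canViewInvoice(cognitoIdToken: string): Promise<boolean> {
  const { decision } = await avp.send(
    new IsAuthorizedWithTokenCommand({
      policyStoreId: "YOUR_POLICY_STORE_ID",                               // placeholder
      identityToken: cognitoIdToken,                                       // token issued by Amazon Cognito
      action: { actionType: "MyApp::Action", actionId: "ViewInvoice" },    // placeholder action
      resource: { entityType: "MyApp::Invoice", entityId: "invoice-123" }, // placeholder resource
    })
  );
  return decision === "ALLOW";
}
```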

AWS WAF Fraud Control – Account Creation Fraud Prevention

AWS WAF Fraud Control helps protect applications from account creation fraud, in which fake accounts are created to abuse promotional and sign-up bonuses, publish fake reviews, or spread malware. It complements AWS WAF Fraud Control – Account Takeover Prevention, released in 2022, which can detect and prevent credential stuffing attacks, brute force attempts, and other account fraud.

Automating Findings Updating with AWS Security Hub

As organizations scale and consume the benefits of the cloud, it is important to factor in and understand how the additional cloud footprint will affect operations. To make operations more efficient, AWS Security Hub has launched a new capability for automating actions to update findings. This can help organizations drastically reduce the time needed to investigate and remediate security issues.
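
The sketch below shows the kind of automation rule this enables, using the AWS SDK for JavaScript v3. It suppresses informational findings from a sandbox account; the account ID and filter values are placeholders, and the field names follow the CreateAutomationRule API in @aws-sdk/client-securityhub.

```typescript
// A hedged sketch of a Security Hub automation rule that suppresses a known,
// accepted class of findings.
import { SecurityHubClient, CreateAutomationRuleCommand } from "@aws-sdk/client-securityhub";

const securityHub = new SecurityHubClient({ region: "eu-west-1" });

await securityHub.send(
  new CreateAutomationRuleCommand({
    RuleName: "suppress-sandbox-informational",
    RuleOrder: 1,
    Description: "Suppress informational findings from the sandbox account",
    Criteria: {
      AwsAccountId: [{ Value: "111122223333", Comparison: "EQUALS" }],   // sandbox account (placeholder)
      SeverityLabel: [{ Value: "INFORMATIONAL", Comparison: "EQUALS" }],
    },
    Actions: [
      {
        Type: "FINDING_FIELDS_UPDATE",
        FindingFieldsUpdate: {
          Workflow: { Status: "SUPPRESSED" },
          Note: { Text: "Auto-suppressed by automation rule", UpdatedBy: "automation-rule" },
        },
      },
    ],
  })
);
```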

Post-Quantum Hybrid SFTP File Transfers Using AWS Transfer Family

To provide long-term protection of encrypted data, AWS has been introducing quantum-resistant key exchange in common transport protocols used by AWS customers. Building on Kyber, the key encapsulation mechanism selected for standardization by the National Institute of Standards and Technology (NIST), AWS Transfer Family now supports post-quantum hybrid key exchange for SFTP file transfers.
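
Enabling this on an existing server amounts to attaching a post-quantum security policy, as in the sketch below. The policy name shown is the experimental PQ policy referenced around this launch and should be verified against the current Transfer Family documentation; the server ID is a placeholder.

```typescript
// A minimal sketch of attaching a post-quantum hybrid SSH security policy to a
// Transfer Family server with @aws-sdk/client-transfer.
import { TransferClient, UpdateServerCommand } from "@aws-sdk/client-transfer";

const transfer = new TransferClient({ region: "eu-west-1" });

await transfer.send(
  new UpdateServerCommand({
    ServerId: "s-1234567890abcdef0",   // placeholder server ID
    SecurityPolicyName: "TransferSecurityPolicy-PQ-SSH-Experimental-2023-04",
  })
);
```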

At KeyCore our team of AWS professionals can help you get the most out of AWS security, identity, and compliance solutions. Our experience with the latest AWS launches and how-to posts can help your organization stay ahead of the curve when it comes to security and compliance. Contact KeyCore today to learn more about how we can help secure your organization.

Read the full blog posts from AWS

AWS Startups Blog

AarogyaAI is using AI/ML on AWS to Precisely Diagnose Antimicrobial Resistance

AarogyaAI, a healthcare and life sciences startup, is using artificial intelligence and machine learning (AI/ML) on AWS to rapidly diagnose drug resistance in patients caused by bacterial, fungal, and viral pathogens. This allows clinicians to make data-driven treatment decisions and prescribe drugs that effectively treat and increase health outcomes for patients.

How AI/ML is Used for Diagnosis

AarogyaAI uses a combination of techniques such as natural language processing, image recognition, and other machine learning algorithms to accurately identify and diagnose antimicrobial resistance. With this, they can rapidly diagnose drug resistance in patients and increase the accuracy of diagnosis and treatment. The AI/ML models run on AWS infrastructure, allowing them to scale quickly and efficiently.

Benefits of AI/ML for Diagnosis

AI/ML techniques enable AarogyaAI to provide accurate and reliable diagnosis of antimicrobial resistance. This helps clinicians and healthcare providers make data-driven treatment decisions quickly and easily. By leveraging AWS infrastructure, they can scale quickly and easily to meet the demands of the healthcare industry. Additionally, the AI/ML models can be used to predict drug interactions with greater accuracy than traditional methods, which can improve patient outcomes.

KeyCore to the Rescue

At KeyCore, we provide both professional services and managed services for AI/ML-based solutions. Our team of experienced AWS engineers can help you design, build, and deploy AI/ML solutions on AWS to improve healthcare outcomes. We can provide guidance and expertise with AI/ML best practices, AWS architecture, and DevOps automation to ensure your AI/ML solution is running smoothly. Contact us today to learn more about how we can help.

Read the full blog posts from AWS

Business Productivity

Adding Real-Time Call Analytics to Voice Calls

Real-time call analytics can be used to help businesses better understand their customer’s preferences and behaviors. To help developers learn how to use Amazon Chime SDK real-time call analytics, Amazon has created a hands-on workshop in the AWS Workshop Studio.

The AWS Workshop Studio

The AWS Workshop Studio is a collection of self-paced tutorials designed to teach practical skills and techniques to solve business problems with AWS. The workshops are based on exercises, real-world scenarios, and best practices, and provide hands-on guidance for developers.

Amazon Chime SDK’s Real-Time Call Analytics

The workshop in the AWS Workshop Studio provides guidance on using Amazon Chime SDK’s real-time call analytics. It will help developers understand how to leverage the Chime SDK to track, visualize, and analyze customer interactions in real-time.
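
To give a sense of the building blocks the workshop covers, the sketch below creates a media insights pipeline configuration with a Transcribe Call Analytics processor feeding a Kinesis Data Stream. This is a hedged illustration using @aws-sdk/client-chime-sdk-media-pipelines; the ARNs are placeholders and the element shapes should be checked against the current SDK typings.

```typescript
// A heavily hedged sketch of a Chime SDK call analytics configuration.
import {
  ChimeSDKMediaPipelinesClient,
  CreateMediaInsightsPipelineConfigurationCommand,
} from "@aws-sdk/client-chime-sdk-media-pipelines";

const pipelines = new ChimeSDKMediaPipelinesClient({ region: "us-east-1" });

await pipelines.send(
  new CreateMediaInsightsPipelineConfigurationCommand({
    MediaInsightsPipelineConfigurationName: "call-analytics-demo",
    ResourceAccessRoleArn: "arn:aws:iam::111122223333:role/chime-media-insights", // placeholder
    Elements: [
      {
        Type: "AmazonTranscribeCallAnalyticsProcessor",
        AmazonTranscribeCallAnalyticsProcessorConfiguration: { LanguageCode: "en-US" },
      },
      {
        Type: "KinesisDataStreamSink",
        KinesisDataStreamSinkConfiguration: {
          InsightsTarget: "arn:aws:kinesis:us-east-1:111122223333:stream/call-insights", // placeholder
        },
      },
    ],
  })
);
```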

Benefits of Real-Time Call Analytics

Real-time call analytics can provide valuable insights to businesses. It can help them better understand customer preferences, behaviors, and interactions, as well as gain more detailed reporting for call analytics. The Amazon Chime SDK real-time call analytics workshop can help businesses leverage these capabilities for their applications.

How KeyCore Can Help

KeyCore is the leading Danish AWS consultancy, providing both professional and managed services. Our team of AWS experts can help you understand the capabilities of the Amazon Chime SDK and help you set up the workshop to make sure you can take advantage of the real-time call analytics. Contact us today to learn more.

Read the full blog posts from AWS

Front-End Web & Mobile

Next.js and AWS Amplify: Unlocking Powerful Features

Next.js is a popular React framework that enables server-side rendering and static site generation for React apps. When combined with AWS Amplify, a set of purpose-built tools and features that enables frontend web and mobile developers to quickly and easily build full-stack applications on AWS, developers can build powerful applications.

5 Features of Next.js and AWS Amplify

Here are 5 features of Next.js and AWS Amplify that developers can leverage to create powerful applications:

1. Flexible Workflows

AWS Amplify provides developers with a powerful set of tools to create complex workflows and custom logic for their applications. With the tools provided by AWS Amplify, developers can create custom workflows that include components such as authentication, authorization, data analysis, and more. This allows developers to quickly build applications with complex logic.

2. Improved Data Access

The combination of Next.js and AWS Amplify enables developers to access and manipulate data from multiple sources in a single application. With the AWS Amplify DataStore, developers can work with data backed by databases, NoSQL stores, and cloud services through a single programming model, keeping data access simple even as the number of sources grows.

3. Enhanced Security

AWS Amplify provides developers with enhanced security features that help protect their applications from malicious actors. With the AWS Identity and Access Management (IAM) service, developers can control who has access to their application and what they can do with it. AWS Amplify also provides developers with a wide range of security features such as encryption, authentication, and authorization.

4. Develop Faster with Pre-Built Components

With AWS Amplify, developers can quickly create applications using pre-built components. These components include components such as authentication, authorization, and data models. This allows developers to quickly create applications without having to write all of the code from scratch.

5. Build Scalable Solutions

AWS Amplify enables developers to create scalable applications that can handle large amounts of traffic and data. With the AWS Amplify CLI, developers can provision and evolve the backing cloud resources so their applications keep up as requests and data grow. A brief sketch of how these pieces fit together with Next.js server-side rendering follows below.
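
The sketch below is a hedged example of Amplify's SSR support in a Next.js page (aws-amplify v5): the server reads the signed-in Cognito user before rendering. The aws-exports import path is a placeholder for the configuration file generated by the Amplify CLI.

```typescript
// A hedged sketch of combining Next.js server-side rendering with Amplify SSR.
import { Amplify, withSSRContext } from "aws-amplify";
import type { GetServerSideProps } from "next";
import awsExports from "../src/aws-exports"; // generated by `amplify push` (placeholder path)

Amplify.configure({ ...awsExports, ssr: true });

export const getServerSideProps: GetServerSideProps = async ({ req }) => {
  const SSR = withSSRContext({ req }); // request-scoped Amplify categories
  try {
    const user = await SSR.Auth.currentAuthenticatedUser();
    return { props: { username: user.username } };
  } catch {
    return { props: { username: null } }; // not signed in
  }
};

export default function Home({ username }: { username: string | null }) {
  return <main>{username ? `Hello, ${username}` : "Please sign in"}</main>;
}
```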

KeyCore and AWS Amplify

At KeyCore, we provide professional and managed services to help customers build, maintain, and optimize applications on AWS. We are experienced in using AWS Amplify and Next.js to create powerful applications that meet customer requirements. Our team of experts can help customers build applications with the features and flexibility they need to succeed.

Read the full blog posts from AWS

AWS Contact Center

How Amazon Connect Contact Lens Improves Agent Performance

Agents play a major role in a company’s customer service strategy. They are the primary point of contact for customers when they have questions, comments, or complaints. The quality of customer service that agents provide will have a huge impact on customer satisfaction and loyalty. While all agents are trained to provide excellent customer service, not all agents have the same skillset. Amazon Connect Contact Lens is a tool that helps to improve agent performance so that they can best serve their customers.

What is Amazon Connect Contact Lens?

Amazon Connect Contact Lens is an AI-powered tool that is part of the Amazon Connect suite. It uses natural language processing (NLP) and machine learning (ML) to analyze customer conversations. Contact Lens analyzes the spoken interactions between agents and customers to provide insights into customer experience, agent performance, customer sentiment, and more. It also provides coaching to agents so they can improve their customer service skills.

How Does Contact Lens Improve Agent Performance?

Contact Lens provides a range of features to help agents improve their performance. It can detect customer sentiment and identify trends in customer conversations. This allows agents to respond to customer queries more effectively and provide better customer service. Contact Lens can also provide feedback to agents on their performance. This can help agents to identify areas where they can improve their customer service skills.
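
Supervisors and custom dashboards can also pull this analysis programmatically while a call is still in progress. The sketch below uses @aws-sdk/client-connect-contact-lens; the instance and contact IDs are placeholders, and real-time analysis must be enabled in the contact flow.

```typescript
// A hedged sketch of reading real-time Contact Lens analysis for a live contact.
import {
  ConnectContactLensClient,
  ListRealtimeContactAnalysisSegmentsCommand,
} from "@aws-sdk/client-connect-contact-lens";

const contactLens = new ConnectContactLensClient({ region: "eu-west-1" });

async function printLiveSentiment(instanceId: string, contactId: string): Promise<void> {
  const { Segments } = await contactLens.send(
    new ListRealtimeContactAnalysisSegmentsCommand({
      InstanceId: instanceId, // placeholder Amazon Connect instance ID
      ContactId: contactId,   // placeholder in-progress contact ID
    })
  );

  for (const segment of Segments ?? []) {
    if (segment.Transcript) {
      console.log(segment.Transcript.ParticipantId, segment.Transcript.Sentiment, segment.Transcript.Content);
    }
  }
}
```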

KeyCore’s Expertise in Amazon Connect

KeyCore is the leading Danish AWS consultancy. We provide both professional services and managed services related to Amazon Connect. With our expertise, we can help you to get the most out of Amazon Connect Contact Lens. We can assist you with setting up Contact Lens, provide training to your agents, and help to better understand your customer conversations. If you need help with your Amazon Connect Contact Lens implementation, contact us today to learn more.

Read the full blog posts from AWS

Innovating in the Public Sector

Public sector organizations have a unique set of challenges when it comes to data management and access. Nonprofits, in particular, often have to deal with a large amount of data that is stored in disparate systems, making it difficult to unify and take advantage of all the available data. At the same time, new technologies are making it possible for public sector organizations to innovate and increase the level of care and services they are able to provide. In this article, we will look at how public sector organizations can use AWS to innovate in the healthcare space.

Unifying Nonprofit Healthcare Data with the Collibra Data Intelligence Cloud

AWS Partner Collibra has developed the Collibra Data Intelligence Cloud, which is designed to unify access to disparate datasets for Nonprofit organizations in the healthcare industry. Froedtert and the Medical College of Wisconsin (F&MCW) are using the Collibra Data Intelligence Cloud on AWS, helping them to gain a unified view of their data, enabling them to make more informed decisions and improve patient outcomes. Using AWS, Collibra is also able to provide scalability and flexibility to their customers, helping them to quickly and reliably process large amounts of healthcare data.

The Elastic Health Record (EHR)

Cloud technology is revolutionizing the way healthcare organizations manage patient records. With access to cloud-based storage and compute, healthcare leaders can shift the “E” in EHR from “electronic” to “elastic,” embracing a system of records that is constantly in motion. AWS can support this approach, providing healthcare organizations with the scalability and flexibility they need to quickly and reliably access and process healthcare data.

National University and ConcernCenter

National University, a California-based university that serves a high proportion of military and adult learners, needed to provide scalable and customizable mental health support to their students during the COVID-19 pandemic. To do this, they worked with AWS Partner ConcernCenter to develop a solution that could meet their needs. ConcernCenter was able to provide NU with a customizable solution that could process large amounts of data quickly and reliably. The ability to quickly provision and scale their solution on AWS allowed NU to provide the mental health support they needed to their students, even in the face of the pandemic.

How AWS Can Help Public Sector Organizations Innovate in Healthcare

AWS provides all the tools and resources public sector organizations need to innovate in healthcare, from access to large amounts of data processing power, to the ability to quickly scale and customize solutions. With AWS, organizations in the healthcare space can gain access to new technologies that can help them improve patient outcomes and provide better, more tailored services.

Making the Most of the AWS Cloud with KeyCore

KeyCore is the leading Danish AWS consultancy, offering both professional services and managed services. Our team of AWS experts can help public sector organizations make the most of the AWS cloud to innovate in healthcare. We provide advanced technical services and advice on how to leverage the tools and services provided by AWS to their fullest. With our expertise, we can help organizations design, build, and deploy solutions that can give them the edge they need to succeed. To learn more about KeyCore and our services, please visit our website at https://www.keycore.dk.

Read the full blog posts from AWS

The Internet of Things on AWS – Official Blog

Introducing the Internet of Things on AWS

Importing Historical Equipment Data into AWS IoT SiteWise

AWS IoT SiteWise is a managed service that helps customers collect, store, organize, and monitor data from their industrial equipment at scale. With this service, customers can bring their historical equipment measurement data from existing systems like data historians and time series databases into AWS IoT SiteWise. This ensures data continuity, enables training of artificial intelligence (AI) models, and facilitates better operational insights from the data.

AWS IoT SiteWise is a secure and reliable platform to store data from industrial equipment. Data from different sources can be aggregated, organized, and analyzed with the help of its scalable compute and storage capabilities. It also enables device data to be globally accessible and allows customers to set up alarms and metrics for monitoring their equipment.
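
The sketch below shows what such a bulk import can look like with the AWS SDK for JavaScript v3, ingesting historical CSV measurements from S3. It assumes the CreateBulkImportJob API shape; the bucket, role, and column layout are placeholders to adapt to your export format.

```typescript
// A hedged sketch of a SiteWise bulk import job for historical measurements.
import { IoTSiteWiseClient, CreateBulkImportJobCommand } from "@aws-sdk/client-iotsitewise";

const sitewise = new IoTSiteWiseClient({ region: "eu-west-1" });

await sitewise.send(
  new CreateBulkImportJobCommand({
    jobName: "historian-backfill-2023-06",
    jobRoleArn: "arn:aws:iam::111122223333:role/sitewise-bulk-import", // placeholder role
    files: [{ bucket: "my-historian-exports", key: "pump-data-2022.csv" }],
    errorReportLocation: { bucket: "my-historian-exports", prefix: "errors/" },
    jobConfiguration: {
      fileFormat: {
        csv: {
          // Column order must match the CSV produced by the historian export.
          columnNames: ["ASSET_ID", "PROPERTY_ID", "DATA_TYPE", "TIMESTAMP_SECONDS", "QUALITY", "VALUE"],
        },
      },
    },
  })
);
```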

Introducing Support for Public Networks with AWS IoT Core for LoRaWAN

AWS recently announced a preview of public network support for AWS IoT Core for LoRaWAN. This is a fully managed LoRaWAN Network Server (LNS) that helps customers deploy their own LoRaWAN network. AWS has partnered with Everynet, a public LoRaWAN network provider, to simplify LoRaWAN network deployment and provide customers with an alternative to managing their own gateways. Everynet’s national LoRaWAN network enables customers to connect the most commonly used LoRaWAN devices and sensors, with added peace of mind that their data is backed up securely in AWS.

With AWS IoT Core for LoRaWAN, customers can deploy and manage LoRaWAN networks with a few clicks. It also provides an easy-to-use dashboard with powerful analytics capabilities to manage and monitor their devices. Additionally, customers can configure and manage their LoRaWAN network remotely, ensuring they are always connected to their devices.

How KeyCore Can Help

At KeyCore, we provide professional services and managed services to help customers better utilize the Internet of Things on AWS. Our team of experts are familiar with the features available with AWS IoT Core for LoRaWAN and AWS IoT SiteWise and can help customers configure, deploy, and maintain their IoT solution. We understand the complexities of managing large-scale IoT deployments and can help customers every step of the way to ensure their success.

With our expertise and experience in the Internet of Things on AWS, we can help customers get the most out of their IoT infrastructure and achieve their goals. Let us help you get started today.

Read the full blog posts from AWS

AWS Open Source Blog

Simplify Amazon EKS Multi-Cluster Authentication with Open Source Pinniped

Streamlining Multi-Cluster Authentication

Authenticating to multiple Amazon Elastic Kubernetes Service (EKS) clusters can be a complex process. It often involves the manual setup of IAM roles, authentication tokens, and Kubernetes configuration files. To simplify this, open source Pinniped can be used together with Okta as the identity provider.

How Does Open Source Pinniped Work?

Pinniped is an open source authentication service for Kubernetes clusters. Rather than being tied to a specific vendor, it federates identities from an external identity provider such as Okta using standard protocols like OpenID Connect (OIDC), so users can securely sign in with their existing corporate credentials. The Pinniped Supervisor handles federation with the upstream provider and issues short-lived cluster credentials, while the Pinniped Concierge runs on each cluster and exchanges those credentials for access to that cluster.

Using Open Source Pinniped to Connect Multiple Clusters

To connect multiple clusters, the Pinniped Concierge is installed on each EKS cluster and a central Pinniped Supervisor is configured to use Okta as the upstream identity provider. Users then authenticate once through the Supervisor, and the Concierge on each cluster exchanges that identity for cluster access. This enables users to securely access and manage all of their clusters from a single sign-in.

Advantages of Using Open Source Pinniped

Using open source Pinniped and Okta to manage authentication for EKS multi-cluster deployments has several advantages. First, it simplifies the authentication process by eliminating the need to manually configure IAM roles, authentication tokens, and Kubernetes configuration files. Second, it allows users to manage their clusters from a single location. Finally, it provides users with a secure and reliable way to authenticate against their clusters.

Maximize the Benefits of Open Source Pinniped with KeyCore

KeyCore specializes in providing professional services and managed services across a variety of technologies, including AWS. Our team of experts can help you maximize the benefits of open source Pinniped and Okta by streamlining the authentication process for your EKS multi-cluster deployments. Contact us for a free consultation to learn more about how KeyCore can help you securely manage your clusters.

Read the full blog posts from AWS
