Summary of AWS blogs for the week of Monday, October 16, 2023
In the week of Monday, October 16, 2023, AWS published 105 blog posts – here is an overview of what happened.
Topics Covered
- AWS DevOps Blog
- AWS for SAP
- Official Machine Learning Blog of AWS
- Announcements, Updates, and Launches
- Containers
- AWS Quantum Technologies Blog
- AWS Smart Business Blog
- Official Database Blog of AWS
- AWS Cloud Financial Management
- AWS for Games Blog
- AWS Training and Certification Blog
- Microsoft Workloads on AWS
- Official Big Data Blog of AWS
- AWS Compute Blog
- AWS for M&E Blog
- AWS Storage Blog
- AWS Developer Tools Blog
- AWS Partner Network (APN) Blog
- AWS Cloud Enterprise Strategy Blog
- AWS HPC Blog
- AWS Cloud Operations & Migrations Blog
- AWS for Industries
- AWS Marketplace
- The latest AWS security, identity, and compliance launches, announcements, and how-to posts.
- Front-End Web & Mobile
- Innovating in the Public Sector
- The Internet of Things on AWS – Official Blog
AWS DevOps Blog
AWS CodeDeploy Now Supports Multiple Load Balancers
AWS CodeDeploy is a fully managed deployment service that automates software deployments to a variety of compute services, such as Amazon EC2, Amazon ECS, AWS Lambda, and on-premises servers. Recently, CodeDeploy announced support for applications that use multiple Elastic Load Balancing (ELB) load balancers.
With this feature, CodeDeploy can now deploy applications that sit behind multiple load balancers, which enables better orchestration of traffic during deployments. For example, in a blue/green deployment the existing, live version of an application keeps serving traffic while the new version is tested or gradually rolled out to the wider user base.
Another key benefit is that CodeDeploy can deploy to applications that use a mixture of Classic Load Balancers and Application Load Balancers. This allows for a more seamless transition between the two load balancer types and lowers the risk of migrating everything in a single step.
Introducing the AWS Well-Architected Framework DevOps Guidance
Today, Amazon Web Services (AWS) announced the launch of the AWS Well-Architected Framework DevOps Guidance. The guidance introduces the AWS DevOps Sagas, a collection of modern capabilities that together form an approach to designing, developing, securing, and operating software with scalability in mind.
This approach is based on Amazon’s own transformation journey and customer feedback. It provides guidance on how to plan, build, and operate applications with DevOps practices. This guidance encourages developers to focus on priorities such as scalability, security, and performance.
Moreover, the AWS Well-Architected Framework DevOps Guidance provides resources to help customers identify areas of improvement with their existing applications. This can help them reduce operational costs and improve their software’s reliability.
At KeyCore, we understand the importance of using AWS DevOps practices to build reliable and resilient applications. Our team of experienced AWS DevOps consultants can provide expertise and support to help customers make the most of the AWS Well-Architected Framework DevOps Guidance. We can help customers develop and implement the right DevOps strategies for their applications and make sure the necessary steps are taken to keep those applications secure and performant.
Read the full blog posts from AWS
- Multiple Load Balancer Support in AWS CodeDeploy
- Announcing the AWS Well-Architected Framework DevOps Guidance
AWS for SAP
Extend RISE with SAP on AWS with Analytics Fabric for SAP Accelerators
SAP customers are increasingly turning to Amazon Web Services (AWS) to extend their RISE with SAP solutions. The Analytics Fabric for SAP Accelerators is designed to enable customers to quickly and easily access AWS services tailored to their SAP workloads. In this post, we’ll take a closer look at how the AWS Analytics Fabric for SAP Accelerators can help customers accelerate their SAP deployments.
What is the AWS Analytics Fabric for SAP Accelerators?
The AWS Analytics Fabric for SAP Accelerators is a collection of preconfigured, best-practice solutions that can be used to quickly set up and deploy SAP workloads on AWS. The fabric includes a selection of popular AWS services such as Amazon EC2, Amazon EMR, Amazon S3, Amazon Athena, and Amazon Redshift, as well as other services tailored to SAP workloads, such as Amazon Aurora, Amazon RDS, Amazon VPC, and Amazon Kinesis. The fabric is designed to make it easier and faster for customers to get up and running on AWS, and to ensure a consistent and optimized experience when deploying SAP workloads.
Key Benefits of the AWS Analytics Fabric for SAP Accelerators
The AWS Analytics Fabric for SAP Accelerators offers a number of key benefits to customers, including:
- Faster Deployment: The pre-configured solutions provided by the fabric can be used to quickly and easily set up and deploy SAP workloads on AWS.
- Optimized Performance: The fabric is designed to ensure a consistent and optimized experience when deploying SAP workloads, allowing customers to quickly identify and address potential performance issues.
- Reduced Complexity: The fabric makes it easier and faster for customers to get up and running on AWS, reducing the complexity of the deployment process.
How KeyCore Can Help
KeyCore provides a range of services to help customers deploy and optimize their SAP workloads on AWS. Our team of certified AWS professionals can help customers understand best practices for deploying and managing their SAP workloads on AWS, and can help customers unlock the full potential of the AWS Analytics Fabric for SAP Accelerators. Our team can also provide custom solutions tailored to customers’ specific needs and requirements, ensuring an optimal experience when deploying SAP workloads on AWS.
Read the full blog posts from AWS
Official Machine Learning Blog of Amazon Web Services
Amazon Machine Learning Blog – An Overview
Customers around the world are beginning to use machine learning (ML) in their products and services. This is made possible on AWS through Amazon SageMaker, Amazon Rekognition, and other services. The Official Machine Learning Blog of Amazon Web Services provides in-depth guidance on how to use these services, along with case studies and stories from customers who are innovating with ML. This overview will provide a brief look at the content available on the blog.
Governing the ML Lifecycle at Scale
The Official Machine Learning Blog of Amazon Web Services offers a series of posts on governing the ML lifecycle at scale. The first part of the series outlines a framework for architecting ML workloads using Amazon SageMaker. It discusses the challenges customers face when implementing ML while also outlining the value that ML can offer. The second part of the series focuses on the secure and private management of ML data, explaining why it’s important and how customers can achieve it.
Case Studies & Product Feature Highlights
The blog also contains case studies from customers who have achieved success with ML on AWS. For example, Meesho, an ecommerce company in India, used Amazon SageMaker to create a generalized feed ranker. This ranker was able to improve the efficiency of their product search results, allowing them to save time and money.
The blog also explains how companies such as Purina US leverage Amazon Rekognition Custom Labels and AWS Step Functions to optimize pet profiles for their Petfinder application. Other posts this week cover detecting defects in high-resolution imagery with two-stage Amazon Rekognition Custom Labels models, automatically redacting PII for ML with Amazon SageMaker Data Wrangler, how Amazon Pharmacy used Amazon SageMaker to create an LLM-based chatbot, and how the Amazon Personalize and Amazon OpenSearch Service integration can be used for personalized search results.
How KeyCore Can Help
At KeyCore, we understand the importance of leveraging machine learning for business success. Our team of AWS experts can help you architect your ML workloads, securely manage your ML data, and take advantage of AWS services such as Amazon SageMaker and Amazon Rekognition. We can help you accurately detect defects in high-resolution imagery, integrate Amazon Personalize with Amazon OpenSearch Service, and create an LLM-based chatbot. Contact us today to learn more about how we can help you succeed with AWS and ML.
Read the full blog posts from AWS
- Governing the ML lifecycle at scale, Part 1: A framework for architecting ML workloads using Amazon SageMaker
- How Meesho built a generalized feed ranker using Amazon SageMaker inference
- Announcing Rekognition Custom Moderation: Enhance accuracy of pre-trained Rekognition moderation models with your data
- Defect detection in high-resolution imagery using two-stage Amazon Rekognition Custom Labels models
- Automatically redact PII for machine learning using Amazon SageMaker Data Wrangler
- Optimize pet profiles for Purina’s Petfinder application using Amazon Rekognition Custom Labels and AWS Step Functions
- Learn how Amazon Pharmacy created their LLM-based chat-bot using Amazon SageMaker
- Keeping an eye on your cattle using AI technology
- Personalize your search results with Amazon Personalize and Amazon OpenSearch Service integration
- How Veriff decreased deployment time by 80% using Amazon SageMaker multi-model endpoints
Announcements, Updates, and Launches
Rotate Your SSL/TLS Certificates Now – Amazon RDS and Amazon Aurora Expire in 2024
Amazon Relational Database Service (Amazon RDS) recently added a Certificate Update page to the console: users who use, or plan to use, Secure Sockets Layer (SSL) or Transport Layer Security (TLS) with certificate verification to connect to their Amazon RDS or Amazon Aurora database instances (MySQL, MariaDB, SQL Server, Oracle, PostgreSQL, and others) must rotate their certificates before the current certificate authority (CA) certificates expire in 2024. To do so, users update each database instance to one of the newer CA certificates provided by Amazon RDS, using the Amazon RDS console, API, or AWS CLI, and update the trust stores of any client applications that verify the server certificate. KeyCore can provide assistance in this process, ensuring secure and reliable operation of your database instances.
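As a minimal sketch of what this rotation can look like in practice, the following Python (boto3) snippet lists the CA certificates available in a Region and switches an instance to a newer one. The instance name and certificate identifier are illustrative assumptions; check which certificates your account actually offers before applying the change.
```python
import boto3

rds = boto3.client("rds", region_name="eu-west-1")

# List the CA certificates available in this account/Region.
for cert in rds.describe_certificates()["Certificates"]:
    print(cert["CertificateIdentifier"], cert["ValidTill"])

# Example only: move an instance to a newer CA certificate.
# "my-database" and "rds-ca-rsa2048-g1" are placeholder values.
rds.modify_db_instance(
    DBInstanceIdentifier="my-database",
    CACertificateIdentifier="rds-ca-rsa2048-g1",
    ApplyImmediately=True,  # or defer the change to the next maintenance window
)
```
Remember that changing the server certificate is only half the job: any client that verifies the certificate also needs the new CA bundle in its trust store.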
Introducing Amazon MSK Replicator – Fully Managed Replication across MSK Clusters in Same or Different AWS Regions
Amazon has recently announced a new feature for their Amazon Managed Streaming for Apache Kafka (Amazon MSK) service – Amazon MSK Replicator. This feature enables users to replicate data from one cluster to another. This was done to enable business continuity and disaster recovery plans, as well as replicating data streams across different regions. To use this feature, users can specify the source and destination Amazon MSK clusters, the topics to replicate, and the replicator configuration. The replicator will then copy data from the source cluster to the destination cluster, ensuring users have access to their data from multiple locations. KeyCore can provide assistance in setting up and configuring this feature for your Amazon MSK clusters.
New Customization Capability in Amazon CodeWhisperer Generates Even Better Suggestions (Preview)
AI coding companions, such as Amazon CodeWhisperer, help developers write code quickly and securely. To provide developers with more personalized suggestions, CodeWhisperer now offers a customization capability (in preview). This allows organizations to tune the AI to their own internal libraries and APIs, providing code recommendations tailored to their needs. With the help of KeyCore, developers can make sure their customized CodeWhisperer is properly set up and providing the most relevant code recommendations possible.
New – Seventh Generation Memory-optimized Amazon EC2 Instances (R7i)
Amazon has recently added memory-optimized Amazon EC2 R7i instances to its seventh-generation x86-based offerings. These instances are powered by custom 4th Generation Intel Xeon Scalable processors and scale up to 192 vCPUs and 1,536 GiB of DDR5 memory. KeyCore can help you determine whether the R7i instances are the right fit for your workloads, and provide a custom solution tailored to your individual needs.
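Trying the new family requires nothing special: an R7i instance launches like any other EC2 instance. In the boto3 sketch below, the AMI ID, subnet, and tag value are placeholders you would replace with your own.
```python
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

# Launch a single memory-optimized R7i instance (placeholder AMI and subnet IDs).
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",    # replace with a real AMI in your Region
    InstanceType="r7i.xlarge",
    MinCount=1,
    MaxCount=1,
    SubnetId="subnet-0123456789abcdef0",
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "r7i-evaluation"}],
    }],
)
print(response["Instances"][0]["InstanceId"])
```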
AWS Weekly Roundup – EBS Status Check, Textract Custom Queries, Amazon Linux 2, and more – October 16, 2023
This week, Amazon announced several new features designed to improve developers’ experience with AWS, including enhanced EBS volume status checks, Amazon Textract Custom Queries, and news about Amazon Linux 2. Amazon also introduced Amazon MSK Replicator, which allows users to replicate data from one MSK cluster to another, and added the memory-optimized, seventh-generation x86-based Amazon EC2 R7i instances to its lineup. With the help of KeyCore, developers can make sure they are taking full advantage of all these features while keeping their applications secure, reliable, and scalable.
Read the full blog posts from AWS
- Rotate Your SSL/TLS Certificates Now – Amazon RDS and Amazon Aurora Expire in 2024
- Introducing Amazon MSK Replicator – Fully Managed Replication across MSK Clusters in Same or Different AWS Regions
- New Customization Capability in Amazon CodeWhisperer Generates Even Better Suggestions (Preview)
- New – Seventh Generation Memory-optimized Amazon EC2 Instances (R7i)
- AWS Weekly Roundup – EBS Status Check, Textract Custom Queries, Amazon Linux 2, and more – October 16, 2023
Containers
Build ROSA Clusters with Terraform
The official Red Hat Cloud Services (RHCS) provider for Terraform is now available, allowing customers to automate the provisioning of Red Hat OpenShift Service on AWS (ROSA) clusters. Prior to this, ROSA cluster automation was only possible with the OpenShift command line interface (CLI), either by wrapping it in code or by using other services. KeyCore can help customers with their ROSA cluster provisioning, utilizing our experience with Terraform and AWS to ensure these clusters are automated in the most cost-effective and secure way possible.
PBS Speeds Deployment and Reduces Costs with AWS Fargate
A blog post co-authored by Mike Norton (VP, Cloud Services & Operations, PBS), Warrick St. Jean (Sr. Director, Solution Architect, PBS), and Brian Link (Director, Technical Operations, PBS) discusses how PBS, the American public broadcaster, used AWS Fargate to reduce deployment time and costs. With AWS Fargate, PBS was able to move its hosting to the cloud and improve the user experience using services such as Amazon Elastic Container Service (Amazon ECS) and AWS Lambda. KeyCore can help customers with their migration to the cloud, leveraging our experience with AWS Fargate and other AWS services to ensure a secure and cost-effective migration process.
Reduce Container Startup Time on Amazon EKS with Bottlerocket Data Volume
Many customers are turning to containers to deploy modern and scalable applications, but container boot time can be a challenge, especially for workloads that require large container images; data analytics and machine learning images, for instance, can exceed 1 GiB in size. To address this challenge, the Bottlerocket container-optimized OS provides a dedicated data volume that can be used on Amazon Elastic Kubernetes Service (Amazon EKS) to reduce container startup time. KeyCore can help customers reduce their container startup time on Amazon EKS by leveraging our experience with Bottlerocket and other Amazon EKS features.
Build a Multi-Tenant Chatbot with RAG Using Amazon Bedrock and Amazon EKS
Generative AI models have enabled customers to build chatbot applications that cater to a wide range of their end-customers, each with its own specialized contextual information. To run these multi-tenant applications at scale, customers require an infrastructure that is cost-efficient and familiar to their development teams. Amazon Bedrock and Amazon EKS can provide this infrastructure, allowing customers to host their chatbot applications with minimal effort. KeyCore can help customers to build their multi-tenant chatbot applications by leveraging our experience with Amazon Bedrock and Amazon EKS.
Manage Scale-to-Zero Scenarios with Karpenter and Serverless
Cluster Autoscaler has been the industry standard autoscaling mechanism for Kubernetes since its early versions. However, with the complexity and number of containerized workloads increasing, customers running on Amazon Elastic Kubernetes Service (Amazon EKS) have started asking for a more flexible way to allocate compute resources to their applications. Karpenter and Serverless can provide this flexibility, allowing customers to scale down their applications to zero when not needed. KeyCore can help customers to manage their scale-to-zero scenarios by leveraging our experience with Karpenter and Serverless.
Read the full blog posts from AWS
- Build ROSA Clusters with Terraform
- PBS speeds deployment and reduces costs with AWS Fargate
- Reduce container startup time on Amazon EKS with Bottlerocket data volume
- Build a multi-tenant chatbot with RAG using Amazon Bedrock and Amazon EKS
- Manage scale-to-zero scenarios with Karpenter and Serverless
AWS Quantum Technologies Blog
Running Python Code with Amazon Braket and Temperature-Resistant Packaging for Optical Devices
Running Python Code with Amazon Braket
Amazon Braket is a service that gives researchers access to quantum computing, classical computing, and a variety of hardware technologies. Recently, Amazon Braket has made it easier for researchers to run their local Python code as an Amazon Braket Hybrid Job with minimal code changes. Through the use of a Python decorator from the Amazon Braket SDK, researchers can now execute local Python functions as Amazon Braket Hybrid Jobs with just one extra line of code.
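A minimal sketch of the decorator-based workflow might look like the following. It assumes the Amazon Braket SDK is installed, uses the SV1 simulator device ARN, and builds a simple placeholder Bell-state circuit; the exact decorator behavior should be verified against the current SDK documentation.
```python
from braket.aws import AwsDevice
from braket.circuits import Circuit
from braket.jobs import hybrid_job

SV1_ARN = "arn:aws:braket:::device/quantum-simulator/amazon/sv1"


@hybrid_job(device=SV1_ARN)
def bell_state_job(shots: int = 100):
    # Build a simple two-qubit Bell-state circuit.
    circuit = Circuit().h(0).cnot(0, 1)
    device = AwsDevice(SV1_ARN)
    result = device.run(circuit, shots=shots).result()
    return result.measurement_counts


# Calling the decorated function submits it as an Amazon Braket Hybrid Job
# instead of running locally; the extra line of code is the decorator itself.
job = bell_state_job(shots=1000)
```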
At KeyCore, we understand the importance of these advances in quantum computing, and we provide both professional services and managed services to help customers take advantage of Amazon Braket. Our team of engineers specialize in AWS and have experience in integration, configuration, and deployment of Amazon Braket.
Temperature-Resistant Packaging for Optical Devices
The AWS Quantum Technologies Blog also describes a new temperature-resistant packaging technique for optical devices. This fiber-device interface can withstand multiple cycles of cooling to cryogenic temperatures and back without introducing additional losses.
The KeyCore team is dedicated to staying on top of the latest advances in AWS and quantum computing. We provide our customers with the expertise and know-how to implement the latest technologies. Our AWS consultants understand the complexities of temperature-resistant packaging, and we can help customers to take advantage of this new technique.
At KeyCore, we specialize in helping customers get the most out of their AWS solutions. Our experts understand the complexities of the latest technologies and can provide guidance and support throughout the entire development process. Contact us today to learn more about how we can help with Amazon Braket and temperature-resistant packaging.
Read the full blog posts from AWS
- Explore quantum algorithms faster by running your local Python code as an Amazon Braket Hybrid Job with minimal code changes
- Introducing a new temperature-resistant packaging technique for optical devices
AWS Smart Business Blog
Saving Millions in Healthcare Claims Processing Costs with AI Workflows
MDaudit, a customer of Amazon Web Services (AWS), saw a challenge as an opportunity to integrate artificial intelligence (AI) into its auditing workflow and improve operations. As a growing SMB, MDaudit needed an automated system that could scale up quickly and efficiently.
The Challenge
MDaudit’s customers rely on them to process healthcare claims quickly and accurately. To ensure this, MDaudit manually audited thousands of claims and documents daily. The manual process was slow, labor-intensive, and prone to human error, and auditing even a single claim could take days.
The Solution
MDaudit decided to use AWS AI services to automate their process and reduce the time it took to audit a claim. They used Amazon Comprehend to extract key information from the documents. Then, they built an Amazon SageMaker machine learning model to accurately detect anomalies in the claims. This allowed them to quickly identify fraudulent or incorrect claims in an automated manner. Finally, MDaudit used Amazon Rekognition to scan the documents for further validation.
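The post does not include code, but a first step like the entity extraction described above could look roughly like this with boto3; the claim text is obviously a stand-in for a real document.
```python
import boto3

comprehend = boto3.client("comprehend", region_name="us-east-1")

claim_text = "Patient John Doe was billed $1,250 for procedure code 99213 on 2023-09-14."

# Extract entities (names, dates, quantities, ...) from the claim document.
entities = comprehend.detect_entities(Text=claim_text, LanguageCode="en")

for entity in entities["Entities"]:
    print(f"{entity['Type']:12} {entity['Text']}  (score={entity['Score']:.2f})")
```
The extracted entities could then be fed into a SageMaker anomaly detection model, as described in the post.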
The Results
By automating their claims processing with AI, MDaudit was able to save time and money. Their claims processing time was reduced from days to minutes, and their accuracy increased significantly. This resulted in millions of dollars saved in healthcare claims processing costs.
How KeyCore Can Help
At KeyCore, we offer professional services and managed services to assist your business in taking advantage of the power of AI. Our experienced team can help you build a custom AI solution that fits the needs of your business. We can also help you scale up your AI solution and optimize its performance. Contact us today to get started!
Read the full blog posts from AWS
Official Database Blog of Amazon Web Services
A Guide to Utilizing Amazon Web Services for Digital Asset Management and Database Migration
Build a Web-Based Cryptocurrency Wallet Tracker with Amazon Managed Blockchain Access and Query
Companies of all sizes are increasingly offering digital asset products that enable end users to buy, sell, exchange, and track their digital assets such as cryptocurrency. Amazon Managed Blockchain Access and Query makes it easy to build cryptocurrency wallet trackers that offer users a secure and seamless experience. To use Amazon Managed Blockchain Access and Query for digital asset management, you must first create an Amazon Managed Blockchain network and configure it for the type of digital asset you are using. After that, you can configure the network to enable access and query for your users. Amazon Managed Blockchain Access and Query provides APIs for creating and managing users and their permissions. You can also build a web-based user interface to provide your users with an easy-to-use experience. With Amazon Managed Blockchain Access and Query, you can quickly and securely build a web-based cryptocurrency wallet tracker.
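As an illustrative sketch of the query side, a wallet tracker backend might call the AMB Query API with boto3 along these lines. The client name, request shape, wallet address, and token contract address are all assumptions here and should be verified against the current SDK documentation.
```python
import boto3

# Amazon Managed Blockchain Query client (service name assumed).
ambq = boto3.client("managedblockchain-query", region_name="us-east-1")

wallet = "0x0000000000000000000000000000000000000000"          # placeholder wallet address
token_contract = "0x1111111111111111111111111111111111111111"  # placeholder ERC-20 contract

# Fetch the wallet's balance for a given ERC-20 token on Ethereum mainnet.
balance = ambq.get_token_balance(
    ownerIdentifier={"address": wallet},
    tokenIdentifier={"network": "ETHEREUM_MAINNET", "contractAddress": token_contract},
)
print(balance)
```
A web front end would typically wrap calls like this in a small API layer and render the balances and recent transfers per wallet.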
Accelerate Database Migration Planning with AWS DMS Fleet Advisor
Migrating on-premises databases and analytics resources can be time consuming, requiring expertise in building inventory, designing a migration strategy, and finalizing migration targets. AWS DMS Fleet Advisor simplifies the process by providing a step-by-step guide for creating a migration plan. It also helps you understand the scope and timeline for the migration, and provides cost estimates for the process. AWS DMS Fleet Advisor can be used for both homogeneous and heterogeneous migrations, and it supports databases such as Oracle, MySQL, PostgreSQL, and Microsoft SQL Server. With AWS DMS Fleet Advisor, you can accelerate your database migration planning and reduce the time and cost associated with the process.
Monitor and Alert on DDL and DCL Changes in Amazon RDS
Using Amazon CloudWatch, Amazon RDS Performance Insights, and Enhanced Monitoring, you can easily monitor and alert on changes in your Amazon RDS databases. Amazon CloudWatch alerts are triggered when specific performance metrics are breached. This can help you quickly identify issues in your database and take the necessary action to resolve them. Additionally, Amazon RDS Performance Insights provides a visualization of your database’s performance over time that you can use to quickly spot any changes or anomalies. Finally, Enhanced Monitoring provides detailed metrics on the inner workings of your database, such as CPU utilization and memory usage. With Amazon RDS monitoring and alerts, you can quickly and easily detect and address issues in your databases.
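The post focuses on catching DDL and DCL changes specifically; as a simpler, generic illustration of the CloudWatch alerting building block, this boto3 sketch creates an alarm on an RDS instance's CPU utilization. The instance identifier and SNS topic ARN are placeholders.
```python
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="eu-west-1")

cloudwatch.put_metric_alarm(
    AlarmName="rds-high-cpu",
    Namespace="AWS/RDS",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "DBInstanceIdentifier", "Value": "my-database"}],
    Statistic="Average",
    Period=300,                 # evaluate in 5-minute windows
    EvaluationPeriods=3,        # alarm after 15 minutes above the threshold
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:eu-west-1:123456789012:dba-alerts"],  # placeholder topic
)
```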
The Convergence of AI and Digital Assets: A New Dawn for Financial Infrastructure
The financial landscape is constantly evolving, with innovation at its core. The convergence of artificial intelligence (AI) and digital assets such as cryptocurrencies, central bank digital currencies (CBDCs), and tokenized assets has the potential to revolutionize the financial industry. AI-powered technologies can be used to automate processes and reduce costs in the financial sector. Additionally, AI-driven analytics can provide insights into financial transactions and help organizations in risk management. As AI and digital assets become more widespread in the financial sector, organizations must adopt effective strategies to leverage their potential and gain a competitive edge.
Migrate IBM Db2 LUW to Amazon Aurora PostgreSQL or Amazon RDS for PostgreSQL
Database migrations from IBM Db2 LUW to Amazon Aurora PostgreSQL and Amazon RDS for PostgreSQL can be complex. It is important to understand the challenges associated with schema conversions and prepare accordingly. The native EXPORT and COPY commands can be used to perform the data migration. Additionally, you can use AWS Database Migration Service (AWS DMS) to automate the entire process, including schema conversion, data replication, and validation. AWS DMS also supports homogeneous and heterogeneous migrations, so it is a great option for migrating from IBM Db2 LUW to Amazon Aurora PostgreSQL or Amazon RDS for PostgreSQL.
Enable and Optimize Audits on Amazon RDS for SQL Server
Many regulations require organizations to maintain database audit logs to demonstrate compliance with various data privacy and regulatory obligations. With Amazon RDS for SQL Server, you can set up SQL Server Audit by adding the audit option to the instance’s option group and configuring a destination Amazon S3 bucket; Amazon RDS then uploads the completed audit logs to that bucket. This allows you to easily store, monitor, and analyze your audit logs and keep your databases compliant.
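A rough boto3 sketch of enabling that option could look like the following. The option group name, bucket ARN, IAM role, and the exact option setting names are assumptions; check the current RDS for SQL Server audit documentation before using them.
```python
import boto3

rds = boto3.client("rds", region_name="eu-west-1")

# Add the SQL Server audit option to an existing option group (all names are placeholders).
rds.modify_option_group(
    OptionGroupName="sqlserver-ee-15-audit",
    OptionsToInclude=[{
        "OptionName": "SQLSERVER_AUDIT",
        "OptionSettings": [
            # Assumed setting names: destination bucket and the role RDS uses to write to it.
            {"Name": "S3_BUCKET_ARN", "Value": "arn:aws:s3:::my-audit-logs"},
            {"Name": "IAM_ROLE_ARN", "Value": "arn:aws:iam::123456789012:role/rds-audit-role"},
        ],
    }],
    ApplyImmediately=True,
)
```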
Unlock the Power of the AWS CLI for Amazon RDS and Amazon Aurora
The AWS Command Line Interface (AWS CLI) makes it easy to manage Amazon RDS and Amazon Aurora from the command line. You can use shell commands to create, modify, delete, and generate reports for various objects, such as database instances, clusters, parameter groups, and more. The AWS CLI also integrates seamlessly with other AWS services, allowing you to automate tasks and streamline your database management processes. With the AWS CLI, you can unlock the full power of Amazon RDS and Amazon Aurora.
Create a Centralized Repository for Oracle AWR Reports Using Amazon EFS Integration for Amazon RDS for Oracle
Amazon RDS for Oracle simplifies the process of managing and maintaining an Oracle database. It also makes it easy to create a centralized repository for Oracle Automatic Workload Repository (AWR) reports using Amazon EFS integration. With Amazon EFS integration, you can store AWR reports on a shared file system so that they can be easily accessed and analyzed. This makes it easier to track and monitor performance changes over time.
Implement an Automated Approach for Handling AWS DMS Operational Events
AWS Database Migration Service (AWS DMS) makes it easy to migrate databases, both homogeneous and heterogeneous. However, it is important to monitor the migration process for changes and operational events. With Amazon CloudWatch, you can easily set up alarms and notifications to alert you of any changes or anomalies. Additionally, Amazon EventBridge (formerly CloudWatch Events) can be used to trigger automated responses to AWS DMS operational events. By combining these services, you can create an automated approach for managing AWS DMS operational events.
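For example, a minimal boto3 sketch of subscribing to replication task events might look like this; the subscription name and SNS topic ARN are placeholders, and the event category names should be checked against the DMS documentation.
```python
import boto3

dms = boto3.client("dms", region_name="eu-west-1")

# Notify an SNS topic whenever a replication task fails or changes state.
dms.create_event_subscription(
    SubscriptionName="dms-task-events",
    SnsTopicArn="arn:aws:sns:eu-west-1:123456789012:dms-alerts",  # placeholder topic
    SourceType="replication-task",
    EventCategories=["failure", "state change"],  # assumed category names
    Enabled=True,
)
```
A Lambda function subscribed to the same topic could then restart the task, open a ticket, or page the on-call engineer automatically.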
Deploy Amazon RDS Custom for Oracle Using Terraform Modules
Amazon RDS Custom for Oracle allows you to customize the operating system and database environment for your applications. It also integrates seamlessly with other AWS services, making it easy to deploy and manage. With Terraform modules, you can automate the deployment of Amazon RDS Custom for Oracle. Terraform modules can help you configure the database and define the resources you need to launch a new instance. Additionally, you can use Terraform modules to manage existing instances and make changes to them over time. With Terraform modules, you can easily deploy and manage Amazon RDS Custom for Oracle.
Visualize Ethereum ERC20 Token Data Using Amazon Managed Blockchain Query and Amazon QuickSight
Amazon Managed Blockchain (AMB) Query and Amazon QuickSight can be used to visualize Ethereum ERC20 token data. AMB Query provides easy-to-use APIs for exploring and analyzing data from public blockchain networks such as Ethereum. You can use AMB Query to retrieve common token metrics such as top holders, daily active users, daily volume, total number of holders, and the latest transfers. Amazon QuickSight makes it easy to visualize the data from AMB Query so that you can quickly identify trends and gain insights. By combining Amazon Managed Blockchain Query and Amazon QuickSight, you can easily visualize Ethereum ERC20 token data.
Accelerate Migrations to Amazon DocumentDB Using AWS DMS
Amazon DocumentDB is a fully managed native JSON document database that makes it easy and cost effective to operate document workloads at scale. With AWS Database Migration Service (AWS DMS), you can easily migrate your databases to Amazon DocumentDB. AWS DMS supports both homogeneous and heterogeneous migrations, and it can automate the entire migration process, including schema conversion, data replication, and validation. AWS DMS also provides detailed logs and metrics that you can use to monitor the progress and success of your migration. With AWS DMS, you can accelerate your migration to Amazon DocumentDB.
KeyCore Can Help You Unlock the Potential of Amazon Web Services
At KeyCore, we provide both professional services and managed services to help organizations unlock the full potential of Amazon Web Services. Our team of experienced AWS experts is equipped to help you build a web-based cryptocurrency wallet tracker, accelerate your database migration planning, monitor and alert on DDL and DCL changes, take advantage of the convergence of AI and digital assets, migrate IBM Db2 LUW, enable and optimize audits, unlock the power of the AWS CLI, create a centralized repository for Oracle AWR reports, implement an automated approach for handling AWS DMS operational events, and accelerate migrations to Amazon DocumentDB. Contact us today to learn more about how we can help you maximize the potential of Amazon Web Services.
Read the full blog posts from AWS
- Build a web-based cryptocurrency wallet tracker with Amazon Managed Blockchain Access and Query
- Accelerate database migration planning with AWS DMS Fleet Advisor
- Monitor and alert on DDL and DCL changes in Amazon RDS for MySQL, Amazon RDS for MariaDB, and Amazon Aurora MySQL
- The convergence of AI and digital assets: A new dawn for financial infrastructure
- Migrate IBM Db2 LUW to Amazon Aurora PostgreSQL or Amazon RDS for PostgreSQL
- How to enable and optimize audits on Amazon RDS for SQL Server
- Unlock the power of the AWS CLI for Amazon RDS and Amazon Aurora
- Create a centralized repository for Oracle AWR reports using Amazon EFS integration for Amazon RDS for Oracle
- Implement an automated approach for handling AWS DMS operational events
- Deploy Amazon RDS Custom for Oracle using Terraform modules
- Visualize Ethereum ERC20 token data using Amazon Managed Blockchain Query and Amazon QuickSight
- Accelerate migrations to Amazon DocumentDB using AWS DMS
AWS Cloud Financial Management
AWS Cloud Financial Management Services Now Available
AWS has joined the FinOps Foundation as a Premier Member, underscoring its commitment to AWS Cloud Financial Management (CFM). CFM helps organizations gain better control over their cloud costs by providing visibility and cost optimization capabilities, enabling efficient and cost-effective cloud operations.
What is the FinOps Foundation?
The FinOps Foundation is a nonprofit organization that provides professional certification and resources to FinOps teams. FinOps teams are responsible for managing and controlling cloud costs in organizations, and the Foundation offers professional certifications, training, and guidance to ensure successful cost management.
What Benefits Does AWS CFM Offer?
AWS CFM offers a number of benefits to organizations. It provides visibility into cloud costs, allowing organizations to identify trends and opportunities to optimize their costs. It also provides cost optimization capabilities, which allow organizations to design and implement cost optimization strategies. Additionally, AWS CFM provides an industry-leading cost management solution that enables organizations to efficiently manage their cloud operations.
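Much of this cost visibility is also available programmatically. As a small illustration, the Cost Explorer API can pull month-to-date spend per service with a few lines of boto3; the date range below is a placeholder.
```python
import boto3

ce = boto3.client("ce", region_name="us-east-1")

# Month-to-date unblended cost, grouped by service.
response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2023-10-01", "End": "2023-10-16"},  # placeholder dates
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

for group in response["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
    print(f"{service}: {amount:.2f} USD")
```
Reports like this are often the starting point for the trend analysis and optimization work that CFM encourages.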
How Can KeyCore Help?
KeyCore can help organizations get the most out of AWS CFM. Our experienced AWS professionals can provide guidance on cost optimization strategies, assist in designing and implementing cost optimization plans, and help organizations take full advantage of the cost savings opportunities offered by AWS CFM. Additionally, KeyCore can provide ongoing support and monitoring to ensure organizations’ cloud costs remain optimized.
Read the full blog posts from AWS
AWS for Games Blog
Norsfell Builds ‘Tribes of Midgard’ Viking Empire with Amazon GameLift
Norsfell is a game development studio founded in 2013 with the goal of creating new genres that bring people together. Over ten years, the team has grown from developing mobile games to building the multi-platform game "Tribes of Midgard." The highly successful game launched on July 27, 2021, and has been gaining attention for how it is hosted in the cloud with Amazon GameLift.
Why Norsfell Chose Amazon GameLift
In order for players to have smooth, lag-free gameplay, Norsfell needed a platform that could handle the scale and complexity of their game. After comparing various services, they chose GameLift due to its robust features that allowed them to run an efficient, high-performance game. In addition, the service offers features such as automated scaling, matchmaking and real-time analytics. GameLift also supports cross-platform play, enabling PC gamers to join the same session as players on PlayStation and Nintendo Switch.
Benefits of Amazon GameLift
GameLift offers Norsfell the ability to quickly deploy and operate their game while also providing a secure and stable experience for their players. The service allows the game to run on low-cost AWS infrastructure, eliminating the need to maintain and scale their own servers. This significantly reduces the game’s operational costs, allowing Norsfell to focus on creating a high-quality gaming experience for their players.
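As a small, hedged illustration of what operating on Amazon GameLift looks like from code, this boto3 snippet starts a game session on an existing fleet. The fleet ID and session name are placeholders, and production matchmaking would typically go through GameLift FlexMatch rather than direct session creation.
```python
import boto3

gamelift = boto3.client("gamelift", region_name="eu-central-1")

# Start a new game session on an existing fleet (placeholder fleet ID).
session = gamelift.create_game_session(
    FleetId="fleet-0123456789abcdef0",
    MaximumPlayerSessionCount=10,
    Name="tribes-session-example",
)
print(session["GameSession"]["GameSessionId"])
```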
How KeyCore Can Help
At KeyCore, we specialize in providing professional and managed services for AWS. Our team of experienced engineers can help you leverage the power of Amazon GameLift to deploy and operate your game, ensuring a secure and stable experience for your players. We can also help you optimize the cost of running your game, so you can focus on creating an amazing gaming experience.
Read the full blog posts from AWS
AWS Training and Certification Blog
Unlock the full potential of Amazon CodeWhisperer and develop AWS Cloud skills with AWS Jam and AWS Solutions-Focused Immersion Days
AWS Jam is a practical and interactive learning experience designed to challenge teams and individuals to apply their AWS Cloud skills in a sandbox environment. With Amazon CodeWhisperer, an AI coding assistant trained on billions of lines of code, AWS Jam can help you improve your speed and security when building applications. Plus, AWS Solutions-Focused Immersion Days offer free events and hands-on labs to further develop your AWS Cloud knowledge.
Gain practical experience building with Amazon CodeWhisperer through AWS Jam
AWS Jam is a gamified learning experience that challenges individuals or teams to use their AWS Cloud skills to solve real-world, open-ended problems in a sandbox AWS environment. By leveraging Amazon CodeWhisperer – an artificial intelligence (AI) coding companion – builders can use suggestions to quickly and securely create applications. AWS Jam offers a great way to learn how to use Amazon CodeWhisperer and gain practical experience.
Boost your AWS proficiency with Solution-Focused Immersion Days
AWS Solutions-Focused Immersion Days are a series of events that are designed to help you develop the skills needed to build, deploy, and operate your infrastructure and applications in the cloud. You can access hands-on labs in the AWS console with guidance from AWS Solutions Architects and subject matter experts. Sign up for these free events and expand your AWS Cloud knowledge.
Stellantis: driving innovation by investing in employees’ digital skills
Stellantis is dedicated to ensuring it has the tech talent required to enable their transition to zero-emission connected vehicles. To do this, they are upskilling and reskilling employees from across the company to learn a new field or update their digital, tech, and cloud skills. By inspiring organization-wide cloud adoption and unlocking innovation with its holistic cloud skilling strategy, Stellantis is making great strides in driving innovation.
How KeyCore can help
At KeyCore – Denmark’s leading AWS consultancy – we provide professional and managed services to help you get the most out of the AWS Cloud. Our team is highly experienced in AWS and can guide you through best practices and help you develop your AWS Cloud knowledge. Reach out to us and learn about how we can help you unlock the potential of Amazon CodeWhisperer and AWS Cloud services.
Read the full blog posts from AWS
- Gain practical experience building with Amazon CodeWhisperer through AWS Jam
- Boost your AWS proficiency with Solution-Focused Immersion Days
- Stellantis: driving innovation by investing in employees’ digital skills
Microsoft Workloads on AWS
Refactor to Modern .NET and Move to Linux
Microsoft’s .NET Framework was introduced in 2002, providing developers with a software platform for Windows to rapidly create business applications and simplify complex programming tasks. Organizations of all sizes have adopted .NET Framework, and thousands of developers around the globe use it to create software for their business models. The linked post looks at refactoring these applications to modern, cross-platform .NET and moving them to Linux, which can reduce Windows licensing costs and open up a broader choice of compute options on AWS.
Reserve your seat: Microsoft workloads on AWS sessions at re:Invent 2023
AWS re:Invent 2023 is less than six weeks away! With over 2,200 sessions across six venues this year, it is an excellent opportunity to expand your skill set and network of AWS enthusiasts and builders. There are many sessions designed to help developers get the most out of Microsoft workloads on AWS, including those focused on .NET Framework and Windows Server workloads. These sessions cover topics such as setting up a .NET development environment in AWS, optimizing applications and services for Windows Server, and running Windows Server containers on AWS.
At KeyCore, our experienced AWS developers and consultants can help you make the most of your Microsoft workloads on AWS. We provide professional services and managed services to help you get the most out of your Windows Server, .NET Framework, and other Microsoft workloads. We can also provide guidance on setting up and managing your .NET development environment in AWS. Contact us today to learn more about how we can help you make the most of your Microsoft workloads on AWS.
Read the full blog posts from AWS
- Refactor to Modern .NET and Move to Linux
- Reserve your seat: Microsoft workloads on AWS sessions at re:Invent 2023
Official Big Data Blog of Amazon Web Services
Unlock the Power of Data Warehousing with Amazon Web Services
Data warehousing is a powerful way to unlock the potential of your data. With Amazon Web Services (AWS), you can use a variety of services to securely store, manage, and analyze data in the cloud. In this post, we will explore how to use Amazon MSK Connect, AWS Glue, OpenSearch Service, and Amazon Redshift to make the most of your data.
Resolve Private DNS Hostnames for Amazon MSK Connect
Amazon MSK Connect is a feature of Amazon Managed Streaming for Apache Kafka (Amazon MSK) that offers a fully managed Apache Kafka Connect environment. With MSK Connect, you can deploy fully managed connectors built for Kafka Connect that move data into or pull data from popular data stores like Amazon S3 and Amazon Redshift.
Migrate Data from Azure Blob Storage to Amazon S3 Using AWS Glue
You can use AWS Glue to migrate data from Azure Blob Storage into Amazon S3. Prerequisites include subscribing to the connector in AWS Marketplace, creating and running AWS Glue for Apache Spark jobs, and understanding the differences between the Azure Data Lake Storage Gen2 Connector and the AWS Glue Connector.
SmugMug’s Durable Search Pipelines for Amazon OpenSearch Service
SmugMug has used Amazon CloudSearch since 2012 and has since moved to Amazon OpenSearch Service, which helps it store, search, share, and sell tens of billions of photos for its customers. The post describes how SmugMug built durable search pipelines that reliably index this content into OpenSearch Service.
Load Data Incrementally From Transactional Data Lakes to Data Warehouses
Data lakes and data warehouses are two of the most important data storage and management technologies in a modern data architecture. Data lakes store all of an organization’s data, regardless of its format or structure. One way to move data from a transactional data lake into a data warehouse is through incremental loading.
Enhance Your Security Posture with Amazon Redshift Admin Credentials
Amazon Redshift is a powerful data warehousing service in the cloud, and tens of thousands of AWS customers use it to run mission-critical business intelligence (BI) dashboards, queries, and reporting. With AWS Secrets Manager integration, you can store Amazon Redshift admin credentials and other sensitive information without having to access it manually.
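Once the admin credentials are managed in Secrets Manager, applications retrieve them at runtime instead of hard-coding them. The secret name below is a placeholder for whatever name the integration creates in your account.
```python
import json

import boto3

secrets = boto3.client("secretsmanager", region_name="eu-west-1")

# Fetch the Redshift admin credentials managed by Secrets Manager (placeholder secret name).
secret_value = secrets.get_secret_value(SecretId="redshift!my-cluster-admin")
credentials = json.loads(secret_value["SecretString"])

print(credentials["username"])  # never log the password itself
```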
Migrate Microsoft Azure Synapse Analytics to Amazon Redshift Using AWS SCT
AWS Schema Conversion Tool (AWS SCT) and AWS SCT data extraction agents can be used to migrate a data warehouse from Microsoft Azure Synapse to Redshift Serverless. AWS SCT makes heterogeneous database migrations predictable by automatically converting the source database code and storage objects to a format compatible with the target database.
Run Apache Hive Workloads Using Spark SQL with Amazon EMR on EKS
Apache Hive is a distributed, fault-tolerant data warehouse system that enables analytics on a massive scale. Spark SQL is an Apache Spark module for structured data processing, and it can be used to run Hive workloads with the simplicity of SQL-like queries and the exceptional speed and performance that Spark provides.
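Conceptually, a Hive-style query running under Spark SQL is just standard PySpark. A job submitted to EMR on EKS might contain something like the sketch below; the database, table, and S3 path names are placeholders.
```python
from pyspark.sql import SparkSession

# Hive support lets Spark SQL read tables defined in the Hive metastore / Glue Data Catalog.
spark = (
    SparkSession.builder
    .appName("hive-on-spark-sql-example")
    .enableHiveSupport()
    .getOrCreate()
)

# A typical Hive-style aggregation, executed by Spark SQL (placeholder table name).
daily_totals = spark.sql("""
    SELECT order_date, SUM(amount) AS total_amount
    FROM sales_db.orders
    GROUP BY order_date
""")

# Placeholder output path.
daily_totals.write.mode("overwrite").parquet("s3://my-bucket/reports/daily_totals/")
```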
Unleash the Power of Snapshot Management to Take Automated Snapshots
Snapshot Management helps you create point-in-time backups of your domain using OpenSearch Dashboards, including both data and configuration settings. You can use these snapshots to restore your cluster to a specific state, recover from potential failures, and even clone environments for testing or development purposes.
Accelerate Your Data Warehouse Migration to Amazon Redshift
Using the AWS Schema Conversion Tool (AWS SCT), you can configure, start, and manage a change data capture (CDC) migration task. CDC tasks capture ongoing changes from the source data warehouse and apply them incrementally to Amazon Redshift, so the target stays in sync during the migration. Performance can be improved by tuning the settings on the task.
Processing Large Records with Amazon Kinesis Data Streams
Kinesis Data Streams can be used to process large records. Different options for handling large records are highlighted, along with sample code, to help you get started with any of these approaches with your own workloads.
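One common approach for records above the Kinesis size limit is the claim-check pattern: store the large payload in Amazon S3 and send only a small reference through the stream. The boto3 sketch below shows the idea; the bucket and stream names are placeholders, and this is an illustration rather than the post's exact sample code.
```python
import json
import uuid

import boto3

s3 = boto3.client("s3")
kinesis = boto3.client("kinesis")

BUCKET = "my-large-payload-bucket"   # placeholder
STREAM = "orders-stream"             # placeholder


def put_large_record(payload: bytes, partition_key: str) -> None:
    """Store the payload in S3 and put a small reference record on the stream."""
    key = f"payloads/{uuid.uuid4()}"
    s3.put_object(Bucket=BUCKET, Key=key, Body=payload)

    reference = {"s3_bucket": BUCKET, "s3_key": key, "size_bytes": len(payload)}
    kinesis.put_record(
        StreamName=STREAM,
        Data=json.dumps(reference).encode("utf-8"),
        PartitionKey=partition_key,
    )


# Example: a 2 MiB payload that would not fit in a single Kinesis record.
put_large_record(b"x" * (2 * 1024 * 1024), partition_key="customer-42")
```
Consumers then read the reference from the stream and fetch the full payload from S3 when they actually need it.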
By leveraging the power of AWS, you can unlock the potential of your data. At KeyCore, our team of AWS experts can help you get the most out of your data warehousing strategy, and work with you to find a solution that meets your needs. Contact us today to learn more.
Read the full blog posts from AWS
- Resolve private DNS hostnames for Amazon MSK Connect
- Migrate data from Azure Blob Storage to Amazon S3 using AWS Glue
- SmugMug’s durable search pipelines for Amazon OpenSearch Service
- Load data incrementally from transactional data lakes to data warehouses
- Enhance your security posture by storing Amazon Redshift admin credentials without human intervention using AWS Secrets Manager integration
- Migrate Microsoft Azure Synapse Analytics to Amazon Redshift using AWS SCT
- Run Apache Hive workloads using Spark SQL with Amazon EMR on EKS
- Unleash the power of Snapshot Management to take automated snapshots using Amazon OpenSearch Service
- Accelerate your data warehouse migration to Amazon Redshift – Part 7
- Processing large records with Amazon Kinesis Data Streams
AWS Compute Blog
Maintaining Highly Available Connectivity Between On-Premises and AWS Local Zones
Organizations often rely on data replication strategies when it comes to backing up data across multiple locations. AWS Local Zones provide customers with the ability to replicate their data and maintain a local copy of it using a variety of strategies. In this article, we’ll discuss database replication, file- and object storage replication, and partner solutions for Amazon Elastic Compute Cloud (Amazon EC2) that enable customers to maintain a local copy of their data.
Database Replication
Database replication can provide customers with the ability to replicate their data using a variety of strategies. Customers running application workloads in AWS Local Zones can employ multi-master, multi-zone database replication, allowing them to maintain a local copy of their data in a different AWS Local Zone or on-premises. This approach gives customers the assurance that data is always available and in sync across multiple locations.
File and Object Storage Replication
File- and object storage replication is another option for customers to consider when it comes to maintaining a local copy of their data. With the right combination of infrastructure, customers can replicate their file- and object storage data from one AWS Local Zone to another or on-premises, ensuring that their data is always available.
Partner Solutions for Amazon EC2
AWS partners offer a variety of solutions for customers looking to replicate their data across AWS Local Zones and on-premises. These solutions enable customers to maintain a local copy of their data in an easy and efficient manner. Additionally, customers can use AWS Direct Connect to provide redundant connectivity between private networks in the cloud and on-premises environments.
Training Machine Learning Models on-Premises
Organizations may also wish to train machine learning (ML) models on-premises, using AWS Outposts rack and datasets stored locally in Amazon S3 on Outposts. With the rise of data sovereignty and privacy regulations, organizations must ensure that their data is stored in secure locations and not replicated to any other locations. Training ML models on-premises allows organizations to stay compliant with data residency guidelines, while still taking advantage of the scalability and performance of AWS Outposts.
AWS Serverless ICYMI Q3 2023
The AWS Serverless ICYMI (In Case You Missed It) Quarterly Recap provides customers with a comprehensive look into the most recent product launches, feature enhancements, blog posts, webinars, live streams, and other updates from AWS. This is an invaluable resource for customers to stay up-to-date with the latest developments from AWS.
How KeyCore Can Help
KeyCore is the leading AWS consultancy in Denmark offering both professional services and managed services. Our experienced team of AWS consultants can help organizations design, develop, and deploy applications on AWS Local Zones that are reliable, secure, and cost-effective. Our team can also help organizations stay compliant with data residency guidelines when they are training ML models on-premises using AWS Outposts rack. Contact us today to learn more about our services.
Read the full blog posts from AWS
- Maintaining a local copy of your data in AWS Local Zones
- Enabling highly available connectivity from on premises to AWS Local Zones
- Training machine learning models on premises for data residency with AWS Outposts rack
- Serverless ICYMI Q3 2023
AWS for M&E Blog
AWS for M&E Celebrates Achievements and Commitment to Expansion
Awards for Technology Advancements at IBC 2023
Amazon Web Services (AWS) celebrated several recent achievements in October, including five industry awards for technology advancements spotlighted at IBC 2023 in September. The AWS Elemental Link UHD MediaConnect Integration took home a TV Tech Product Innovation Award and TVBEurope Best of IBC Show Award. Amazon IVS Real-Time Streaming earned both awards as well.
Bardel Entertainment Commits to Expansion with AWS
Vancouver-based animation studio Bardel Entertainment has committed to expansion with AWS. The studio works with industry leaders such as Netflix, Disney+, HBO, Nickelodeon, Cartoon Network, DreamWorks, and Warner Brothers to create imaginative primetime programming for all ages, as well as content for film and commercials. This expansion will enable Bardel to take advantage of AWS’s wide range of managed services and technologies to quickly and cost-effectively scale for their needs.
M&E and Sports Programming Highlights at re:Invent 2023
AWS re:Invent is an annual event in Las Vegas that attracts thousands of attendees from around the world. The event offers an array of presentations, panel discussions, hands-on learning sessions, networking opportunities, and more. To make it easier for attendees to find relevant topics, the AWS for Media and Entertainment (M&E) team curated a list of sessions and events related to M&E and sports programming.
KeyCore Can Help Take Advantage of AWS for M&E
At KeyCore, we offer professional and managed services that help customers take advantage of AWS for media and entertainment. Our experienced consultants can help you get the most out of AWS, whether you are exploring streaming services capabilities or require help with the migration process. We provide the expertise to help you get the most out of the latest offerings from AWS. Contact us today to find out more.
Read the full blog posts from AWS
- AWS Media Services awarded industry accolades
- Bardel Entertainment commits to expansion with AWS
- M&E and Sports programming highlights at re:Invent 2023
AWS Storage Blog
Transferring Data from Google Cloud Filestore to Amazon EFS Using AWS DataSync
Organizations may need to transfer large numbers of files from one cloud provider to another for a variety of reasons such as workload migration, disaster recovery, or a requirement to process data in other clouds. Transferring this data can be tedious, as it requires end-to-end encryption, the ability to detect changes, object validation, network throttling, monitoring, and cost optimization. Fortunately, AWS DataSync can be used to smoothly transfer data from Google Cloud Filestore to Amazon EFS.
What is AWS DataSync?
AWS DataSync is a managed data transfer service that makes it easy to move large amounts of data between on-premises storage, other clouds, and AWS storage services such as Amazon S3, Amazon EFS, Amazon FSx for Windows File Server, and Amazon FSx for Lustre. AWS DataSync is optimized for high-performance transfers and copies only the data that has changed, which helps to reduce transfer time and costs.
How Does AWS DataSync Work?
AWS DataSync works by setting up a source, a destination, and a task. The source is the data that needs to be transferred, and the destination is the location where the data will be transferred. The task contains the schedule and other settings, such as the transfer rate, encryption, object validation, and network optimization.
AWS DataSync is a managed service, which means that there is no need to install, configure, or maintain any hardware or software. All data transfers are encrypted in transit and at rest, and all transfers are monitored and logged for audit purposes.
Transferring Data From Google Cloud Filestore to Amazon EFS Using AWS DataSync
AWS DataSync can be used to move data from Google Cloud Filestore to Amazon EFS. Google Cloud Filestore is a managed service that provides high-performance file storage for applications running on Google Compute Engine or Google Kubernetes Engine. Amazon EFS is a managed file storage service for Amazon EC2 instances. To transfer data from Google Cloud Filestore to Amazon EFS, users must set up an AWS DataSync task, select their source and destination, and configure their settings. The transfer can then be monitored through the AWS DataSync console.
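The broad shape of that setup can also be scripted. The boto3 sketch below assumes a DataSync agent has already been deployed with network access to the Filestore NFS share, and every ARN, hostname, and ID in it is a placeholder.
```python
import boto3

datasync = boto3.client("datasync", region_name="eu-west-1")

# Source: the Google Cloud Filestore share, reached over NFS via a deployed DataSync agent.
source = datasync.create_location_nfs(
    ServerHostname="10.0.0.5",              # placeholder Filestore IP address
    Subdirectory="/exports/share1",
    OnPremConfig={"AgentArns": ["arn:aws:datasync:eu-west-1:123456789012:agent/agent-0example"]},
)

# Destination: an Amazon EFS file system (placeholder ARNs).
destination = datasync.create_location_efs(
    EfsFilesystemArn="arn:aws:elasticfilesystem:eu-west-1:123456789012:file-system/fs-0example",
    Ec2Config={
        "SubnetArn": "arn:aws:ec2:eu-west-1:123456789012:subnet/subnet-0example",
        "SecurityGroupArns": ["arn:aws:ec2:eu-west-1:123456789012:security-group/sg-0example"],
    },
)

# Create and start the transfer task.
task = datasync.create_task(
    SourceLocationArn=source["LocationArn"],
    DestinationLocationArn=destination["LocationArn"],
    Name="filestore-to-efs",
)
execution = datasync.start_task_execution(TaskArn=task["TaskArn"])
print(execution["TaskExecutionArn"])
```
The running execution can then be followed in the DataSync console or polled via describe_task_execution, as described above.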
KeyCore’s AWS DataSync Solutions
At KeyCore, we provide professional and managed services to help customers move data using AWS DataSync. We can assist with setting up and configuring the DataSync task, choosing the right source and destination, and monitoring the transfer. Our team of experts also provides best practices and advice to ensure a successful data transfer. Contact us today to learn more about how we can help you move data using AWS DataSync.
Read the full blog posts from AWS
AWS Developer Tools Blog
Improved Initialization Patterns for the AWS SDK for .NET
The AWS SDK for .NET provides developers with a .NET-idiomatic experience for working with Amazon DynamoDB. Version 3.7.203 of AWSSDK.DynamoDBv2 introduced new initialization patterns for the document and object persistence models that improve application performance and reduce the thread contention and throttling issues that tend to occur when the first call to DynamoDB is made.
New Initialization Patterns for the AWS SDK for .NET
Previously, the document and object persistence models deferred work such as loading table metadata until the first request, which could make that first call slow and, under load, lead to thread contention and throttling. The new initialization patterns let developers construct and configure these high-level clients up front, so the setup cost is paid at a predictable point during application startup rather than on the first user-facing request.
How Does the AWS SDK for .NET Help?
The new patterns give developers explicit control over how and when the document and object persistence models are created and configured, so initialization work happens when the application chooses rather than implicitly on first use. The result is improved application performance, with throttling and long wait times on the first request eliminated.
Using the Improved Initialization Patterns with KeyCore
At KeyCore, we are committed to helping developers take advantage of the latest AWS technologies and services. We provide professional services and managed services to help you optimize your AWS applications. Our experts can help you implement the improved initialization patterns for the AWS SDK for .NET, ensuring that your application runs smoothly and efficiently. To learn more about our offerings, please visit our website www.keycore.dk.
Read the full blog posts from AWS
AWS Partner Network (APN) Blog
AWS Partner Network (APN) Blog: Strategies, Patterns, and Security Measures for Integrating Infor CloudSuite with AWS
Infor OS provides deep integration capabilities for businesses to integrate their Infor and non-Infor enterprise systems, whether they’re on-premises, in the cloud, or both. Intelligent Open Network (ION) is an interoperability and business process management platform designed to do this. This post discusses general scenarios and integration patterns while using ION. Through Infor OS, organizations can access the Infor CloudSuite, which provides a suite of enterprise applications such as customer relationship management (CRM), enterprise resource planning (ERP), and supply chain management (SCM). With ION, organizations can integrate their Infor CloudSuite applications with other cloud and on-premises applications.
AWS Partners Can Conduct a High-Quality Cloud Readiness Assessment for Customers
When migrating workloads to the cloud, the right planning, expertise, and understanding of an organization’s readiness are crucial. A cloud readiness assessment is a fundamental step in that journey, and AWS Partners have several options for conducting one with their customers. The AWS Cloud Adoption Framework provides guidance for assessing a business’s readiness for the cloud and can support this process.
Congrats to Our 2023 AWS Ambassador Award Winners and Meet the Newest AWS Ambassadors
The AWS Ambassador Program is a community of AWS experts from AWS Partner Network (APN) Services and Software Partners. These prominent leaders are passionate about sharing their AWS expertise, and the program recognizes the top AWS Ambassadors for their performance and thought leadership. The post also celebrates the AWS Ambassador Certification All-Stars, who hold every active AWS Certification.
Simplifying Amazon EKS Adoption With Comprinno’s Pre-Crafted Well-Architected EKS Package
Amazon EKS simplifies Kubernetes control plane management and integrates with other AWS services, providing access to a robust ecosystem of tools and services. Comprinno, an AWS Partner, helps enterprises transform faster with enterprise DevOps and cloud-native computing. This post explores the architecture of a pre-crafted EKS solution and how it helps ease complexity.
Intelligent Email Response Management Using Amazon Connect and TCS RemacX
TCS RemacX allows customers to integrate the email channel with an agent desktop powered by Amazon Connect. This cloud-based omnichannel agent customer experience collaboration space powered by Amazon Connect provides context while leveraging AI to assist with first contact resolution. It also adheres to an established SLA that contact center supervisors can oversee and maintain.
BeyondTrust’s Identity Security Insights SaaS Offering, Supported by AWS SaaS Factory
This post delves into the BeyondTrust Identity Security Insights solution, which provides organizations with a comprehensive understanding of identities, privileges, and access. As cloud solutions are now integral components of business strategies and automation is deeply integrated into operations, the focus has shifted towards identity security.
Say Hello to 177 AWS Competency, MSP, Service Delivery, and Service Ready Partners Added or Renewed in September
177 AWS Partners received new or renewed designations in September for AWS Service Delivery, AWS Competency, AWS Managed Service Provider (MSP), and AWS Service Ready programs. These designations span workload, solution, and industry and can help AWS customers identify top AWS Partners capable of delivering on their core business objectives.
Navigating Security Challenges and Committing to the Cloud with Axonius and AWS
Migrating to the cloud without a comprehensive and contextual understanding of assets can be challenging. Axonius Cybersecurity Asset Management and AWS migration services can help lay a foundation for a customer’s cloud migration strategy. This post outlines how they can be used together.
Authenticate Kubecost Users with Application Load Balancer and Amazon Cognito
Kubecost is a Kubernetes and cloud cost management tool that helps customers monitor, track, optimize, and govern their cloud and Kubernetes spending. Many customers are looking for a cloud-native way to expose Kubecost UI for their internal team to access the costs report. This post describes how to authenticate Kubecost users via Application Load Balancer and Amazon Cognito.
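In an Amazon EKS setup this is typically wired up through AWS Load Balancer Controller ingress annotations; as a rough sketch of the underlying mechanism (every ARN, domain prefix, and client ID below is a placeholder), the equivalent ALB listener rule can also be created directly with boto3, authenticating against a Cognito user pool before forwarding to the Kubecost target group.

```python
import boto3

elbv2 = boto3.client("elbv2", region_name="eu-west-1")

elbv2.create_rule(
    ListenerArn="arn:aws:elasticloadbalancing:eu-west-1:111122223333:listener/app/kubecost/abc123/def456",
    Priority=10,
    Conditions=[{"Field": "path-pattern", "Values": ["/*"]}],
    Actions=[
        {   # First authenticate the user against the Cognito user pool...
            "Type": "authenticate-cognito",
            "Order": 1,
            "AuthenticateCognitoConfig": {
                "UserPoolArn": "arn:aws:cognito-idp:eu-west-1:111122223333:userpool/eu-west-1_EXAMPLE",
                "UserPoolClientId": "example-client-id",
                "UserPoolDomain": "kubecost-auth",   # Cognito hosted UI domain prefix (placeholder)
            },
        },
        {   # ...then forward the authenticated request to the Kubecost target group.
            "Type": "forward",
            "Order": 2,
            "TargetGroupArn": "arn:aws:elasticloadbalancing:eu-west-1:111122223333:targetgroup/kubecost/0123456789abcdef",
        },
    ],
)
```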
Building a Data Lakehouse with Amazon S3 and Dremio on Apache Iceberg Tables
This post shows how to implement a data lakehouse using Amazon S3 and Dremio on Apache Iceberg, enabling data teams to quickly and easily keep up with data and analytics changes. Dremio, an AWS Partner, uses a data lake engine to deliver fast query speed and a self-service semantic layer operating directly against S3 data.
Simplify Application Networking with Amazon VPC Lattice and VMware Cloud on AWS
When customers migrate workloads into VMware Cloud on AWS, they need to address service-to-service connectivity requirements between existing applications and new services deployed using native AWS services. Amazon VPC Lattice can simplify inter-service communication across SDDCs and cloud-native environments, abstracting the underlying networking complexity.
Implementing a Snowflake-Centric Data Mesh on AWS for Scalability and Autonomy
A data mesh architecture is a recent approach to managing data in large organizations, improving scalability, agility, and autonomy. This post outlines an approach to implement a data mesh with Snowflake as the data platform and with many AWS services supporting all pillars of the data mesh architecture.
Integrating Mainframe Workloads into Your AWS Migration and Modernization Journey with OpenLegacy
Legacy technologies can be difficult to maintain and integrate with modern applications, channels, and architectures. Modernizing such systems requires specialized expertise and a carefully planned approach. OpenLegacy helps enterprises build APIs from legacy assets like mainframes and midrange systems, helping them untangle monolithic core systems.
KeyCore – Professional AWS Services for a Trusted Journey
Migrating your workload and integrating applications to the cloud require strategic planning, advanced expertise, and understanding of your organization’s readiness. KeyCore is an AWS Advanced Consulting Partner that provides both professional services and managed services. Our team of experienced AWS certified professionals works with you to identify, assess, and accelerate your cloud journey. We provide a seamless experience, freeing you to focus on your core business. Contact us to discuss how we can help you get the most out of your cloud migration and modernization efforts.
Read the full blog posts from AWS
- Strategies, Patterns, and Security Measures for Integrating Infor CloudSuite with AWS
- How AWS Partners Can Conduct a High-Quality Cloud Readiness Assessment for Customers
- Congrats to Our 2023 AWS Ambassador Award Winners and Meet the Newest AWS Ambassadors
- Simplifying Amazon EKS Adoption With Comprinno’s Pre-Crafted Well-Architected EKS Package
- Intelligent Email Response Management Using Amazon Connect and TCS RemacX
- BeyondTrust’s Identity Security Insights SaaS Offering, Supported by AWS SaaS Factory
- Say Hello to 177 AWS Competency, MSP, Service Delivery, and Service Ready Partners Added or Renewed in September
- Navigating Security Challenges and Committing to the Cloud with Axonius and AWS
- Authenticate Kubecost Users with Application Load Balancer and Amazon Cognito
- Building a Data Lakehouse with Amazon S3 and Dremio on Apache Iceberg Tables
- Simplify Application Networking with Amazon VPC Lattice and VMware Cloud on AWS
- Implementing a Snowflake-Centric Data Mesh on AWS for Scalability and Autonomy
- Integrating Mainframe Workloads into Your AWS Migration and Modernization Journey with OpenLegacy
AWS Cloud Enterprise Strategy Blog
Fostering a Data-Driven Culture with AWS
Creating a data-driven culture can be a daunting task. In his worldwide bestseller Thinking, Fast and Slow, Daniel Kahneman explains how humans make decisions either intuitively or analytically. In organizations, decisions driven by intuition alone tend to drag out, while decisions grounded in data are typically reached faster.
The Benefits of a Data-Driven Culture
Data-driven decision making can have many benefits for your organization. It can help you gain new insights, provide evidence to support decisions, and create trust among customers. Additionally, it can help you identify areas of improvement and measure the success of your projects.
How AWS Can Help Create a Data-Driven Culture
AWS provides a range of tools and services that can help you foster a data-driven culture. With a data lake on Amazon S3, you can store and analyze large amounts of data quickly and easily, and with Amazon SageMaker you can build machine learning models that surface insights and help you better understand customer behavior.
Tips for Creating a Data-Driven Culture
To foster a data-driven culture, your organization needs to set clear objectives, communicate those objectives to all stakeholders, and ensure those objectives are met. Additionally, you need to have a culture of data-driven decision making that encourages employees to use data to make decisions.
Working with AWS to Create a Data-Driven Culture
At KeyCore, we understand the importance of data-driven decision making. Our team of AWS certified professionals can help your organization implement the tools and services needed to create a data-driven culture. Our managed services offering can provide you with the expertise and guidance you need to use AWS services efficiently and effectively. Contact us today to learn more about how we can help you create a data-driven culture.
Read the full blog posts from AWS
AWS HPC Blog
Using Fargate with AWS Batch for Serverless Batch Architectures
Introducing Fargate and AWS Batch Support
AWS Batch recently added support for Graviton-based (Arm) and Windows containers on AWS Fargate. Fargate is a serverless compute engine for containers that lets teams focus on building applications rather than managing servers. With capabilities that were previously in preview now generally available, AWS Batch on Fargate is a powerful, fully serverless option for batch workloads.
AWS Batch on Fargate supports larger task sizes, configurable ephemeral storage, and a choice of x86, Arm (Graviton), and Windows containers, which makes batch workloads easier to deploy and scale. Customers also retain control over networking and storage configuration, which helps optimize performance and reduce cost.
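As a rough sketch of how this looks in practice (the image, roles, queue name, and sizes are placeholders), a Fargate job definition is registered with the FARGATE platform capability and a job is then submitted to a Fargate-enabled job queue:

```python
import boto3

batch = boto3.client("batch", region_name="eu-west-1")

# Register a Fargate job definition; image, role ARN, and resource sizes are placeholders.
job_def = batch.register_job_definition(
    jobDefinitionName="fargate-batch-demo",
    type="container",
    platformCapabilities=["FARGATE"],
    containerProperties={
        "image": "public.ecr.aws/amazonlinux/amazonlinux:latest",
        "command": ["echo", "hello from Fargate"],
        "resourceRequirements": [
            {"type": "VCPU", "value": "1"},
            {"type": "MEMORY", "value": "2048"},
        ],
        "executionRoleArn": "arn:aws:iam::111122223333:role/ecsTaskExecutionRole",
        "networkConfiguration": {"assignPublicIp": "ENABLED"},
    },
)

# Submit a job to a Fargate-enabled job queue (queue name is a placeholder).
batch.submit_job(
    jobName="fargate-batch-demo-run",
    jobQueue="fargate-queue",
    jobDefinition=job_def["jobDefinitionArn"],
)
```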
How KeyCore Can Help
At KeyCore, we offer professional services and managed services for customers looking to leverage AWS Batch on Fargate. Our experienced team of cloud experts can help you get started with AWS Batch on Fargate, as well as provide ongoing support and management. We can help you identify the right instance types and storage solutions to meet your needs, as well as provide advice on how to optimize performance and cost. Contact us today to learn more.
Introducing Login Nodes in AWS ParallelCluster
AWS ParallelCluster 3.7 now supports adding login nodes to clusters out of the box. Login nodes give users a dedicated place to connect, launch interactive sessions, and reach services running on the head node without putting load on the head node itself. This post discusses how to set up login nodes with AWS ParallelCluster and highlights some important tuning options for customizing the user experience.
Setting up Login Nodes
Setting up login nodes with AWS ParallelCluster is a simple process. First, configure the login node settings in the cluster configuration file, including how many login nodes to run, which instance type to use, and how users will connect. Once the cluster is created or updated with that configuration, users can connect to the login nodes over SSH; a hedged configuration sketch follows below.
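As a rough sketch only: the LoginNodes block below reflects our reading of the ParallelCluster 3.7 configuration schema, and every name, subnet, and key pair is a placeholder, so check the current configuration reference before relying on it. The snippet writes a minimal cluster configuration and creates the cluster with the pcluster CLI.

```python
import subprocess
from textwrap import dedent

# Assumed LoginNodes schema for ParallelCluster >= 3.7; verify against the docs.
config = dedent("""\
    Region: eu-west-1
    Image:
      Os: alinux2
    HeadNode:
      InstanceType: c5.xlarge
      Networking:
        SubnetId: subnet-0123456789abcdef0
      Ssh:
        KeyName: my-keypair
    LoginNodes:
      Pools:
        - Name: login
          Count: 2
          InstanceType: c5.large
          Networking:
            SubnetIds:
              - subnet-0123456789abcdef0
          Ssh:
            KeyName: my-keypair
    Scheduling:
      Scheduler: slurm
      SlurmQueues:
        - Name: compute
          ComputeResources:
            - Name: c5
              InstanceType: c5.2xlarge
              MinCount: 0
              MaxCount: 10
          Networking:
            SubnetIds:
              - subnet-0123456789abcdef0
    """)

with open("cluster-config.yaml", "w") as f:
    f.write(config)

# Create the cluster with the ParallelCluster CLI.
subprocess.run(
    ["pcluster", "create-cluster",
     "--cluster-name", "hpc-with-login-nodes",
     "--cluster-configuration", "cluster-config.yaml"],
    check=True,
)
```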
Tunable Options
AWS ParallelCluster also provides options for tuning the login node experience. For instance, you can control how many login nodes run in the pool and which users can reach them, and you can choose an instance type sized for the interactive work you expect, which keeps that load off the head node and gives users the resources they need.
How KeyCore Can Help
At KeyCore, we are experienced in helping customers get started with AWS ParallelCluster. Our team of cloud experts can provide advice and support on setting up login nodes, as well as providing ongoing support and management. We can also provide advice on how to optimize performance and cost, as well as helping you identify the right instance types and scheduler types for your application. Contact us today to learn more.
Read the full blog posts from AWS
- Why you should use Fargate with AWS Batch for your serverless batch architectures
- Introducing login nodes in AWS ParallelCluster
AWS Cloud Operations & Migrations Blog
Successful Cloud Migration with AWS
Cloud migration is a demanding but rewarding endeavor, and AWS can be a great choice for organizations looking to make the journey easier. To ensure success, it is important to know the five biggest pitfalls that cause migrations to stall and to use the right tools to mitigate them. In this post, we review those pitfalls and explain how AWS tooling makes the process simpler.
Top Five Cloud Migration Pitfalls
Stalled cloud migrations can be costly, both in terms of time and resources. To avoid a stall, you should be aware of the top five pitfalls and take action immediately if they arise. These pitfalls include:
- Data security and privacy issues
- Compatibility problems with existing applications
- Cost mismanagement and overspending
- Difficulties in transforming legacy applications to the cloud
- Inefficient use of resources
Amazon CloudWatch Synthetics and AWS Systems Manager Parameter Store
Maintaining and improving the end-user experience of dynamic websites is expensive and time-consuming. To make it easier, AWS offers Amazon CloudWatch Synthetics and AWS Systems Manager Parameter Store. CloudWatch Synthetics lets businesses create canaries that continuously monitor multiple endpoints, while Parameter Store provides a central place to keep the endpoint lists and configuration that the canaries read at run time, so the monitored sites can change without redeploying the canary.
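As a rough sketch of that pattern (the parameter name and endpoint list are placeholders, and a production canary would also use the CloudWatch Synthetics library to record steps and metrics), a canary handler might look like this:

```python
import json
import urllib.request
import boto3

ssm = boto3.client("ssm")

def handler(event, context):
    # The endpoint list lives in Parameter Store, e.g. as a JSON array of URLs.
    param = ssm.get_parameter(Name="/canaries/dynamic-site/endpoints")  # placeholder name
    endpoints = json.loads(param["Parameter"]["Value"])

    failures = []
    for url in endpoints:
        # Basic availability check; a real canary would record steps and screenshots.
        with urllib.request.urlopen(url, timeout=10) as resp:
            if resp.status >= 400:
                failures.append(url)

    if failures:
        raise RuntimeError(f"Endpoints failing: {failures}")
    return "ok"
```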
EC2 Image Builder
The AWS Well-Architected Operational Excellence Pillar recommends standardizing images, configuring them with the latest patches and hardening them to deploy securely. EC2 Image Builder simplifies this process by providing a single, automated approach for creating, managing and updating images, reducing operational overhead and making it easier to keep images up to date.
How KeyCore Can Help
At KeyCore, we are experienced in helping businesses in Denmark with their cloud migrations. Our team of AWS-certified consultants are available to help you make sure your migration process is successful. We can provide professional services to help you design a cloud migration strategy based on best practices, and our managed services team can assist with ongoing monitoring and maintenance. Contact us to learn more about how KeyCore can help you.
Read the full blog posts from AWS
- Designing a successful cloud migration: top five pitfalls and how to avoid a stall
- Observe dynamic sites with Amazon CloudWatch Synthetics and AWS Systems Manager Parameter Store
- Centralize image administration for virtual machines and containers using EC2 Image Builder
AWS for Industries
Optimizing HPC Deployments with EC2 Fleet and IBM Spectrum LSF
High performance computing (HPC) workloads are becoming increasingly complex due to the growth of big data, advanced electronic design automation (EDA) for chip design, and high-precision verification. Enterprises are turning to Amazon Web Services (AWS) to meet their compute needs in HPC. According to the Worldwide HPC in the Cloud Forecast 2020–2026 from Hyperion Research (June 2022), AWS is the largest cloud provider for HPC, with 38% market share. AWS offers EC2 Fleet and IBM Spectrum LSF to optimize HPC deployments and enable customers to easily manage complex workloads. EC2 Fleet allows customers to create flexible compute fleets with an even distribution of resources across availability zones, while IBM Spectrum LSF allows customers to manage their compute workloads with an efficient job scheduler.
Tapestry Builds a Scalable IaC Platform with Built-In Governance and Security
Global luxury fashion company Tapestry Inc. (Tapestry) is undergoing a company-wide digital transformation. To support these efforts, Tapestry wanted to modernize their legacy business applications. To do this, Tapestry first completed a lift-and-shift cloud migration to AWS in March 2021. Following this milestone, Tapestry needed to further their cloud modernization efforts and began building a scalable Infrastructure as Code (IaC) platform with built-in governance and security. To do this, Tapestry used the AWS Cloud Development Kit (CDK), an open-source software development framework, to create and deploy IaC. Tapestry also implemented the AWS CloudFormation stack set feature to enable teams to deploy applications across multiple accounts and/or regions.
iFood Modernizes its Financial Middleware to Event-Driven Architecture
The iFood finance department originally relied on a monolithic application, which made new functionality slow to develop. To solve this, iFood switched to an event-driven architecture, which accelerated development and added resilience and performance to its financial middleware. The new platform, called Digital by You (DBY), helps approximately 100,000 “food lovers”, iFood’s term for its people, access discounts and offers from local restaurants. To power DBY, iFood uses AWS Lambda for serverless compute, Amazon EventBridge for event-driven integration, and Amazon SNS for real-time messaging.
Automated Deployment of 5G RAN and Core Networks using AWS Telco Network Builder
Communication Service Providers (CSPs) are undergoing a digital transformation to deliver new connectivity services to their customers. To do this, CSPs are looking to unify their operating model and automate operations across their stack. AWS Telco Network Builder enables automated deployment of 5G Radio Access Network (RAN) and Core Networks. Through the Network Builder, CSPs can use models to define the architecture of their network, and then deploy and configure the network across multiple AWS regions and accounts. The Network Builder also enables CSPs to define custom orchestration workflows using AWS Step Functions.
University of Michigan Student Team Develops an Energy Efficient Solar Car with High Performance Computing (HPC) on AWS
The University of Michigan’s solar car team is building a car powered only by sunlight to race roughly 3,000 km across a continent in the shortest possible time, and they are using High Performance Computing (HPC) on AWS to do it. The student team runs its HPC workloads on AWS Batch because Batch can scale compute resources on demand, letting the team quickly and reliably execute the thousands of simulations needed to design the car. The team also uses AWS Compute Optimizer to identify the optimal compute resources for its HPC workloads and reduce costs.
Analyze Data Transfer and Adopt Cost-Optimized Designs to Realize Cost Savings
Programmatic advertising applications process large volumes of data (petabyte scale) to deliver personalized experiences to users, so organizations need visibility into their data transfer costs to make better decisions about their bidding process. AWS offers several tools and techniques for analyzing data transfer and adopting cost-optimized designs, including AWS Cost Explorer and the AWS Cost and Usage Report, which provide insight into data transfer costs and usage trends. Partner solutions such as NetApp Cloud Volumes ONTAP, together with services such as AWS DataSync, can further optimize data transfer.
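As a rough sketch of the analysis step (the dates are placeholders and the usage-type filter is a simple string match), the Cost Explorer API can break monthly cost down by usage type so that data transfer line items stand out:

```python
import boto3

ce = boto3.client("ce")

resp = ce.get_cost_and_usage(
    TimePeriod={"Start": "2023-09-01", "End": "2023-10-01"},   # placeholder period
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "USAGE_TYPE"}],
)

# Keep only usage types that represent data transfer and print their cost.
for group in resp["ResultsByTime"][0]["Groups"]:
    usage_type = group["Keys"][0]
    if "DataTransfer" in usage_type:
        cost = group["Metrics"]["UnblendedCost"]["Amount"]
        print(f"{usage_type}: {float(cost):.2f} USD")
```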
AWS Entity Resolution Expands Data Matching Capabilities with LiveRamp, TransUnion, and Unified ID 2.0 Integrations
AWS Entity Resolution has been helping customers match and link related records stored across multiple applications, channels and data stores using advanced matching techniques such as rule-based and machine learning (ML)-powered matching. The service has recently expanded its capabilities with integrations with LiveRamp, TransUnion, and Unified ID 2.0. Using these integrations, customers are able to match people and companies across large datasets for better customer experiences. KeyCore can provide expertise in implementing the AWS Entity Resolution service and integrating the cloud components to power up customer data matching capabilities.
Top 5 Ways Artificial Intelligence and Machine Learning Are Changing Retail
Retailers are facing intense competition and must keep up with technological advancements to remain competitive. AI/ML and eCommerce platforms can help them do this. AI/ML can be used to optimize pricing, personalize product recommendations, automate logistics, and improve customer service. Additionally, generative AI can be used to create virtual product catalogs, product images, and product descriptions. To leverage AI/ML, retailers can use AWS AI Services such as Amazon Personalize, Amazon Forecast, Amazon SageMaker, and Amazon Translate. KeyCore can provide assistance with implementing these services and help retailers take advantage of the latest AI/ML technologies.
Reimagining the Customer Experience with Generative AI – Parts 1 & 2
Generative artificial intelligence (AI) has been receiving broad attention since March 2023. It can produce human-like responses from chatbots, as well as images, videos, and music generated on demand. In Parts 1 and 2 of this series, Reimagining the Customer Experience with Generative AI, the authors explore how generative AI can be used to create virtual product catalogs, product images, and product descriptions, and discuss how business leaders can adopt it to drive their business forward.
To learn more about how KeyCore can help businesses leverage AWS services for their industries, contact us today. Our team of AWS experts can provide the latest insights and strategies to help businesses achieve their goals.
Read the full blog posts from AWS
- Optimizing HPC deployments with EC2 Fleet and IBM Spectrum LSF
- Tapestry Builds a Scalable IaC Platform with Built-In Governance and Security
- iFood modernizes its financial middleware to event-driven architecture
- Automated Deployment of 5G RAN and Core Networks using AWS Telco Network Builder
- University of Michigan student team develops an energy efficient solar car with High Performance Computing (HPC) on AWS
- Analyze Data Transfer and adopt cost optimized designs to realize cost savings
- AWS Entity Resolution Expands Data Matching Capabilities with LiveRamp, TransUnion, and Unified ID 2.0 Integrations
- Top 5 ways artificial intelligence and machine learning are changing retail
- Reimagining the customer experience with generative AI – part 2
- Reimagining the customer experience with generative AI – part 1
AWS Marketplace
AWS Marketplace Seller Conference 2023 Recap
The 2023 AWS Marketplace Seller Conference took place on September 12 in Bellevue, Washington. The day-long event brought together independent software vendors (ISVs), data providers (DPs), and channel partners (CPs) that are selling in AWS Marketplace. The conference had four goals: (1) Create a forum for sellers to learn from AWS experts, (2) Share best practices from successful sellers, (3) Connect with other sellers, and (4) Introduce new products and services.
Expert presentations and panel discussions
The conference was filled with educational content from AWS experts, such as product launches and best practices to help sellers succeed. The topics ranged from the basics of selling on AWS Marketplace to advanced selling strategies. The keynote was delivered by AWS GM of Marketplace and Ecosystem, Ken Ehrhart.
The expert presentations and panel discussions were supplemented with a series of smaller breakout sessions focused on specific topics. These included topics such as how to optimize pricing and how to create marketing campaigns that support product launches.
Workshops and networking
The day was capped off with an engaging set of workshops and networking opportunities. The workshops allowed sellers to work with AWS experts to gain insights into the latest products and services. Sellers also had an opportunity to network with each other and build relationships that will help them in their business.
Key Takeaways
The 2023 AWS Marketplace Seller Conference was a great opportunity for independent software vendors, data providers, and channel partners to learn from AWS experts, share best practices, and network with other sellers. The event was filled with educational content and workshops, which will help sellers succeed in their businesses.
How KeyCore can Help
At KeyCore, we offer professional and managed services for AWS Marketplace. Our experts can help you develop, deploy, and manage your products on AWS Marketplace. We can help you create marketing campaigns to drive traffic and optimize pricing for maximum ROI. Our team also provides the necessary support and guidance to ensure your success on AWS Marketplace. Get in touch today to learn more about how we can help you succeed.
Read the full blog posts from AWS
The latest AWS security, identity, and compliance launches, announcements, and how-to posts.
IAM Roles Anywhere with an external certificate authority
AWS Identity and Access Management (IAM) Roles Anywhere lets workloads running outside of AWS obtain temporary AWS credentials by presenting X.509 certificates issued by a certificate authority (CA). Faraz Angabini dives deeper into this topic in his blog post Extend AWS IAM roles to workloads outside of AWS with IAM Roles Anywhere. With IAM Roles Anywhere, users get the benefits IAM provides, such as temporary credentials with a limited lifetime, without having to create IAM users and distribute long-lived access keys.
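As a rough sketch of the setup side (the PEM file, role ARN, and names are placeholders, and the field names should be checked against the IAM Roles Anywhere API reference), the external CA’s certificate bundle is registered as a trust anchor and a profile maps authenticated certificates to an IAM role; workloads then exchange their certificates for temporary credentials through the Roles Anywhere credential helper.

```python
import boto3

rolesanywhere = boto3.client("rolesanywhere", region_name="eu-west-1")

# Register the external CA certificate bundle as a trust anchor (PEM path is a placeholder).
with open("external-ca.pem") as f:
    ca_bundle = f.read()

trust_anchor = rolesanywhere.create_trust_anchor(
    name="external-ca",
    enabled=True,
    source={
        "sourceType": "CERTIFICATE_BUNDLE",
        "sourceData": {"x509CertificateData": ca_bundle},
    },
)

# Create a profile that maps authenticated certificates to an IAM role.
profile = rolesanywhere.create_profile(
    name="external-workloads",
    enabled=True,
    roleArns=["arn:aws:iam::111122223333:role/WorkloadRole"],
)

print(trust_anchor["trustAnchor"]["trustAnchorArn"], profile["profile"]["profileArn"])
```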
AWS Security Profile with Liam Wadman
Liam Wadman, Senior Solutions Architect at AWS Identity, is featured in the AWS Security Profile series. In this profile, Liam shares how he makes informed decisions about risk and reward. In his role at AWS, Liam has been creating, promoting, and supporting security-focused products and services for over seven years, and he continues to strive to make the cloud a safer place.
Securing generative AI
Generative artificial intelligence (AI) has been revolutionizing customer experiences in organizations of all sizes. This new-found power is enabled by multi-billion-parameter large language models (LLMs) built on transformer neural networks. Organizations are starting to take advantage of these capabilities, but new possibilities bring new security risks. To help protect against them, AWS created the Generative AI Security Scoping Matrix, a framework that helps organizations identify, control, and protect against potential risks and threats.
AWS Cloud Companion Guide for the CSA Cyber Trust mark
AWS has released a Cloud Companion Guide to help customers prepare for the Cyber Trust mark created and maintained by the Cyber Security Agency of Singapore (CSA). This guide provides a mapping of AWS services and features to help customers meet the requirements of the CSA’s Cyber Trust mark. AWS’s Cloud Companion Guide helps customers easily identify and use the features and services that meet their security and compliance requirements.
KeyCore’s Professional Services
At KeyCore, we understand the importance of security and compliance when it comes to AWS. Our professional services team combines their deep knowledge of AWS with years of experience in the industry to provide tailored solutions to meet the needs of any customer. With our experienced and certified AWS consultants, KeyCore is able to help customers navigate the complexities of AWS, secure their environment, and ensure their application meets the necessary compliance requirements.
Read the full blog posts from AWS
- IAM Roles Anywhere with an external certificate authority
- AWS Security Profile: Liam Wadman, Senior Solutions Architect, AWS Identity
- Securing generative AI: An introduction to the Generative AI Security Scoping Matrix
- AWS announces Cloud Companion Guide for the CSA Cyber Trust mark
Front-End Web & Mobile
Unleash the power of AWS AppSync to Query Heterogeneous Data Sources through GraphQL APIs
AWS AppSync and Amazon API Gateway are managed services that allow clients to access resources stored in various data sources. This article shows the advantages of AppSync for external clients who need to access data through GraphQL.
Introduction to GraphQL
GraphQL is a query language for APIs that lets clients ask for exactly the data they need and receive it in a single round trip, with fewer requests than a typical REST integration. It is particularly useful for applications that work across multiple platforms and make different kinds of requests. For example, a mobile app may need the latest user information from a database, while an IoT device may need to fetch sensor data from a different source.
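To make this concrete, here is a minimal sketch of an external client querying an AppSync GraphQL endpoint. The endpoint URL, API key, and schema fields are placeholders, and AppSync also supports IAM, Amazon Cognito, and OpenID Connect authorization in place of an API key.

```python
import json
import urllib.request

# Placeholders: your AppSync API endpoint and an API key created for it.
APPSYNC_URL = "https://example1234567890.appsync-api.eu-west-1.amazonaws.com/graphql"
API_KEY = "da2-exampleapikey"

# Ask for exactly the fields the client needs, in one request (hypothetical schema).
query = """
query GetOrder($id: ID!) {
  getOrder(id: $id) {
    id
    status
    items { name quantity }
  }
}
"""

payload = json.dumps({"query": query, "variables": {"id": "1234"}}).encode()
req = urllib.request.Request(
    APPSYNC_URL,
    data=payload,
    headers={"Content-Type": "application/json", "x-api-key": API_KEY},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp))
```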
Building GraphQL APIs with AWS AppSync
AWS AppSync simplifies GraphQL development by providing a managed service for creating GraphQL APIs. It allows developers to build GraphQL APIs that securely access multiple data sources behind a unified API endpoint. Through resolvers backed by services such as AWS Lambda and Amazon DynamoDB, AppSync can reach data sources ranging from relational databases to object storage, making it easier to bring together data from multiple sources.
Using AppSync, developers can build robust GraphQL APIs that are secure and scalable. AppSync can handle complex authentication and authorization, allowing developers to protect data from unauthorized access. It also supports multiple concurrent requests without overloading the system, making it ideal for large-scale applications.
Benefits of AWS AppSync
AWS AppSync offers many benefits for developers looking to build GraphQL APIs. It offers a unified endpoint for clients to access data from multiple sources, eliminating the need for multiple API calls. This makes it easier to access all data sources with a single API.
AppSync also provides powerful authentication and authorization features, allowing developers to control access to data sources. This makes it possible to securely access data across multiple sources. In addition, AppSync is highly scalable, making it easy to handle large numbers of concurrent requests without any performance issues.
Using AWS AppSync to Access Data Sources
Taken together, these capabilities let teams expose heterogeneous data sources to external clients through a single, secure, and scalable GraphQL endpoint, with the authentication and authorization controls needed to protect the underlying data, instead of maintaining a separate API for every source.
How KeyCore Can Help
At KeyCore, we are experts in AWS and offer a range of professional and managed services to help your business get the most out of AWS AppSync. Our team of AWS certified professionals can help you build and deploy powerful GraphQL APIs quickly and securely. With our expertise, you can be sure your GraphQL APIs are secure and scalable, enabling you to access data from multiple sources and provide a unified experience for your users. Contact us today to learn more about how KeyCore can help your business take advantage of the power of AWS AppSync.
Read the full blog posts from AWS
Innovating in the Public Sector
The public sector is constantly looking for new and creative ways to provide their services to citizens. Recently, organizations have started to explore the potential of cloud computing in order to improve their services and reduce costs. In this blog post, we will take a closer look at how the Alberta Motor Association (AMA) and other public sector organizations have adopted cloud computing services, such as AWS, to help them achieve their goals.
Alberta Motor Association Transforms Member Experience with AWS
The Alberta Motor Association (AMA) is a multi-service member-run organization that provides driving education, rewards, roadside assistance, travel, insurance, banking, and many other services. When AMA wanted to launch its community membership, their first true subscription-based membership, they decided to rethink their existing membership system. To do this, they adopted AWS cloud services, providing their members with a smoother, more efficient experience.
Announcing the Data Fabric Security on AWS Solution
Amazon Web Services (AWS) has developed the Data Fabric Security (DFS) on AWS solution to support the identity and access needs of multi-organization systems. The DFS on AWS solution allows federal customers to accelerate joint interoperability, modernization, and data-driven decision making in the cloud by eliminating barriers that prevent systems and users from communicating. At the same time, it also strengthens security by implementing Zero Trust principles.
Meeting Mission Goals by Modernizing Data Architecture with AWS
AWS provides several services that can help public sector organizations modernize their cloud and data architecture. To do this successfully, organizations must understand two key concepts: multi-tenancy and data federation. Multi-tenancy enables sharing of resources and costs across multiple users, while data federation is the process of combining data from multiple sources to gain new insights. AWS services such as Amazon S3, Amazon Athena, and Amazon OpenSearch Service enable organizations to apply these concepts and meet their mission goals.
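As a rough sketch of one building block of this approach (the database, table, and output location are placeholders), Amazon Athena can query data catalogued from Amazon S3 in place with standard SQL, without moving it into a separate warehouse first:

```python
import time
import boto3

athena = boto3.client("athena", region_name="eu-west-1")

# Run a query against data already catalogued from S3 (names are placeholders).
execution = athena.start_query_execution(
    QueryString="SELECT agency, COUNT(*) AS records FROM mission_data GROUP BY agency",
    QueryExecutionContext={"Database": "shared_lake"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)

# Poll until the query finishes, then fetch the result rows.
query_id = execution["QueryExecutionId"]
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    print(rows)
```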
Evaluating Long-Term Value in Migrating ERP and SIS Applications to AWS
Organizations can benefit significantly from migrating their enterprise resource planning (ERP) and student information systems (SIS) to AWS. They can reduce costs, prevent security incidents, and improve agility. Education institutions may face different decision-making contexts when selecting the cloud provider to use. This blog post explores several considerations associated with the decision and the total value that can be realized over the medium and long-term when migrating ERP and SIS applications to AWS.
Generative AI in Education: Building AI Solutions Using Course Lecture Content
Due to the pandemic, e-learning solutions have been increasingly adopted by teachers and students. These solutions have enabled quality education around the world. With the rise of AI, it is now possible to use course lecture content to build AI solutions, such as AI-assisted tutoring systems and automated essay grading. This blog post explores how organizations can take advantage of generative AI in the education sector.
At KeyCore, we provide both professional and managed services supporting organizations in their journey to the cloud. Our experienced team of AWS Certified Solutions Architects can help you assess your cloud needs, develop a migration plan, and implement the necessary solutions to meet your goals. Contact us today to learn more about how we can help your organization in its cloud journey.
Read the full blog posts from AWS
- Alberta Motor Association transforms member experience and optimizes cost on AWS Cloud
- Announcing the Data Fabric Security on AWS solution
- Meeting mission goals by modernizing data architecture with AWS
- Evaluating long-term value in migrating ERP and SIS applications to AWS
- Generative AI in education: Building AI solutions using course lecture content
The Internet of Things on AWS – Official Blog
Building A Scalable Multi-Tenant IoT Platform on AWS
Building a multi-tenant IoT platform on AWS can be a challenging process, as the architecture you choose must work for all of your customers’ scenarios. This blog post outlines an implementation strategy for building such a platform based on real-world customer use cases.
Understanding the Problem
In order to build a multi-tenant IoT platform on AWS, it is important to understand the problem that the platform is trying to solve. Most commonly, customers are looking to provide their users with the ability to manage their IoT devices and the data generated by them. An effective platform must provide customers with the ability to securely manage multiple devices and services simultaneously, while also providing the flexibility to enable and disable different features and services.
A Multi-Account Strategy
The most effective way to achieve the desired result is a multi-account strategy: each tenant gets its own AWS account, which ensures that tenant’s data is securely stored and managed in isolation. AWS Identity and Access Management (IAM) and AWS Security Token Service (STS) can then be used to create roles and policies that let a single management account securely administer all of the tenant accounts.
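As a rough sketch of that cross-account pattern (account IDs, role names, and the thing name are placeholders), a management account can assume a scoped role in a tenant account and manage that tenant’s devices with temporary credentials:

```python
import boto3

sts = boto3.client("sts")

def tenant_session(tenant_account_id: str, role_name: str = "TenantIoTAdmin") -> boto3.Session:
    """Assume a scoped role in the tenant's account and return a session with temporary credentials."""
    creds = sts.assume_role(
        RoleArn=f"arn:aws:iam::{tenant_account_id}:role/{role_name}",  # placeholder role name
        RoleSessionName="platform-management",
    )["Credentials"]
    return boto3.Session(
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )

# Manage a device in one tenant's account without sharing long-lived keys.
iot = tenant_session("222233334444").client("iot", region_name="eu-west-1")
iot.create_thing(thingName="tenant-sensor-001")  # placeholder thing name
```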
Using AWS Services
AWS provides a large number of services that can be used to build a multi-tenant IoT platform. Customers can leverage services such as Amazon Kinesis, AWS IoT Core, and Amazon Athena to process and store device data. Additionally, AWS Lambda and Amazon CloudWatch can be used to create custom applications that can be used to automate tasks such as device management, data analysis, and alerting. Finally, Amazon Cognito can be used to securely authenticate and authorize users and devices.
The Benefits of the Multi-Account Strategy
By using a multi-account strategy, customers are able to securely manage multiple tenants with a single account. Additionally, customers can create separate roles and policies for each tenant, which allows for greater control and flexibility in managing their devices and services. Finally, customers can take advantage of the scalability and reliability of AWS services, which allows them to easily manage large volumes of data and provide a reliable platform for their customers.
The Benefits of AWS Services
Using AWS services provides customers with the scalability and reliability needed to build a multi-tenant IoT platform. Additionally, AWS services provide customers with the flexibility they need to customize their platform to meet their customers’ needs. Finally, customers can take advantage of the security features provided by AWS, which ensures that their data is securely stored and managed.
KeyCore Can Help
At KeyCore, we have extensive experience in building multi-tenant IoT platforms on AWS. We can help customers design and implement the optimal architecture for their platform and ensure their data is securely stored and managed. We can also help customers customize their platform with AWS services, such as Amazon Kinesis, AWS IoT Core, and Amazon Athena, to ensure their customers get the best experience. Contact us today to learn more about how we can help you build a scalable and secure multi-tenant IoT platform on AWS.