
25 Feb 2019

To stay competitive in today’s technology world, organizations have to think of ways to keep their infrastructure automated, highly available, flexible, reproducible, and scalable for high productivity and reliability.

In this blog, we will see how AWS-powered NextGen Infrastructure as Code (IaC) helps achieve these organizational objectives compared to the traditional approach to infrastructure.

Traditional Infrastructure as Code:

Traditional IaC means enabling and managing data centers, storage, and networking manually. The respective admin sets up the disks and installs the operating systems and applications. The lead time before launch can be days or weeks. Not only is this time-consuming, it also ties up a big chunk of the workforce and comes at a higher cost. Imagine a hardware failure: the business has to wait for the manufacturer to produce, ship, and deliver a replacement. And what if the hardware malfunctions again after all that waiting? Once more, the business has to wait for a subject matter expert to handle the situation.

NextGen Infrastructure as Code:

With AWS-powered DevOps Infrastructure as Code (IaC), we can automate the entire infrastructure setup. Simply put, IaC means managing and provisioning infrastructure through code that is pushed into the operational environment. The whole development and test process flow can then deal with the complexity of a hybrid IT platform. With NextGen IaC, the MSP can automate and reproduce systems and self-document the entire infrastructure, making life easier for anyone on the team, for other teams, and for developers.
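To make this concrete, here is a minimal sketch of pushing an infrastructure definition into the environment as code. It uses Python with boto3; the template contents, stack name, and bucket name are hypothetical placeholders rather than a prescribed setup.

```python
import json
import boto3

# Hypothetical example: a tiny CloudFormation template, defined in code,
# that provisions a single S3 bucket.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "BackupBucket": {
            "Type": "AWS::S3::Bucket",
            "Properties": {"BucketName": "example-iac-backup-bucket"},
        }
    },
}

cloudformation = boto3.client("cloudformation")

# Pushing the code into the operational environment creates (or later
# updates) the real infrastructure it describes.
cloudformation.create_stack(
    StackName="example-iac-stack",
    TemplateBody=json.dumps(template),
)
```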

With traditional IaC, flexibility, elasticity, scalability, and reproducibility were a dream for stakeholders. The principles of NextGen IaC and AWS DevOps make collaboration and automation easier: it is now straightforward to build custom templates, configure repeatable changes, and deploy resources as a single service or as a group. We can even automate the scaling of resources based on traffic, as sketched below.
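As one illustration of traffic-based scaling, this hedged sketch (Python/boto3; the Auto Scaling group name and target value are assumptions) attaches a target tracking policy so the group grows and shrinks with CPU load:

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Hypothetical example: keep an existing Auto Scaling group ("web-asg")
# at roughly 50% average CPU by scaling in and out with traffic.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-asg",
    PolicyName="scale-on-cpu",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 50.0,
    },
)
```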

A critical aspect of infrastructure is disaster recovery and backup. Traditional backups run at fixed time intervals, so a failure or network latency in between can lead to data loss that affects the productivity and reliability of the organization. Cloud computing has made it simple and flexible for organizations to enhance data protection, ease of deployment, and cost efficiency.

With the right backup strategy and predefined templates, we can implement cross-region backups and recovery through automation. Using reliable AWS services such as S3 and Direct Connect, we can sync backup data at defined intervals. For Business Continuity Planning/Disaster Recovery (BCP/DR), we can use CloudFormation templates to build a highly reliable, available, and scalable or upgradeable AWS infrastructure.
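For example, a cross-region backup can be expressed as an S3 replication rule. The following sketch (Python/boto3; the bucket names and IAM role ARN are placeholders, and versioning is assumed to already be enabled on both buckets) is one way to set it up:

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical example: replicate every new object in a source bucket
# to a destination bucket in another region.
s3.put_bucket_replication(
    Bucket="example-source-bucket",
    ReplicationConfiguration={
        "Role": "arn:aws:iam::123456789012:role/example-replication-role",
        "Rules": [
            {
                "ID": "cross-region-backup",
                "Status": "Enabled",
                "Prefix": "",
                "Destination": {"Bucket": "arn:aws:s3:::example-dr-bucket"},
            }
        ],
    },
)
```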

25 Feb 2019

Machine Learning (ML) and Artificial Intelligence (AI) are two hot catchphrases in the technology arena. ML is a subset of AI, based on the idea of providing data to machines and letting them learn for themselves.

With AWS-powered ML and AI, the NextGen MSP can provide scalable infrastructure and deploy solutions through machine learning platforms, with seamless deployment and consolidated billing. An Enterprise Architecture positioned around ML and AI gives businesses faster analytics and decision making, more interaction between technology and business, greater reliability, and leverage to create new services.

We have frameworks for launching infrastructure, software, networks, and applications, and The Open Group Architecture Framework is all about the delivery part. Let’s look at the importance of Enterprise Architecture and compare the Traditional and NextGen Open Group Architecture.

Enterprise Architecture methodology is critical for aligning the concerns of IT and business. Enterprise Architecture is at the core of any organization’s productivity, agility, service, revenue growth, and cost efficiency.

Traditional Enterprise Architecture relies upon one operating model and emphasizes interdependency. For an enterprise, there is typically a mix of multiple frameworks, which is a long-term commitment with continuous improvement.

The NextGen Enterprise Architecture methodology is a pluggable architecture comprising dynamic compute resources, a common storage platform, flexible programming, real-time support, and managed deployment. The NextGen Architecture model is a business-focused model that combines enterprise architecture with business architecture, business process management, and decision management.

The core features of NextGen Architecture are instant customization of the network parser, application of complex rules to live network traffic, unlimited scalability, capture of everything in the infrastructure, threat feeds, and APIs.

The NextGen Architecture is meant to communicate in real time; for that, 90% of the running applications, software, and servers have to be completely automated. It empowers businesses with a high level of flexibility, activity monitoring, and actionable insights into cost utilization. It integrates and automates solutions that give users a plug-and-play experience.

AWS billing and cost management ensures you pay only for what you use. AWS provides features to monitor usage, along with a pricing calculator that can be used to create price estimates. AWS has a very transparent pricing model, which helps businesses allocate an appropriate budget for cloud computing.
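For instance, usage can be monitored programmatically through the Cost Explorer API. The sketch below (Python/boto3; the date range is an arbitrary assumption) groups a month’s unblended cost by service:

```python
import boto3

ce = boto3.client("ce")

# Hypothetical example: pull one month of cost and usage, grouped by service.
response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2019-01-01", "End": "2019-02-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

for group in response["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    amount = group["Metrics"]["UnblendedCost"]["Amount"]
    print(f"{service}: ${float(amount):.2f}")
```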

05 Sep 2018

Amazon SageMaker now supports version 1.10 in its pre-built TensorFlow containers. This makes it easier to run TensorFlow scripts, while taking advantage of the capabilities Amazon SageMaker offers, including a library of high-performance algorithms, managed and distributed training with automatic model tuning, one-click deployment, and managed hosting.
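As a rough sketch of how a training script might be run on the pre-built TensorFlow 1.10 container (using the SageMaker Python SDK of that period; the script name, role ARN, instance type, and S3 path are placeholders):

```python
from sagemaker.tensorflow import TensorFlow

# Hypothetical example: train a user-provided TensorFlow script on the
# pre-built TensorFlow 1.10 container managed by SageMaker.
estimator = TensorFlow(
    entry_point="train.py",
    role="arn:aws:iam::123456789012:role/example-sagemaker-role",
    framework_version="1.10.0",
    train_instance_count=1,
    train_instance_type="ml.p2.xlarge",
)

# Launch a managed training job against data stored in S3.
estimator.fit("s3://example-bucket/training-data")
```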

05 Sep 2018

AWS CloudFormation Macros perform custom processing on CloudFormation templates, from simple actions such as find-and-replace to transformation of entire templates. CloudFormation Macros use the same technology that powers the AWS::Include and AWS::Serverless transforms. CloudFormation transforms help simplify template authoring by condensing the expression of AWS infrastructure as code and enabling reuse of template components.

Previously, you could use the AWS::Include and AWS::Serverless transforms to process templates hosted by CloudFormation. Now, you can use CloudFormation Macros to create your own custom transforms. For example, you can create common string functions for templates or define short-hand syntaxes for common CloudFormation resources. Sample macros are available for reference.
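A macro is backed by a Lambda function that receives the template fragment and returns a processed one. The handler below is a hypothetical illustration of that contract, replacing a placeholder token with an upper-cased parameter value wherever it appears in the fragment:

```python
# Hypothetical example: a Lambda handler backing a CloudFormation macro.
def handler(event, context):
    fragment = event["fragment"]
    prefix = event.get("templateParameterValues", {}).get("NamePrefix", "")

    def transform(node):
        # Walk the template fragment and rewrite string values.
        if isinstance(node, dict):
            return {key: transform(value) for key, value in node.items()}
        if isinstance(node, list):
            return [transform(item) for item in node]
        if isinstance(node, str):
            return node.replace("{{NamePrefix}}", prefix.upper())
        return node

    # A macro must return the request id, a status, and the processed fragment.
    return {
        "requestId": event["requestId"],
        "status": "success",
        "fragment": transform(fragment),
    }
```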

To learn more about CloudFormation Macros, please visit AWS CloudFormation documentation.

CloudFormation Macros are available in all AWS regions that have AWS Lambda. For a full list of AWS regions where AWS Lambda is available, please visit our Region table.

05 Sep 2018

Starting today, you can enable persistent application and Windows settings for your users on AppStream 2.0. With this launch, your users’ plugins, toolbar settings, browser favorites, application connection profiles, and other settings will be saved and applied each time they start a streaming session. For example, your users can configure their plugins and toolbars for their CAD/CAM applications, and retain those settings every time they stream their application. Your users’ settings are stored in an S3 bucket you control in your AWS account.

To get started, select Stacks from the AppStream 2.0 console. Below the stacks list, choose User Settings, Application Settings Persistence, Edit. In the Application Settings Persistence dialog box, choose Enable Application Settings Persistence. To learn more about persistent application settings, see Enable Application Settings Persistence for Your AppStream 2.0 Users.
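The same setting can also be applied outside the console. The sketch below (Python/boto3; the stack name and settings group are placeholders) assumes the AppStream UpdateStack API’s application settings parameter:

```python
import boto3

appstream = boto3.client("appstream")

# Hypothetical example: enable application settings persistence on an
# existing stack ("example-stack") instead of using the console.
appstream.update_stack(
    Name="example-stack",
    ApplicationSettings={
        "Enabled": True,
        # Sessions that share this group name share saved settings.
        "SettingsGroup": "cad-users",
    },
)
```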

You can enable persistent application settings for your users at no additional charge in all AWS Regions where AppStream 2.0 is offered. However, you will be billed for the S3 storage used to store your users’ settings data. To use this feature, the AppStream 2.0 agent software on your image must be dated August 29, 2018 or newer. AppStream 2.0 offers pay-as-you-go pricing. Please see Amazon AppStream 2.0 Pricing for more information, and try our sample applications.

04 Sep 2018

AWS Config, a service that enables you to assess, audit, and evaluate the configurations of your AWS resources, announces seven new managed rules to help you evaluate whether your AWS resource configurations comply with common best practices. This allows you to simplify compliance auditing, security analysis, change management, and operational troubleshooting.
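Managed rules can be enabled with a few lines of code. As a hedged example (Python/boto3), the following turns on the existing managed rule that flags publicly readable S3 buckets:

```python
import boto3

config = boto3.client("config")

# Hypothetical example: enable an AWS-managed rule so that Config flags
# any S3 bucket that allows public read access.
config.put_config_rule(
    ConfigRule={
        "ConfigRuleName": "s3-bucket-public-read-prohibited",
        "Source": {
            "Owner": "AWS",
            "SourceIdentifier": "S3_BUCKET_PUBLIC_READ_PROHIBITED",
        },
    }
)
```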

04 Sep 2018

Amazon S3 announces feature enhancements to S3 Select. S3 Select is an Amazon S3 capability designed to pull out only the data you need from an object, which can dramatically improve the performance and reduce the cost of applications that need to access data in S3.

Today, Amazon S3 Select works on objects stored in CSV and JSON format. Based on customer feedback, we’re happy to announce S3 Select support for Apache Parquet format, JSON Arrays, and BZIP2 compression for CSV and JSON objects. We are also adding support for CloudWatch Metrics for S3 Select, which lets you monitor S3 Select usage for your applications. 
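As an illustration of the new Parquet support, the sketch below (Python/boto3; the bucket, key, and SQL expression are placeholders) selects a subset of columns from a Parquet object and streams the results back as CSV:

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical example: run a SQL expression against a Parquet object
# and retrieve only the matching rows, serialized as CSV.
response = s3.select_object_content(
    Bucket="example-bucket",
    Key="data/events.parquet",
    ExpressionType="SQL",
    Expression="SELECT s.user_id, s.amount FROM s3object s WHERE s.amount > 100",
    InputSerialization={"Parquet": {}},
    OutputSerialization={"CSV": {}},
)

# The response payload is an event stream; print the record chunks.
for event in response["Payload"]:
    if "Records" in event:
        print(event["Records"]["Payload"].decode("utf-8"))
```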
