At AWS Cloud Day 2020 in Ahmedabad, AWS architects discussed how their Cloud solutions could help startups scale
AWS Cloud Day 2020, held on February 6 in Ahmedabad, brought together AWS solution architects, industry veterans and startups for a discussion on AI/ML, SAP, migration, cloud architecture, and how Amazon’s Cloud solutions are helping startups scale. The event was packed with the latest AWS announcements, great customer stories and breakout sessions for users with different levels of technical proficiency.
Operational excellence at AWS
The event began with a keynote by Matt Fitzgerald, Principal Technical Evangelist, AWS, in which he spoke about how AWS has maintained a strong track record of operational excellence since its inception. Operational excellence, according to him, means exceeding operational goals, and anticipating problems and fixing them before they become issues.
Matt delved into the strong culture of ownership at AWS and how the leadership principles form the DNA of the company. To achieve operational excellence, AWS relies on Jeff Bezos' 'two-pizza' team rule, really good tools, and honed processes. The leadership principles at Amazon apply to every individual – right from the CEO to interns. Their tooling focuses on software deployment, which helps Amazon achieve scale. The third aspect is processes. Quoting Jeff Bezos, Matt said, "People always have good intentions. But if good intentions don’t work, good mechanisms do."
Since 2006, Amazon has grown from three offerings to 180 today. "We achieved this through operational excellence, i.e., having an operationally-focused culture, a rich set of tools, and the right processes in place," he said.
Customer success stories
AWS has several success stories of organisations that have leveraged AWS Cloud services to scale their businesses. Among them is Havmor Ice Cream, which started its journey with AWS in early 2018 when the company realised that its business stakeholders were scattered, downtime could not be afforded, and it required reliable, high-performance, scalable infrastructure to support its business needs. Partnering with AWS, the ice-cream brand has experienced faster implementation, increased business connects, high performance, reduced capex, enhanced security, and faster go-to-market, among other benefits.
"Market share and innovation at scale are the main reasons why we went with AWS," said Dhaval Mankad, Havmor.
Deepkiran Foods is another customer, one that migrated its entire infrastructure to AWS in 2013. On why they chose AWS, Brijesh Patel said that they had the opportunity to analyse various cloud computing services, but AWS stood out. "Since we follow a project management policy, it was easy to migrate and use, and it was more flexible, scalable and secure." The partnership reduced their downtime and enabled them to develop their applications much faster.
OpenXcell uses several AWS services like Amazon CloudFront, Amazon S3, AWS Lambda, Amazon ECS, among others. The team said that AWS has provided them with a fully-flexible environment where they can set up any operating system. The company has benefited from the highly secure environment, scalable performance, reduced cost of ownership, and process optimisation that AWS offers.
AWS AI and ML services
The next session, by Abhishek Mahanty, Senior Solutions Architect, AWS, covered the Artificial Intelligence (AI) and Machine Learning (ML) services provided by AWS. "AI and ML are super popular now because they are the centrepiece of digital transformation," he said.
Abhishek spoke about how the mission of AWS is to put ML in the hands of every developer. They have built three layers - AI services for application developers, ML services to build ML models, and ML frameworks and infrastructure for ML experts and practitioners.
"We realised we needed to have a proper stack, with which developers with various maturity, skill level, and expertise could relate to," he said.
AWS, he said, is the preferred platform to leverage ML as it’s built on the most comprehensive cloud platform and provides the broadest and deepest set of services, with over 200 new features and services launched last year. Apart from that, Amazon SageMaker helps simplify the way you build, train, and deploy ML models.
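To make the SageMaker workflow concrete, here is a minimal sketch of the kind of request body a developer sends to SageMaker's CreateTrainingJob API (e.g. via `boto3.client("sagemaker").create_training_job(**request)`). The bucket names, IAM role ARN, and container image URI below are hypothetical placeholders, not real resources:

```python
# Sketch: assembling a minimal CreateTrainingJob request for Amazon SageMaker.
# All ARNs, URIs, and bucket names are illustrative placeholders.
def build_training_job_request(job_name, role_arn, image_uri,
                               input_s3_uri, output_s3_uri):
    """Build a minimal CreateTrainingJob request body."""
    return {
        "TrainingJobName": job_name,
        "RoleArn": role_arn,
        "AlgorithmSpecification": {
            "TrainingImage": image_uri,       # container with the training code
            "TrainingInputMode": "File",
        },
        "InputDataConfig": [{
            "ChannelName": "train",
            "DataSource": {"S3DataSource": {
                "S3DataType": "S3Prefix",
                "S3Uri": input_s3_uri,        # where the training data lives
            }},
        }],
        "OutputDataConfig": {"S3OutputPath": output_s3_uri},
        "ResourceConfig": {
            "InstanceType": "ml.m5.large",
            "InstanceCount": 1,
            "VolumeSizeInGB": 10,
        },
        "StoppingCondition": {"MaxRuntimeInSeconds": 3600},
    }

request = build_training_job_request(
    "demo-job",
    "arn:aws:iam::123456789012:role/demo-sagemaker-role",        # hypothetical role
    "123456789012.dkr.ecr.us-east-1.amazonaws.com/demo:latest",  # hypothetical image
    "s3://demo-bucket/train/",
    "s3://demo-bucket/output/",
)
```

SageMaker then provisions the instances, runs the container against the S3 data, and writes the model artefacts back to the output path, which is the "managed" part the session highlighted.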
AWS Cloud economics
Cloud is good, but is it really cost-effective? In the next session, Harshad Satam, Manager - Territory Business, AWS, emphasised financial management in the cloud.
Cloud resources, he said, unlike those of traditional projects, are not procured every one or three years but as often as every 15 days, so it’s important to keep the cost of existing projects under control. "Since the cost of failure is low with cloud, you essentially encourage innovation, but keep cost under check," he said.
Harshad spoke about the four pillars to manage cloud cost - See, Save, Plan and Run. Some of the questions to consider are: "Users are growing, but is your cost per transaction coming down? Are you measuring and making teams accountable for what they're using? Are you using the right pricing models and services? Are you tagging resources that cost a lot?" and so on.
AWS provides tools that help businesses forecast spend and serve as a mechanism to keep costs under control.
"You can automate guard rails and set project-based budgets with AWS,” he said, adding, "If you lift and shift your workload to AWS on-demand, it will always be expensive, but that doesn’t mean that cloud is expensive. If you do it the right way, by talking to solution architects and certified partners, they can guide you."
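One way to automate such a guard rail is with the AWS Budgets CreateBudget API (callable via `boto3.client("budgets")`). The sketch below builds the request body for a budget scoped to a project cost-allocation tag, with an alert at 80% of spend; the project name, limit, and email address are hypothetical:

```python
# Sketch: a project-scoped AWS Budgets request that emails an alert when
# actual spend crosses 80% of the monthly limit. Values are illustrative.
def build_project_budget(project, monthly_limit_usd, alert_email):
    """Build a CreateBudget request body for one project's cost tag."""
    return {
        "Budget": {
            "BudgetName": f"{project}-monthly",
            "BudgetLimit": {"Amount": str(monthly_limit_usd), "Unit": "USD"},
            "TimeUnit": "MONTHLY",
            "BudgetType": "COST",
            # Only count resources tagged with this project's cost-allocation tag
            "CostFilters": {"TagKeyValue": [f"user:project${project}"]},
        },
        "NotificationsWithSubscribers": [{
            "Notification": {
                "NotificationType": "ACTUAL",
                "ComparisonOperator": "GREATER_THAN",
                "Threshold": 80.0,  # percent of the budget limit
            },
            "Subscribers": [{
                "SubscriptionType": "EMAIL",
                "Address": alert_email,
            }],
        }],
    }

budget = build_project_budget("checkout-service", 500, "finops@example.com")
```

This relies on resources being tagged consistently in the first place, which is exactly the tagging discipline the "See" pillar asks for.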
AWS Cloud security and governance
“Security is not one person’s job; organisations have to ensure that everyone contributes to it," said Premal Gandhi, Security Solutions Architect, AWS, as he took the next session on security and governance on AWS Cloud.
Customers often have different conceptions of who owns which aspects of security in the cloud. Understanding this challenge well in advance, AWS documented a Shared Responsibility Model (SRM) between AWS Cloud and the end customer. "Security is top priority for us. That’s the only way customers will trust us and continue using us to innovate," he said.
AWS is responsible for the entire infrastructure management, but securing the application is the customer’s responsibility. On why that is so, Premal said that when customers move their applications to the cloud, they would like to retain ownership and control of their data, which is a regulatory requirement. AWS's SRM gives you the freedom to store and access data your way. On behalf of the customer, AWS also hires a third-party auditor to verify that its infrastructure is in line with industry best practices.
Besides this, AWS, being a customer-driven company, provides a host of encryption services and capabilities based on customer feedback.
Building a SaaS/ISV solution on AWS
Abhishek Mahanty also took a session on why an increasing number of SaaS players are building their products on AWS. He said that there's a lot of commonality between what SaaS and AWS have to offer, including pay-for-what-you-use pricing, on-demand resources, highly durable services, scalability, and so on.
Three main reasons to consider building your SaaS solution on AWS are its flexibility, subscription-based usage, and innovative services.
Abhishek spoke about what makes SaaS applications different from regular applications. In SaaS, you have multiple tenants, and AWS helps you focus on your core IP rather than on ancillary aspects. "Scale and security are important, but we bring the tenancy flavour to the conversation," he said.
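That "tenancy flavour" can be illustrated with a common pooled multi-tenancy pattern: every stored item is keyed by tenant, so one table serves all tenants while each query stays tenant-scoped. The key scheme and helper names below are hypothetical illustrations, not an AWS API:

```python
# Sketch: pooled multi-tenancy via tenant-prefixed keys in a shared store.
# A plain dict stands in for the shared database; names are illustrative.
def tenant_key(tenant_id, entity, entity_id):
    """Compose a partition key that isolates each tenant's data."""
    return f"TENANT#{tenant_id}#{entity}#{entity_id}"

def put_item(store, tenant_id, entity, entity_id, payload):
    store[tenant_key(tenant_id, entity, entity_id)] = payload

def list_items(store, tenant_id):
    """Return only the calling tenant's items; every lookup is prefixed
    with the tenant id, so cross-tenant reads can't happen by accident."""
    prefix = f"TENANT#{tenant_id}#"
    return {k: v for k, v in store.items() if k.startswith(prefix)}

store = {}
put_item(store, "acme", "ORDER", "1", {"total": 40})
put_item(store, "globex", "ORDER", "1", {"total": 99})
acme_orders = list_items(store, "acme")  # only acme's order comes back
```

The same idea scales to real services, where the tenant identifier can also drive per-tenant metering and throttling rather than just isolation.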
Migrating SAP and purpose-built database on AWS
“Customers spend a lot of time and energy on where to run their SAP as it’s a sizable capex investment in most cases,” said Harshad Satam in his next session, as he highlighted what aspects to consider while moving SAP workloads to AWS.
Harshad said that, by virtue of being the first cloud provider to partner with SAP, AWS stands out from other platforms in the market in terms of the number of partners, market performance, and certifications.
He also delved into purpose-built databases for modern applications. "Scale, performance and availability are the three parameters you should look at when checking what kind of database fits your requirement." Every use case requires a different kind of database, and AWS has created services like Amazon Aurora, Amazon Relational Database Service (RDS), and so on.
Containers, serverless and data lakes in AWS
The event concluded with a talk by Saira Shaik, Senior Technical Account Manager, AWS, on containers and serverless on AWS. "Containers and serverless have become popular because you're pushing all the applications together in production," she said.
Since developers use a lot of tools, they want to know how quickly they can execute tasks from the command line. AWS CLI 2.0, launched at AWS re:Invent last year, enables you to entirely configure your application. You can define the dependencies and pipeline, and deploy it into production. "It will deploy all the code as per the best practices, test for the best practices and create the whole pipeline. That’s the advantage of CLI 2.0," she said.
She spoke about the operational model in AWS. Customers already have several aspects to take care of, including capacity, cost, security, compliance, and competition, so AWS didn't want to burden them with operations as well. They realised that customers want to build the application, not the infrastructure, and AWS takes care of that by ensuring that your application guides the infrastructure.
Saira also delved into data lakes and analytics on AWS and why you need them. Because of the explosion of data from multiple sources, the challenge is how much to store, who can access it, how to present the data to the right audience, how accurate it is, and so on. Data lakes solve this by ensuring that data no longer sits in silos: they bring the data together to analyse and create predictions so that businesses are better prepared.
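Once the data is pooled in a lake (typically S3), it can be queried in place, for example with Athena's StartQueryExecution API (via `boto3.client("athena")`). A minimal sketch of that request follows; the database, table, and bucket names are hypothetical:

```python
# Sketch: building an Athena StartQueryExecution request to run SQL directly
# over data-lake files in S3. Database, table, and bucket are placeholders.
def build_athena_query(database, output_s3_uri, sql):
    """Build a StartQueryExecution request body."""
    return {
        "QueryString": sql,
        "QueryExecutionContext": {"Database": database},
        # Athena writes the result set to this S3 location
        "ResultConfiguration": {"OutputLocation": output_s3_uri},
    }

params = build_athena_query(
    "sales_lake",
    "s3://demo-athena-results/",
    "SELECT region, SUM(amount) AS total FROM orders GROUP BY region",
)
```

The point of the pattern is that the analysts query one shared catalogue over the lake instead of each team extracting its own copy of the data.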
"The future of cloud is with data lakes," concluded Saira.