
Wednesday, October 18, 2017

AWS re:Invent recommended sessions


Here is a list of recommended sessions for the next AWS re:Invent, which will take place in November 2017 in Las Vegas.




Text Weekday Date Start time End time Duration Day accumulated Location Description
REGISTRATION, HELP DESK AND SWAG Sunday 11/26/2017 1:00 PM 10:00 PM 9:00 9:00 Aria, 3730 Las Vegas Fwy, Las Vegas, NV 89109, USA REGISTRATION, HELP DESK AND SWAG
ARIA, MGM GRAND, MIRAGE, THE VENETIAN
ROBOCAR RALLY MIXER Sunday 11/26/2017 6:00 PM 10:00 PM 4:00 13:00 Aria, 3730 Las Vegas Fwy, Las Vegas, NV 89109, USA ROBOCAR RALLY MIXER
ARIA
IOT321 - Continuous Integration and Continuous Deployment for IoT Using Docker Monday 11/27/2017 10:00 AM 11:00 AM 1:00 1:00 Aria, 3730 Las Vegas Fwy, Las Vegas, NV 89109, USA IOT321 - Continuous Integration and Continuous Deployment for IoT Using Docker
IoT continuous integration and continuous deployment (CI/CD) presents three unique challenges: building for different operating systems and architectures, testing and validation, and constrained resources on devices. In this session, we discuss how customers can implement a CI/CD process using AWS IoT and Docker. This session includes a demo of this system working end to end, and we also discuss common customer questions when implementing CI/CD and Docker on devices.
Monday, Nov 27, 10:00 AM - 11:00 AM – Aria, Level 1, Bluethorn 2
CON206 - Docker on AWS Monday 11/27/2017 10:45 AM 11:45 AM 1:00 2:00 Aria, 3730 Las Vegas Fwy, Las Vegas, NV 89109, USA CON206 - Docker on AWS
In this session, Docker Technical Staff Member Patrick Chanezon will discuss how Finnish Rail, the national train system for Finland, is using Docker on Amazon Web Services to modernize their customer-facing applications, from ticket sales to reservations. Patrick will also share the state of Docker development and adoption on AWS, including explaining the opportunities and implications of efforts such as Project Moby and Docker EE, and how developers can use and contribute to Docker projects.
Monday, Nov 27, 10:45 AM - 11:45 AM – Aria, Level 3, Juniper 3
MAE403 - OTT: State of Play: Lessons Learned from the Big 3: Netflix, Hulu, and Amazon Video Monday 11/27/2017 1:00 PM 2:00 PM 1:00 3:00 The Venetian, 3355 S Las Vegas Blvd, Las Vegas, NV 89109, USA MAE403 - OTT: State of Play: Lessons Learned from the Big 3: Netflix, Hulu, and Amazon Video
Every evening, video streaming consumes over 70% of the internet’s bandwidth, with demand only expected to increase as young households forego traditional pay TV for OTT services (whether live, on-demand, ad-supported, transactional, subscription, or a combination thereof). In this session, senior tech architects from Netflix, Hulu, and Amazon Video discuss lessons and best practices around hosting the largest-scale video distribution workloads to enable high-traffic consumption under demanding reliability requirements. We dive deep into using AWS compute services more effectively for video processing workloads, using the AWS network for large-scale content distribution, and using AWS storage services for actively managing large content libraries.
Monday, Nov 27, 1:00 PM - 2:00 PM – Venetian, Level 4, Delfino 4005
DAT202 - Getting Started with Amazon Aurora Monday 11/27/2017 1:45 PM 2:45 PM 1:00 4:00 The Venetian, 3355 S Las Vegas Blvd, Las Vegas, NV 89109, USA DAT202 - Getting Started with Amazon Aurora
Amazon Aurora is a MySQL- and PostgreSQL-compatible relational database engine with the speed, reliability, and availability of high-end commercial databases at one-tenth the cost. This session introduces you to Amazon Aurora, explores its capabilities and features, explains common use cases, and helps you get started with Aurora.
Monday, Nov 27, 1:45 PM - 2:45 PM – Venetian, Level 4, Marcello 4405
BCP04 - AWS Certification Exam Readiness: DevOps Engineer – Professional (Tuesday) Tuesday 11/28/2017 9:00 AM 5:00 PM 8:00 8:00 The Mirage, 3400 S Las Vegas Blvd, Las Vegas, NV 89109, USA BCP04 - AWS Certification Exam Readiness: DevOps Engineer – Professional (Tuesday)
The AWS Certified DevOps Engineer – Professional exam validates technical expertise in provisioning, operating, and managing distributed application systems on the AWS platform. Join this full-day, advanced-level bootcamp to learn how to prepare for the exam by exploring the exam’s topic areas and how they map to DevOps on AWS and to specific areas to study. We will review sample exam questions in each topic area and teach you how to interpret the concepts being tested so that you can more easily eliminate incorrect responses. This bootcamp covers the core principles of the DevOps methodology and examines a number of use cases applicable to startup, small and medium-sized business, and enterprise development scenarios. Attendees also receive a voucher for a free online practice exam.
Tuesday, Nov 28, 9:00 AM - 5:00 PM – Mirage, Grand Ballroom E
CMP204 - How Netflix Tunes Amazon EC2 Instances for Performance Tuesday 11/28/2017 11:30 AM 12:30 PM 1:00 9:00 The Venetian, 3355 S Las Vegas Blvd, Las Vegas, NV 89109, USA CMP204 - How Netflix Tunes Amazon EC2 Instances for Performance
At Netflix, we make the best use of Amazon EC2 instance types and features to create a high-performance cloud, achieving near bare-metal speed for our workloads. This session summarizes the configuration, tuning, and activities for delivering the fastest possible EC2 instances, and helps you improve performance, reduce latency outliers, and make better use of EC2 features. We show how to choose EC2 instance types, how to choose between Xen modes (HVM, PV, or PVHVM), and the importance of EC2 features such as SR-IOV for bare-metal performance. We also cover basic and advanced kernel tuning and monitoring, including the use of Java and Node.js flame graphs and performance counters.
Tuesday, Nov 28, 11:30 AM - 12:30 PM – Venetian, Level 2, Titian 2204
ABD218 - How EuroLeague Basketball Uses IoT Analytics to Engage Fans Tuesday 11/28/2017 1:00 PM 2:00 PM 1:00 10:00 Aria, Las Vegas Freeway, Las Vegas, NV, United States ABD218 - How EuroLeague Basketball Uses IoT Analytics to Engage Fans

IoT and big data have made their way out of industrial applications, general automation, and consumer goods, and are now a valuable tool for improving consumer engagement across a number of industries, including media, entertainment, and sports. The low cost and ease of implementation of AWS analytics services and AWS IoT have allowed AGT, a leader in IoT, to develop their IoTA analytics platform. Using IoTA, AGT brought a tailored solution to EuroLeague Basketball for real-time content production and fan engagement during the 2017-18 season. In this session, we take a deep dive into how this solution is architected for secure, scalable, and highly performant data collection from athletes, coaches, and fans. We also talk about how the data is transformed into insights and integrated into a content generation pipeline. Lastly, we demonstrate how this solution can be easily adapted for other industries and applications.
Tuesday, Nov 28, 1:00 PM - 2:00 PM – Aria, Level 1, Pinyon 2
ARC303 - Running Lean Architectures: How to Optimize for Cost Efficiency Tuesday 11/28/2017 2:30 PM 3:30 PM 1:00 11:00 The Venetian, 3355 S Las Vegas Blvd, Las Vegas, NV 89109, USA ARC303 - Running Lean Architectures: How to Optimize for Cost Efficiency
Whether you’re a cash-strapped startup or an enterprise optimizing spend, it pays to run cost-efficient architectures on AWS. This session reviews a wide range of cost planning, monitoring, and optimization strategies, featuring real-world experience from AWS customers. We cover how to effectively combine Amazon EC2 On-Demand, Reserved, and Spot Instances to handle different use cases; leveraging Auto Scaling to match capacity to workload; and choosing the optimal instance type through load testing. We discuss taking advantage of tiered storage and caching, offloading content to Amazon CloudFront to reduce back-end load, and getting rid of your back end entirely by going serverless. Even if you already enjoy the benefits of serverless architectures, we show you how to select the optimal AWS Lambda memory class and how to maximize networking throughput in order to minimize Lambda runtime and therefore execution cost. We also showcase simple tools to help track and manage costs, including Cost Explorer, billing alerts, and AWS Trusted Advisor. This session is your pocket guide for running cost-effectively in the AWS Cloud.
Tuesday, Nov 28, 2:30 PM - 3:30 PM – Venetian, Level 2, Venetian E
ARC207 - Monitoring Performance of Enterprise Applications on AWS: Understanding the Dynamic Nature of Cloud Computing Tuesday 11/28/2017 3:15 PM 4:15 PM 1:00 12:00 The Venetian, 3355 S Las Vegas Blvd, Las Vegas, NV 89109, USA ARC207 - Monitoring Performance of Enterprise Applications on AWS: Understanding the Dynamic Nature of Cloud Computing
Applications running in a typical data center are static entities. But applications aren't static in the cloud. Dynamic scaling and resource allocation is the norm on AWS. Technologies such as Amazon EC2, AWS Lambda, and Auto Scaling provide flexibility in building dynamic applications, and with this flexibility comes an opportunity to learn how an enterprise application functions optimally. New Relic helps manage these applications without sacrificing simplicity. In this session, we discuss changes in monitoring dynamic cloud resources. We'll share best practices we've learned working with New Relic customers on managing applications running in this environment to understand and optimize how they are performing. Session sponsored by New Relic
Tuesday, Nov 28, 3:15 PM - 4:15 PM – Venetian, Level 4, Delfino 4002
WELCOME RECEPTION Tuesday 11/28/2017 5:00 PM 7:00 PM 2:00 14:00 The Venetian, 3355 S Las Vegas Blvd, Las Vegas, NV 89109, USA WELCOME RECEPTION
THE VENETIAN, THE LINQ LOT
DEM31 - Remove Bottlenecks in Your CI/CD Pipelines Using CloudBees Jenkins Enterprise and AWS Wednesday 11/29/2017 12:00 PM 12:15 PM 0:15 0:15 The Venetian, 3355 S Las Vegas Blvd, Las Vegas, NV 89109, USA DEM31 - Remove Bottlenecks in Your CI/CD Pipelines Using CloudBees Jenkins Enterprise and AWS
This session demonstrates the key benefits of running CloudBees Jenkins Enterprise on AWS. Enterprises can onboard development projects and spin up the resources to run their CI/CD pipelines in minutes. CloudBees Jenkins Enterprise can intelligently scale in and out based on workloads, thus controlling infrastructure costs and removing typical bottlenecks from the software delivery process. Session sponsored by CloudBees
Wednesday, Nov 29, 12:00 PM - 12:15 PM – Venetian, Level 1, Expo Hall Day 1 Theater
ENT208 - From Cloud Cost Management to Financial Agility: The Journey to Success Wednesday 11/29/2017 2:30 PM 3:30 PM 1:00 1:15 MGM Grand Monorail Station, 3799 S Las Vegas Blvd, Las Vegas, NV 89109, USA ENT208 - From Cloud Cost Management to Financial Agility: The Journey to Success
Cloud is the new normal - it continues to deliver amazing new technologies and drive us to innovate our operational models. Cloud also gives us new capabilities to see the tremendous financial impact to our businesses. As you journey into the cloud, your teams will need to wield the cloud power responsibly. They will need more visibility into the financial impact of their resource usage in order to make the best decisions for today and for the longer term. Join Cloudability and HERE Technologies as we guide you through the journey to financial agility on the cloud as we have seen it play out across hundreds of customers. You’ll walk away with the steps you need to take and the mindset your teams will need to adopt in order to achieve financial agility. Topics include: - Discovering and validating your goals (needs, requirements, responsibilities) - Analyzing and closing the gaps between your goals and your reality - Developing a new operational model that provides financial agility and predictability - HERE Technologies case study Session sponsored by Cloudability
Wednesday, Nov 29, 2:30 PM - 3:30 PM – MGM, Level 1, Grand Ballroom 124
DEV307 - Mastering the AWS CLI Wednesday 11/29/2017 4:45 PM 5:45 PM 1:00 2:15 The Venetian, 3355 S Las Vegas Blvd, Las Vegas, NV 89109, USA DEV307 - Mastering the AWS CLI
The AWS CLI provides an easy-to-use command line interface to AWS and allows you to create powerful automation scripts. This talk will focus on how to utilize the advanced features of the AWS CLI to master common workflows when managing AWS resources.
Wednesday, Nov 29, 4:45 PM - 5:45 PM – Venetian, Level 5, Palazzo P
PUB CRAWL Wednesday 11/29/2017 5:30 PM 7:30 PM 2:00 4:15 The Venetian, 3355 S Las Vegas Blvd, Las Vegas, NV 89109, USA PUB CRAWL
MGM GRAND & THE VENETIAN
DEV209 - A Field Guide to Monitoring on the AWS Cloud: from Lift and Shift to AWS Lambda Thursday 11/30/2017 11:30 AM 12:30 PM 1:00 1:00 The Venetian, 3355 S Las Vegas Blvd, Las Vegas, NV 89109, USA DEV209 - A Field Guide to Monitoring on the AWS Cloud: from Lift and Shift to AWS Lambda
Static applications living on long-running servers and assumptions about monitoring are becoming history on the cloud. Now, we routinely deploy a range of services, from automatic scaling to decoupled message queues to serverless applications. These dynamic services, with microservices architectures, can break or coexist with traditional monitoring and instrumentation approaches. Whether you’re building new apps, were told to migrate yesterday, are currently migrating, or are already scaling your apps on AWS, this session dives into the how, where, and when to monitor your applications and infrastructure, no matter where your apps run. Also hear best practices we've learned from our customers, and from running our own service (1.5 billion+ metrics per minute). Join us for a little bit of history and a whole lot of now as we show you how and what you need to scale and prove your success on the AWS Cloud.   Session sponsored by New Relic
Thursday, Nov 30, 11:30 AM - 12:30 PM – Venetian, Level 5, Palazzo N
ENT338 - Automate Best Practices and Operational Health for AWS Resources with AWS Trusted Advisor and AWS Health Thursday 11/30/2017 12:15 PM 2:45 PM 2:30 3:30 MGM Grand Monorail Station, 3799 S Las Vegas Blvd, Las Vegas, NV 89109, USA ENT338 - Automate Best Practices and Operational Health for AWS Resources with AWS Trusted Advisor and AWS Health
It can be challenging to optimize AWS resources across cost, performance, security, and fault tolerance, much less do it automatically. AWS Trusted Advisor, an online resource, provides real-time guidance to help you provision your resources following AWS best practices. AWS Health provides ongoing visibility into the state of your AWS resources and remediation guidance for resource performance or availability issues that may affect your applications. Learn how to safely automate these best practices using Amazon CloudWatch Events and AWS Lambda, with samples for you to use. We also introduce you to AWS Health tools, a community-based source of tools to automate remediation actions and customize health alerts. See how to automate AWS best practices from Trusted Advisor and implement remediation from the AWS Health API on your AWS resources. Attendees should bring their own laptops.
Thursday, Nov 30, 12:15 PM - 2:45 PM – MGM, Level 3, Premiere Ballroom 318
DEV301 - Dev Tested, Ops Approved: 10 Guardrails from Atlassian for Better, Faster DevOps Thursday 11/30/2017 1:45 PM 2:45 PM 1:00 4:30 The Venetian, 3355 S Las Vegas Blvd, Las Vegas, NV 89109, USA DEV301 - Dev Tested, Ops Approved: 10 Guardrails from Atlassian for Better, Faster DevOps
Over the years, Atlassian's engineering teams have developed a set of proven and dependable DevOps practices that have allowed us to increase velocity and ship more reliably. Like many of you, Atlassian is grappling with complex, distributed teams; ever-increasing demand on our products and services; and a greater need than ever for a fast, stable release cadence and reliable uptime. This year, we're going to be sharing 10 of our dev tested, ops approved practices with you. In this session, we discuss: how Atlassian tools integrate with AWS to break down silos, increase development speed, and minimize system outages; how to scale the DevOps basics, from building a culture of collaboration to quadrupling your release cadence; and how to track the business value of what you're building, using, deploying, and repairing. Session Sponsored by Atlassian.
Thursday, Nov 30, 1:45 PM - 2:45 PM – Venetian, Level 4, Lando 4202
ABD401 - How Netflix Monitors Applications in Real Time with Amazon Kinesis Thursday 11/30/2017 4:00 PM 5:00 PM 1:00 5:30 Aria, 3730 Las Vegas Fwy, Las Vegas, NV 89109, USA ABD401 - How Netflix Monitors Applications in Real Time with Amazon Kinesis
Thousands of services work in concert to deliver millions of hours of video streams to Netflix customers every day. These applications vary in size, function, and technology, but they all make use of the Netflix network to communicate. Understanding the interactions between these services is a daunting challenge both because of the sheer volume of traffic and the dynamic nature of deployments. In this session, we first discuss why Netflix chose Kinesis Streams to address these challenges at scale. We then dive deep into how Netflix uses Kinesis Streams to enrich network traffic logs and identify usage patterns in real time. Lastly, we cover how Netflix uses this system to build comprehensive dependency maps, increase network efficiency, and improve failure resiliency. From this session, you'll learn how to build a real-time application monitoring system using network traffic logs and get real-time, actionable insights.
Thursday, Nov 30, 4:00 PM - 5:00 PM – Aria, Level 3, Juniper 4
ARC401 - Serverless Architectural Patterns and Best Practices Thursday 11/30/2017 4:45 PM 5:45 PM 1:00 6:30 The Venetian, 3355 S Las Vegas Blvd, Las Vegas, NV 89109, USA ARC401 - Serverless Architectural Patterns and Best Practices
As serverless architectures become more popular, customers need a framework of patterns to help them identify how they can leverage AWS to deploy their workloads without managing servers or operating systems. This session describes reusable serverless patterns while considering costs. For each pattern, we provide operational and security best practices and discuss potential pitfalls and nuances. We also discuss the considerations for moving an existing server-based workload to a serverless architecture. The patterns use services like AWS Lambda, Amazon API Gateway, Amazon Kinesis Streams, Amazon Kinesis Analytics, Amazon DynamoDB, Amazon S3, AWS Step Functions, AWS Config, AWS X-Ray, and Amazon Athena. This session can help you recognize candidates for serverless architectures in your own organizations and understand areas of potential savings and increased agility. What’s new in 2017: using X-Ray in Lambda for tracing and operational insight; a pattern on high performance computing (HPC) using Lambda at scale; how a query can be achieved using Athena; Step Functions as a way to handle orchestration for both the Automation and Batch patterns; a pattern for Security Automation using AWS Config rules to detect and automatically remediate violations of security standards; how to validate API parameters in API Gateway to protect your API back-ends; and a solid focus on CI/CD development pipelines for serverless, including testing, deploying, and versioning (SAM tools).
Thursday, Nov 30, 4:45 PM - 5:45 PM – Venetian, Level 2, Venetian Theatre
re:PLAY PARTY Thursday 11/30/2017 8:00 PM 12:00 AM 4:00 10:30 The Linq Hotel & Casino, 3535 S Las Vegas Blvd, Las Vegas, NV 89109, USA re:PLAY PARTY
THE PARK AT THE LINQ LOT
Total: 45:45 (45 hours 45 min) Hours: 45.75









Tuesday, October 17, 2017

How To Install Jenkins on Ubuntu 16.04

Introduction

Jenkins is an open source automation server intended to automate repetitive technical tasks involved in the continuous integration and delivery of software. Jenkins is Java-based and can be installed from Ubuntu packages or by downloading and running its Web application ARchive (WAR) file — a collection of files that make up a complete web application which is intended to be run on a server.
In this tutorial, we will install Jenkins by adding its Debian package repository, then use that repository to install the package with apt-get.

Prerequisites

To follow this tutorial, you will need:
One Ubuntu 16.04 server configured with a non-root sudo user and a firewall by following the Ubuntu 16.04 initial server setup guide. We recommend starting with at least 1 GB of RAM. See Choosing the Right Hardware for Masters for guidance in planning the capacity of a production Jenkins installation.
When the server is set up, you're ready to follow along.

Step 1 — Installing Jenkins

The version of Jenkins included with the default Ubuntu packages is often behind the latest available version from the project itself. In order to take advantage of the latest fixes and features, we'll use the project-maintained packages to install Jenkins.
First, we'll add the repository key to the system.
  • wget -q -O - https://pkg.jenkins.io/debian/jenkins-ci.org.key | sudo apt-key add -
When the key is added, the system will return OK. Next, we'll append the Debian package repository address to the server's sources.list:
  • echo deb http://pkg.jenkins.io/debian-stable binary/ | sudo tee /etc/apt/sources.list.d/jenkins.list
When both of these are in place, we'll run update so that apt-get will use the new repository:
  • sudo apt-get update
Finally, we'll install Jenkins and its dependencies, including Java:
  • sudo apt-get install jenkins
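Once the install finishes, a quick sanity check can confirm what landed. This is an optional sketch, not part of the original steps: it reports the installed jenkins package version and the Java runtime, and degrades to a note instead of failing when something is absent:

```shell
#!/bin/sh
# Optional sanity check (a sketch): report the installed jenkins package
# version and the Java runtime. Each probe prints a note instead of
# failing when the package or binary is absent.
{
  dpkg -s jenkins 2>/dev/null | grep '^Version' || echo "jenkins package not found"
  if command -v java >/dev/null 2>&1; then
    java -version 2>&1 | head -n 1
  else
    echo "java not found"
  fi
} > /tmp/jenkins_check.txt
cat /tmp/jenkins_check.txt
```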
Now that Jenkins and its dependencies are in place, we'll start the Jenkins server.

Step 2 — Starting Jenkins

Using systemctl we'll start Jenkins:
  • sudo systemctl start jenkins
Since systemctl doesn't display output, we'll use its status command to verify that it started successfully:
  • sudo systemctl status jenkins
If everything went well, the beginning of the output should show that the service is active and configured to start at boot:
Output
● jenkins.service - LSB: Start Jenkins at boot time
   Loaded: loaded (/etc/init.d/jenkins; bad; vendor preset: enabled)
   Active: active (exited) since Thu 2017-04-20 16:51:13 UTC; 2min 7s ago
     Docs: man:systemd-sysv-generator(8)
Now that Jenkins is running, we'll adjust our firewall rules so that we can reach Jenkins from a web browser to complete the initial set up.

Step 3 — Opening the Firewall

By default, Jenkins runs on port 8080, so we'll open that port using ufw:
  • sudo ufw allow 8080
We can see the new rules by checking UFW's status.
  • sudo ufw status
We should see that traffic is allowed to port 8080 from anywhere:
Output
Status: active

To                         Action      From
--                         ------      ----
OpenSSH                    ALLOW       Anywhere
8080                       ALLOW       Anywhere
OpenSSH (v6)               ALLOW       Anywhere (v6)
8080 (v6)                  ALLOW       Anywhere (v6)
Now that Jenkins is installed and the firewall allows us to access it, we can complete the initial setup.
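Before switching to the browser, a quick curl from any machine with access can confirm the port is answering. A sketch with a placeholder host: a fresh, still-locked Jenkins typically returns HTTP 403, and the block writes a note instead of failing when curl is missing:

```shell
#!/bin/sh
# Sketch: confirm Jenkins answers on port 8080 before opening a browser.
# HOST is a placeholder; a fresh, still-locked Jenkins typically returns 403.
HOST=localhost   # replace with your server's IP or domain name
if command -v curl >/dev/null 2>&1; then
  # -w prints just the status code; 000 means the connection failed
  code=$(curl -s -o /dev/null --max-time 5 -w '%{http_code}' "http://$HOST:8080")
  echo "HTTP status: $code" > /tmp/jenkins_http.txt
else
  echo "curl not installed, skipping" > /tmp/jenkins_http.txt
fi
cat /tmp/jenkins_http.txt
```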

Step 4 — Setting up Jenkins

To set up our installation, we'll visit Jenkins on its default port, 8080, using the server domain name or IP address: http://ip_address_or_domain_name:8080
We should see the "Unlock Jenkins" screen, which displays the location of the initial password.
Unlock Jenkins screen
In the terminal window, we'll use the cat command to display the password:
  • sudo cat /var/lib/jenkins/secrets/initialAdminPassword
We'll copy the 32-character alphanumeric password from the terminal and paste it into the "Administrator password" field, then click "Continue". The next screen presents the option of installing suggested plugins or selecting specific plugins.
Customize Jenkins Screen
We'll click the "Install suggested plugins" option, which will immediately begin the installation process:
Jenkins Getting Started Install Plugins Screen
When the installation is complete, we'll be prompted to set up the first administrative user. It's possible to skip this step and continue as admin using the initial password we used above, but we'll take a moment to create the user.
Note: The default Jenkins server is NOT encrypted, so the data submitted with this form is not protected. When you're ready to use this installation, follow the guide How to Configure Jenkins with SSL using an Nginx Reverse Proxy. This will protect user credentials and information about builds that are transmitted via the Web interface.
Jenkins Create First Admin User Screen
Once the first admin user is in place, you should see a "Jenkins is ready!" confirmation screen.
Jenkins is ready screen
Click "Start using Jenkins" to visit the main Jenkins dashboard:
Welcome to Jenkins Screen
At this point, Jenkins has been successfully installed.

Conclusion

In this tutorial, we've installed Jenkins using the project-provided packages, started the server, opened the firewall, and created an administrative user. At this point, you can start exploring Jenkins.
When you've completed your exploration, if you decide to continue using Jenkins, follow the guide, How to Configure Jenkins with SSL using an Nginx Reverse Proxy in order to protect passwords, as well as any sensitive system or product information that will be sent between your machine and the server in plain text.


Source : https://www.digitalocean.com/community/tutorials/how-to-install-jenkins-on-ubuntu-16-04

Monday, October 9, 2017

delete AWS snapshots older than 30 days





#!/bin/sh
source /etc/profile

# Epoch seconds for "now" and for the 30-day cutoff
TODAYINSEC=$(gdate +%s)
DATTOCOMPARE=$((TODAYINSEC - 30*24*3600))

date > SNAP_TO_KEEP.txt
date > SNAP_TO_DELETE.txt

echo "Collecting snapshot information"

while read az; do
  echo "this is az $az"
  while read owner; do
    echo "this is owner $owner"
    aws ec2 describe-snapshots --region $az --owner-ids $owner --output json > listofsnaps.txt
    # Pull out the StartTime / SnapshotId values and join each pair onto one line
    egrep "StartTime|SnapshotId" listofsnaps.txt | awk -F'"' '{print $4}' | awk 'NR%2{printf "%s, ",$0;next;}1' > listofsnaps.txt_tmp
    echo "listofsnaps for $az and owner $owner is ready"
    while read snap; do
      echo "working on snap $snap"
      raw_date=$(echo $snap | cut -d, -f1)
      snap_date=$(gdate -d $raw_date +%s)
      echo "Snap date is: $snap_date"
      echo "Date to compare is: $DATTOCOMPARE"
      if [ $DATTOCOMPARE -gt $snap_date ]; then
        echo $snap | cut -d, -f2 >> SNAP_TO_DELETE.txt
        snapToDelete=$(echo $snap | cut -d, -f2)
        echo "Deleting Snapshot: $snapToDelete"
        # Dry run by default: uncomment the next line to actually delete
        #aws ec2 delete-snapshot --region $az --snapshot-id $snapToDelete
        echo "aws ec2 delete-snapshot --region $az --snapshot-id $snapToDelete"
      else
        echo $snap | cut -d, -f2 >> SNAP_TO_KEEP.txt
      fi
    done < listofsnaps.txt_tmp
  done < owner-list.txt
done < az-list.txt
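As a side note, the egrep/awk extraction above can be pushed into the CLI itself with a JMESPath --query, which is less fragile than scraping the JSON by hand. This is a sketch rather than the script's actual method: the region and account id are hypothetical placeholders, and the block only writes a note when the AWS CLI is missing or unconfigured:

```shell
#!/bin/sh
# Sketch: let the AWS CLI emit "StartTime  SnapshotId" pairs directly with
# a JMESPath --query instead of scraping the JSON with egrep/awk.
# The region and account id are hypothetical placeholders.
az=us-east-1
owner=123456789012   # hypothetical account id
out=/tmp/snap_pairs.txt

if command -v aws >/dev/null 2>&1; then
  aws ec2 describe-snapshots --region "$az" --owner-ids "$owner" \
      --query 'Snapshots[].[StartTime,SnapshotId]' --output text > "$out" 2>/dev/null \
    || echo "aws call failed (missing credentials?)" > "$out"
else
  echo "aws CLI not installed, skipping" > "$out"
fi
cat "$out"
```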

Thursday, May 11, 2017

How to get all your AWS snapshots using one command and AWS CLI




In case you need to view all the available snapshots from all of your accounts, across all regions, using one command, this is how to do it.

Create the owner-list.txt file containing all of your AWS account numbers, one account per line, for example:

# cat owner-list.txt
483426017123 
487214417321


Then create the az-list.txt file with the following regions:

# cat az-list.txt
us-east-1 
us-east-2 
us-west-1 
us-west-2 
ca-central-1 
eu-west-1
eu-central-1
eu-west-2 
ap-southeast-1 
ap-southeast-2 
ap-northeast-2 
ap-northeast-1 
ap-south-1 
sa-east-1
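Instead of hard-coding the list, az-list.txt can also be generated from the EC2 API. This is an optional sketch that assumes a configured AWS CLI, and falls back to a short static list otherwise so anything reading az-list.txt still has a file to work with:

```shell
#!/bin/sh
# Sketch: generate az-list.txt from the EC2 API instead of hard-coding it.
# Assumes a configured AWS CLI; falls back to a short static list otherwise.
if command -v aws >/dev/null 2>&1 \
   && aws ec2 describe-regions --query 'Regions[].RegionName' --output text 2>/dev/null \
        | tr '\t' '\n' > az-list.txt \
   && [ -s az-list.txt ]; then
  echo "region list fetched from the API"
else
  printf '%s\n' us-east-1 us-east-2 us-west-1 us-west-2 eu-west-1 > az-list.txt
  echo "using static fallback region list"
fi
cat az-list.txt
```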


Now create the script itself:


# cat get-snapshot.sh
while read az; do
  echo $az
  while read owner; do
    echo $owner
    aws ec2 describe-snapshots --region $az --owner-ids $owner --output json > $owner.$az.json
  done <owner-list.txt
done <az-list.txt


The output will be a set of JSON files, one for each region and account.
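Before pointing the script at real accounts, the nested-loop structure can be verified with a dry run against throwaway sample lists (the account ids below are hypothetical): each iteration echoes the aws command it would run instead of executing it:

```shell
#!/bin/sh
# Sketch: dry-run the same nested-loop structure against throwaway sample
# lists, echoing each aws command instead of executing it.
printf '%s\n' us-east-1 eu-west-1 > /tmp/az-list.txt
printf '%s\n' 111111111111 222222222222 > /tmp/owner-list.txt   # hypothetical ids

while read az; do
  while read owner; do
    # one line per region/account pair
    echo "aws ec2 describe-snapshots --region $az --owner-ids $owner --output json > $owner.$az.json"
  done < /tmp/owner-list.txt
done < /tmp/az-list.txt > /tmp/dryrun.txt

cat /tmp/dryrun.txt
```

With 2 regions and 2 accounts, the dry run prints 4 commands, confirming every pair is visited.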




Sunday, November 27, 2016

MSSQL (SQL) on Ubuntu Linux - The easy way




Hi

Follow the next steps to install MSSQL (SQL Server) on an Ubuntu 16.04 LTS Linux machine.

NOTE: please make sure that the machine on which you are about to install MSSQL Server has at least 4 GB of RAM, otherwise the installation will fail.


1. Import the public repository GPG keys:

# wget https://packages.microsoft.com/keys/microsoft.asc --no-check-certificate
# apt-key add microsoft.asc

2. Add the Microsoft SQL Server to Ubuntu repository:

# echo "deb [arch=amd64] http://packages.microsoft.com/ubuntu/16.04/mssql-server xenial main" > /etc/apt/sources.list.d/mssql-server.list

3. Update the source repository and install MSSQL:

# sudo apt-get update
# sudo apt-get install -y mssql-server

4. After the package installation finishes, run the configuration script:

# sudo /opt/mssql/bin/sqlservr-setup



5. Type "YES" to accept the license terms.

6. Enter a password for the system administrator (SA) account, then confirm it.
 Make sure to specify a strong password for the SA account (minimum length 8 characters, including uppercase and lowercase letters, base-10 digits, and/or non-alphanumeric symbols).


7. After setup completes successfully, start the MSSQL service:

# systemctl start mssql-server

8. You may check the service status by typing: 

# systemctl status mssql-server


Now that the service is up and running, you may connect to the database and manage it with SQL Server Management Studio, using the SA username and password.
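You can also query the server straight from the shell with sqlcmd, which ships in Microsoft's mssql-tools package. A small sketch: the SA password below is a placeholder, and the block writes a note instead of failing when sqlcmd is not installed or cannot connect:

```shell
#!/bin/sh
# Sketch: query SQL Server from the shell with sqlcmd (mssql-tools package).
# SA_PASSWORD is a placeholder; the block degrades to a note when sqlcmd
# is missing or the connection fails.
SA_PASSWORD='YourStrong!Passw0rd'   # placeholder, use your real SA password
{
  if command -v sqlcmd >/dev/null 2>&1; then
    sqlcmd -S localhost -U SA -P "$SA_PASSWORD" -Q 'SELECT @@VERSION' \
      || echo "connection failed"
  else
    echo "sqlcmd not installed, skipping"
  fi
} > /tmp/mssql_check.txt 2>&1
cat /tmp/mssql_check.txt
```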



There you have it: an MSSQL server running on a Linux machine!