AWS Advanced Networking – Speciality Study Guide

I recently passed the AWS Certified Advanced Networking – Specialty exam and want to share my experiences with those of you working toward the certification.


The exam is 170 minutes with 65 questions. Compared to the professional-level exams, this felt like ample time, leaving me more than 45 minutes to review my (many) marked questions. The questions were shorter than those on the pro exams, which made it easier to consume the content and move through the exam at a good pace. The questions are a mix of scenarios and straightforward Q&A, with scenarios making up the larger proportion.

Going in with a good understanding of networking such as TCP/IP, subnetting, routing and data center structure will help a lot. I don’t have a ‘networking’ background so I took more time to prepare; students with little or no networking experience should consider spending some extra time studying network fundamentals.
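As a quick way to brush up on the subnetting fundamentals mentioned above, Python’s standard `ipaddress` module is handy for sanity-checking CIDR maths. The VPC range below is a made-up example, not anything from the exam:

```python
import ipaddress

# Split a hypothetical VPC CIDR into four equal /26 subnets.
vpc = ipaddress.ip_network("10.0.0.0/24")
subnets = list(vpc.subnets(new_prefix=26))

for s in subnets:
    # num_addresses counts every address in the block; AWS reserves
    # the first four IPs and the last IP in each VPC subnet.
    print(s, "-", s.num_addresses, "addresses,",
          s.num_addresses - 5, "usable in a VPC")
```

Running this shows the four /26 blocks with 64 addresses each, which makes limits like the ELB minimum subnet size much easier to reason about under exam pressure.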

In terms of the technology specifics, I have included a list of the top topics the exam focuses on, along with some tips and key information. The section is limited to roughly the top six areas to keep this article a consumable size.

Study Resources

1. The AWS Certified Advanced Networking Study Guide

You will not often find me recommending the official text books as a number one study resource, simply because I often find them hard to consume and remember. I prefer consolidated information that I can recall on test day. This study guide is specifically focused on test takers, and the authors have done a great job of structuring the information into easily consumable sections, each with its own assessment test.

I recommend taking the video courses mentioned below first while having this book on hand as a reference. Once the video courses are completed, take the assessment test in the book’s introduction to get an idea of your strengths and weaknesses, and use the output as a guide to further research. The practice exams included in the online study tools will also help highlight areas you need to brush up on, and these are by far the closest practice exams to the real test that I found – offering much more of a real-world experience than the exams included with the video courses, for example.

The 138 flash cards included with the online content are also really useful – flash cards are not my usual way of studying, but I would not have passed without this resource, so it is cash well spent.

2. Video courses from A Cloud Guru and Linux Academy

Both of these courses are highly valuable. For those of you that have read my previous article, you know that I’m a huge advocate of both training providers and they both deliver for this certification.

Derek Morgan’s course does a great job of breaking down the concepts – everything from the basics of IPv4 and subnetting through to BGP and MPLS specifics and processes. I found the split between fundamentals and deep dives really helpful. The course helped me understand the specifics and identify the ‘right’ choice on questions where you need to reason out the answer.

Ryan Kroonenburg’s course is also outstanding. It helped me ‘get it’ and fit everything together. I loved the flow and structure, and the focus on traffic flow within each subject area, as this is something the exam really focuses on (BGP path selection and how to influence it, etc.).

3. Blogs and Articles

Yujun Liang’s article on LinkedIn:
Jady Liu’s article on LinkedIn:
Michael Kelly’s blog:

4. AWS Re:Invent Videos

Jady Liu’s list is all you need:

In summary, get the book to maximise your chances of passing with the best score.

Top Topics, Tips and Key Information

Notes covering what I feel are the top 6 focus areas for the exam.

Direct Connect (DX) and Border Gateway Protocol (BGP)

  • By far the most focused topic of the exam
  • DX allows you to connect your AWS resources to your on-premises resources privately
  • DX is typically more consistent and reliable than a normal internet connection
  • AWS provides 1 Gbps or 10 Gbps single-mode fibre-optic Ethernet connections
    • Sub-1 Gbps connections can be ordered through a partner (min 50 Mbps)
  • A Direct Connect location connects you to its associated AWS region
  • Supports both IPv4 and IPv6
  • Reduced data-out rates. Data into AWS is free (in almost all cases)
  • A virtual interface (VIF) is needed for each VPC and is attached to the Direct Connect connection
    • Public VIF: Used to connect to AWS resources not in a VPC
      • Used for a VPN to a VGW
    • Private VIF: Used to connect to resources within a VPC
  • One LOA-CFA per connection per data centre
    • LOA-CFA = Letter of Authorization and Connecting Facility Assignment
  • LAGs = Link Aggregation Groups
  • 100 BGP prefixes can be advertised over a single private VIF (hard limit)
  • The S3 VPC endpoint cannot be accessed over DX
    • A public VIF is used to access S3 over Direct Connect (but not the endpoint)

DX Requirements

  • BGP
  • BGP MD5 auth
  • Single-mode fibre: 1000BASE-LX or 10GBASE-LR with 802.1Q VLANs
  • Auto-negotiation must be disabled on the Direct Connect port
  • You cannot change the port speed of an existing connection
  • Limit on BGP (dynamic) advertised routes per route table is 100
    • Static route limit is 50 (convert static routes to dynamic/BGP to increase the number of possible routes)
  • Lowest bandwidth on DX partners is 50 Mbps

Elastic Load Balancers

  • Allows you to distribute application traffic across multiple EC2 instances
  • Can distribute traffic over multiple availability zones
    • Cross-zone load balancing MUST be enabled
  • Two network configurations of ELBs:
    • External: Routes traffic from the internet to EC2 instances
    • Internal: Routes traffic from internal resources to EC2 instances
  • Minimum IPv4 subnet size of /27, which differs from the VPC minimum subnet size of /28
  • Cannot use AWS WAF with Classic ELB
  • Terminate SSL on ELB for performance – be aware of requirements for end-to-end encryption
  • The X-Forwarded-For header is needed to see the client IP in access logs – ALB
  • Proxy protocol to enable connection information (including client IP) when using TCP or SSL for both front and back end on ELB Classic
  • Use a Route 53 alias record to point to an ELB
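Since the X-Forwarded-For header comes up on the exam, here is a minimal sketch of how a backend would recover the original client IP behind a load balancer. The header name is the real one an ALB adds; the parsing function and sample addresses are my own illustration:

```python
def client_ip_from_xff(xff_header: str) -> str:
    """Return the original client IP from an X-Forwarded-For header.

    Each proxy hop appends its address, so the header looks like
    "client, proxy1, proxy2" and the left-most entry is the client.
    """
    return xff_header.split(",")[0].strip()

# 203.0.113.7 is the client; 10.0.1.25 is an intermediate hop.
print(client_ip_from_xff("203.0.113.7, 10.0.1.25"))  # → 203.0.113.7
```

The same idea is why Proxy Protocol exists for Classic ELB with TCP/SSL listeners: there is no HTTP header to carry the client address, so the connection information is prepended at the TCP level instead.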

Virtual Private Networks (VPN)

  • Site-to-site only using AWS VPN
  • Client-to-site would be third party software running on EC2 in a VPC
  • IPsec with Encapsulating Security Payload (ESP)
  • IP protocol 50 (ESP) and UDP port 500 (IKE) for IPsec
  • Benefits:
    • Data encryption in transit across the internet and direct connect
    • Used to encrypt direct connect (use Public VIF for VPN termination)
  • Use monitoring software (keep alive) to keep tunnel up
  • Routing hard limit of 50 for static routes and 100 for dynamic routes (BGP)
  • VPN connection consists of two tunnels (configure to a single customer router for HA on the AWS end)
  • HA on the customer end requires two VPN connections (each provides two tunnels for mesh HA)


Route 53

  • Route 53 is Amazon’s DNS service
  • Allows registration of domain names or use of domain names you own
  • Utilises health checks to monitor health of your instances
  • Public or private hosted zones
    • A public hosted zone is accessible from the internet
    • A hosted zone is named after a domain name that you own
    • A private hosted zone can be any domain you wish as it does not traverse the public internet
  • Record specific information:
    • CNAME: Queries are not free; can point to records hosted anywhere
    • ALIAS: No charge for queries; points to AWS resources only; works at the zone apex
  • To ensure name servers remain consistent across domains create a Reusable Delegation Set (through the CLI or API)
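To make the alias record concrete, below is a sketch of the change-batch payload you would pass to Route 53’s `change_resource_record_sets` API. The structure follows the real API, but the record name, ELB DNS name and hosted zone IDs are placeholders, not real resources:

```python
# Build (but do not send) a Route 53 alias record change request.
def alias_change_batch(record_name, elb_dns_name, elb_hosted_zone_id):
    return {
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": record_name,
                "Type": "A",  # alias records use A/AAAA, not CNAME
                "AliasTarget": {
                    # Hosted zone of the ELB, not of your own domain.
                    "HostedZoneId": elb_hosted_zone_id,
                    "DNSName": elb_dns_name,
                    "EvaluateTargetHealth": False,
                },
            },
        }]
    }

batch = alias_change_batch("www.example.com.",
                           "my-elb-123.eu-west-1.elb.amazonaws.com.",
                           "Z32O12XQLNTSW2")
```

Note the record type is A with an AliasTarget rather than CNAME, which is what allows an alias to sit at the zone apex and be queried free of charge.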

Elastic Network Interfaces (ENI)

  • You can associate multiple IPs to each network interface
    • Beware of instance specific limitations
  • An ENI can have IPv6 addresses if the VPC has IPv6 enabled
  • ENI can be moved between subnets but not AZs
    • Can be a good way of migrating network configurations where required
  • Attaching two ENIs to the same instance in the same subnet can cause networking issues
    • Use multiple IPs on the primary NIC, if required


The exam has a reputation for being the most difficult of the AWS certifications and it necessitates a good understanding of general networking with specific focuses on connectivity, routing, performance and troubleshooting. I managed to pass on my first attempt with a score of 75%, which is not my highest score and demonstrates the challenge, especially considering I spent more time preparing than I did for either the SA or DevOps certification.

I personally really enjoyed the experience and have learned a lot of practical and usable skills and experience that will help me succeed professionally. I hope this article has been useful, good luck with the exam!

Passing the GCP Data Engineer exam

In this article I share my experience studying for and passing the Google Cloud Certified – Professional Data Engineer exam.

Intro and Exam Summary

In the feedback to the Cloud Architect article, you asked for more technical detail, so I have adapted the style of this article to include more focused guidance. To keep the content concise, I have used bullet points where possible in a ‘top 5’ style for each topic.

The Data Engineer certification is one of the toughest exams I have taken, but one of the most enjoyable learning experiences. To help bridge the gaps in my Big Data experience, I signed up with Qwiklabs and found the scenarios really helpful. Codelabs offer similar challenges that can be used in conjunction with your free tier account as a cost-free alternative.

Case studies didn’t feature as much as in the Cloud Architect exam. Machine Learning was the most heavily featured topic so pay particular attention to ML in your preparation.

Other exam topics to be aware of include the Apache Hadoop ecosystem: make sure you’re familiar with Hive, Pig, Spark and MapReduce, and with how to migrate from HDFS to Google Cloud (Cloud Storage). Dataproc is the managed service for the Hadoop ecosystem, so map scenarios involving existing Hadoop workloads to Dataproc.

Don’t forget exam strategy – narrow down to the least wrong answer for questions you’re unsure of and mark items for review rather than spending too much time contemplating. Time wasn’t too much of an issue though and I had 10 minutes on the clock when I hit the submit button.

Case Studies


  • Existing workloads
  • Map solutions to Pub/Sub > Dataproc (Dataproc as they have an existing Hadoop solution)
  • Storage = Cloud Storage, Bigtable, BigQuery


  • Greenfield / no existing infra (Google recommends Dataflow over Dataproc for greenfield data processing)
  • Long-term data storage in Cloud Storage or BigQuery (depending on the question e.g. analytics)
  • Grant access to data = IAM roles


Data Engineering Concepts

Think of data as a tangible object to be collected, stored and processed, with a life cycle running from initial collection to final visualisation. You need to be familiar with the steps, which GCP services should be used, and how they link together.

Recommended reading:

Database types

Exam expectations for database types

  • Understand the differences between database types
  • Know which database matches which description
  • Example: you need a database with high throughput where ACID compliance is not necessary; choose three possible options

What is streaming data?

  • ‘Unbounded’ data
    • Batch data is ‘Bounded’ data
    • These terms are likely to be on the exam
  • Always flowing, never completes, infinite
  • Examples: Traffic or weather sensors

Machine Learning

Machine learning is the process of combining inputs to produce useful predictions on never-before-seen data – essentially a machine learning from one or more datasets to make predictions on future data. TensorFlow is an open source library for numerical computation that makes machine learning faster and easier.

Machine learning and TensorFlow featured heavily on the exam and I recommend reading up on these topics. Make sure you understand the difference between supervised (regression and classification) and unsupervised (clustering) learning, hyperparameter tuning, feature engineering, underfitting and overfitting. I recommend the Linux Academy course and Qwiklabs for conceptual and hands-on experience respectively.

Recommended read:


IAM

IAM comes up in relation to various services on the exam. Research the use of service accounts as well as the predefined roles for each service.

Resources that helped me prepare for the exam

Online self-paced training:

  • Linux Academy – Matthew Ulasien’s course on Linux Academy is great
  • Qwiklabs – Absolute must for me. Without the hands on experience from running through these labs, I wouldn’t have passed the exam


Product Specific Guidance

Cloud SQL

  • Managed MySQL/PostgreSQL database service (not NoOps)
  • Runs on top of Compute Engine
  • Read replicas are restricted to the same region
  • Disk size limited to 10TB

Cloud Spanner

  • Fully managed, highly scalable/available, relational database
  • Google describes Spanner as NewSQL (there are adaptations compared to other relational databases)
  • Consider Spanner for workloads higher than Cloud SQL can support, bearing those adaptations in mind
  • Horizontal scale with strong replication consistency
  • Research interleaved tables

Cloud Datastore

  • Fully managed NoOps NoSQL database
  • Highly scalable with automatic scaling and sharding and multi-region capability
  • ACID transactions
  • Single Datastore database per project
  • Research exploding indexes

Cloud Bigtable

  • High performance, massively scalable NoSQL database
  • Recommended minimum data size of 1TB (for cost effectiveness)
  • Not a NoOps solution (instance management is a factor)
  • High throughput analytics and huge datasets
  • Research row key and column structure and how to avoid hotspotting
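One common tactic for the hotspotting point above is salting a sequential row key so that writes spread across nodes instead of piling onto one. This is a conceptual sketch in plain Python (field promotion and key reversal are other common tactics), not Bigtable client code:

```python
# Salt a sequential ID into one of N buckets so that consecutive
# writes land on different parts of the key space.
def salted_key(user_id: int, buckets: int = 4) -> str:
    salt = user_id % buckets  # deterministic, so readers can compute it too
    return f"{salt}#{user_id}"

print(salted_key(1000))  # → 0#1000
print(salted_key(1001))  # → 1#1001
```

The trade-off is that a range scan over the original IDs now needs one scan per salt bucket, which is exactly the kind of nuance the exam likes to probe.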

Recommended reads:


BigQuery

  • Fully managed (NoOps) data warehouse (think analytics when thinking BigQuery)
  • Autoscaling to petabyte range datasets
  • Use case for store and analyse
  • Query with standard and legacy SQL
  • Extremely fast read performance, poor write (update) performance

Remember the colours of the query performance (query plan) stages for the exam – you may get a question that doesn’t spell out that purple means read, for example.

BigQuery featured heavily on the exam, as did query optimisations such as partitioned tables, views and caching.

Cloud Pub/Sub

  • Global-scale messaging buffer/decoupler, comparable to Apache Kafka
  • Guaranteed at-least-once delivery
  • Pub/Sub does not guarantee messages will be delivered in order
  • Push subscriptions must be webhook endpoints that accept POST over HTTPS; the default is pull
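The at-least-once and no-ordering guarantees above mean subscribers must tolerate duplicates and out-of-order delivery. A common defence is idempotent processing keyed on a message ID; here is a minimal local simulation of that idea (not the Pub/Sub client library):

```python
# Simulate a subscriber that deduplicates redelivered messages
# by tracking message IDs it has already processed.
def process(messages):
    seen = set()
    results = []
    for msg_id, payload in messages:
        if msg_id in seen:
            continue  # duplicate redelivery: safe to skip
        seen.add(msg_id)
        results.append(payload)
    return results

# Message "a" is redelivered by the broker, as at-least-once allows.
delivered = [("a", "order-1"), ("b", "order-2"),
             ("a", "order-1"), ("c", "order-3")]
print(process(delivered))  # → ['order-1', 'order-2', 'order-3']
```

If strict ordering matters in a scenario question, the expected answer usually involves attaching timestamps or sequence numbers and reordering downstream (e.g. in Dataflow), not expecting Pub/Sub to do it.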

Cloud Dataproc

  • Managed service for the Hadoop ecosystem (customers migrating existing workloads)
  • Managed, but not NoOps – you configure the cluster; there is no auto-scaling
  • Migrate HDFS storage to Cloud Storage
  • You can only change the number of workers/preemptible instances (you cannot change the instance type of an existing cluster)
  • Use pre-emptible VMs for cost saving

Cloud Dataflow

  • Google’s recommended solution (over Dataproc) for greenfield data processing workloads, built on Apache Beam
  • Truly NoOps data processing for streaming and batch workloads
  • Use with Cloud ML for machine learning (not Spark ML which would map to Dataproc)
  • Know windows, watermarks, triggers, max workers, and PCollections
  • When to use windows (global, fixed, sliding) tripped me up on exam day
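Since windowing tripped me up, here is the core idea of fixed (tumbling) windows reduced to plain Python: every event timestamp maps to exactly one window of the given size. This is a conceptual sketch, not Apache Beam code:

```python
from collections import defaultdict

# Assign (timestamp, value) events to fixed windows of window_size seconds.
def fixed_windows(events, window_size):
    windows = defaultdict(list)
    for ts, value in events:
        # Integer division snaps the timestamp to its window's start.
        window_start = (ts // window_size) * window_size
        windows[window_start].append(value)
    return dict(windows)

events = [(1, "a"), (4, "b"), (62, "c"), (119, "d"), (121, "e")]
print(fixed_windows(events, 60))
# → {0: ['a', 'b'], 60: ['c', 'd'], 120: ['e']}
```

Sliding windows differ in that they overlap (one event can fall into several windows), and global windows put everything into one window, relying on triggers to emit results – a useful mental model for the “which window type?” questions.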

Recommended reads:

Cloud ML Engine

  • Fully managed TensorFlow platform
  • Scales to multiple CPU/GPU/TPU workloads
  • Automates the platform elements of machine learning
  • Currently only runs TensorFlow


Cloud Dataprep

  • Partnered with Trifacta for a data cleaning/processing service
  • Supported file types:
    • Input – CSV, JSON (including nested), Plain text, Excel, LOG, TSV, and Avro
    • Output – CSV, JSON, Avro, BigQuery table
    • CSV/JSON can be compressed or uncompressed

Data Studio

  • Easy to use data visualization and dashboards
  • Part of G Suite, not Google Cloud
  • Two types of cache – query and pre-fetch
    • Exam questions around caching of data resulting in issues pulling data from BigQuery etc
  • Free to use!


Cloud Datalab

  • Interactive tool for exploring and visualizing data in a Notebook format
  • Built on Jupyter (formerly iPython)
  • Visual analysis of data in BigQuery, ML Engine, Compute Engine, Cloud Storage, and Stackdriver
  • Underpinned by Compute Engine (Datalab launches on an instance)

I wish you good luck with the exam. I hope this write up helps with your preparations. As always, get in touch if you would like any more specific advice or to talk tech in general!

Passing the Google Cloud Certified – Professional Cloud Architect exam

Having recently passed the Google Cloud Certified – Professional Cloud Architect exam, I want to share my experience with you along with some relevant resources and content that helped me along the way.


Get Google certified! The Cloud Architect exam is tough, but this article provides some tips, tricks and resources to help you succeed. The two main things to be aware of are the increased focus on big data (compared to other cloud architect exams I have taken) and the subtle wording in the questions that must be considered to choose the right answer from more than one technically suitable solution. If you have the time, read on.

Why should you take the Professional Cloud Architect exam?

Aside from the fact that it is a great personal development opportunity that comes with prestige and demonstrates in-demand knowledge of one of the fastest growing public cloud platforms, I think you will find Google Cloud Platform a joy to use and learn. The use of projects in an organisation, for example, is a neat feature that helps organise resources, delegate access permissions and control costs.

The Professional Cloud Architect certification demonstrates that you have a thorough understanding of cloud architecture and GCP, and that you can design and manage robust, secure, scalable and highly available solutions. Sounds cool, right?!

The exam covers all the main topics from networking, identity and access management (IAM) through to big data services and concepts such as the software development lifecycle. It focuses on the most appropriate solution for a given scenario such as ‘the most cost effective’ or ‘the least downtime.’ Some of this may sound daunting but don’t worry! I have you covered with training and preparation recommendations.

If all that isn’t enough, when you pass the exam you get the choice of a free hoodie, sweater or backpack from the Google certified online store!

What are the exam requirements?

Per the official exam guide:

  • Design and plan a cloud solution architecture
  • Manage and provision the cloud solution infrastructure
  • Design for security and compliance
  • Analyse and optimize technical and business processes
  • Manage implementations of cloud architecture
  • Ensure solution and operations reliability

How is the exam in practice?

Tricky to say the least! The exam did a good job of testing all topics from the official guide. There was a focus on big data topics in my exam, which is one of my weaker subjects. Thank you to Matthew Ulasien and Linux Academy for the great course that prepared me with enough knowledge to get through these sections.

Make sure you read through the case studies ahead of time. I found that already being familiar with the cases made me much more comfortable when answering those questions and inevitably saved valuable time. There are four possible case studies and I experienced questions from three of them. They are published by Google here:

Tip: TerramEarth is a beast!

For those of you who have taken an AWS Professional-level certification – one of the stress factors on those exams is that time is tight. The GCP exam has a smaller allotment of 50 questions, and I found the 2 hours left me with 15 minutes at the end to go over marked questions.

Be aware of the wording such as “the most cost effective” or “next year they plan to” as these remarks shape the best answer for a given scenario out of more than one technically correct answer.

Resources that helped me prepare for the exam

Online self-paced training:

  • Linux Academy – Matthew Ulasien’s 3 part course on Linux Academy was my ‘go to’ resource in preparing for the exam; it covers everything you need to know from cloud computing basics through to the more advanced GCP topics
  • Coursera – The GCP for AWS Professionals course is a good high-level course to get AWS professionals familiar with GCP – for me it gave a good overview but was not comprehensive enough for the exam, and it is not required if taking all 3 parts of the Linux Academy course

Documentation and other reference resources:

  • GCP in 4 words or less – great for getting a 4 word or less description of all GCP services
  • GCP Solutions – Google resource detailing the right solutions to help you solve business challenges
  • – Google maintained resource containing lots of useful flow charts and diagrams

Other blogs (that also have lots of useful resources linked):

Exam techniques and preparation

Remembering two key things always helps me answer the trickiest questions on multiple-choice exams: narrowing down to the least wrong answer for questions where I really don’t know the ‘right’ answer, and reading and re-reading questions to weed out the nuances mentioned earlier in this article. Make sure you take note of the “most cost effective” type questions as that style is particularly evident in this exam.

Some questions to ask yourself before taking the exam:

  • Can you explain how to build a flow for an ETL data set and use the correct GCP services? How does that differ for streaming and batch jobs?
  • Could you have a conversation with another GCP architect about the various different services and explain their use case?
  • Can you design an auto-scaling and fault-tolerant solution using the different compute services that GCP offers? App Engine, Compute Engine, Container Engine?
  • Do you know how to migrate data from an on premises data centre to GCP? How about from another public cloud provider?

I wish you good luck with the exam. I hope this write up helps with your preparations. As always, get in touch if you would like any more specific advice or to talk tech in general!


Tips for studying while raising a family

Being keen to start earning after college, I opted to go directly into employment rather than university. The decision was helped by a good job offer – the sort of opportunity I would have hoped to gain with a university degree – an IT generalist position with the Derby Telegraph. It was a good career choice but my ambitions for further study remained, so after putting it aside in my twenties, I enrolled on the Computer Science pathway of The Open University’s BSc Computing & IT degree.

When I started studying, I never dreamt it would see me through three house moves, the birth of two children and getting married. Studying while being a full-time employed father and husband was (and continues to be) hard. Distance learning both helps with and adds to this challenge. The ability to study any time, anywhere is one of the most helpful things, but the study schedule and assignment deadlines remain.

There have been moments of anxiety along the way, and times I’ve sat gazing at a textbook, too tired to absorb any information, thinking how silly I was to be attempting this. Sleepless nights with young sons and busy days juggling a demanding full-time job with bedtime routines add to the fun! Returning to late-night study sessions after all that is a challenge, but the harder moments are survivable and have taught me better ways of planning and new study techniques. The course is really enjoyable and interesting, and some of the pressure is alleviated by working in an industry closely aligned to the subject. Working towards a qualification is doable, enjoyable (most of the time!) and very rewarding!

Here are my tips on juggling studying with a full-time career whilst raising a family…

Include your significant other. Taking on a degree won’t always feel like a part-time commitment and will be equally challenging for your significant other. Late nights, early mornings and time out of weekends (or whatever combination works for you) will be challenging. For me, this has meant missing time with my family on Sundays and sacrificing waking up together in the week as I get up early to study. Communication here is key – and it is important to have the support of your significant other when making study plans.

Plan your studies. Look at the course schedule, add assignment and exam dates to your calendar, and make sure you set aside ample time to prepare for each milestone. [Important] Read the module guides as they include a plethora of useful information – including what is required to get the best marks! This helped me to plan where to spend the most time studying and writing assignments.

Take thorough notes. The stop/start nature of studying whilst raising a family makes it much harder to absorb the content, so I found that taking thorough notes vastly helped me refresh and pick up where I left off. You will find it particularly helpful, and speedier, when it comes to exams and end-of-module assignments – the early modules will seem like a very long time ago!

Find the study time that works best for you. I’m a morning person, so getting up before 5am to get a couple of hours of study before the kids wake up works best for me. There are always those times when an assignment is taking longer than anticipated, or you’ve fallen behind due to sickness or other issues, and you must suck it up and work any spare hour to get things completed. These occurrences should be kept to a minimum to reduce stress and get the best from your studies, and for that I recommend finding the time that works best for you. The Open University’s Brainwave application can help you here: it builds a profile of your performance across the day through five quick and fun games.

Study (and take notes!) on the go. While I’m home based, I often find myself out and about travelling for work and pleasure. All those times sat in an airport lounge, on a train or in the back seat of a cab should be put to good use. The Open University offer most of their materials in one electronic format or another (namely PDF) so they can be downloaded to your smartphone, tablet or e-reader. Don’t forget your favourite note-taking application (mine is OneNote) or a notepad and pen so you don’t miss out on vital notes.

Connect with others. The Open University have student groups associated with each module and I highly recommend using them fully. This not only makes studying for a module much easier through shared understanding and support, but it is a great opportunity to expand your professional network. Personally, I have been lucky enough to benefit from mostly local study groups and classes, which has allowed me to meet up with classmates outside of the scheduled study classes. Fellow students on more dispersed modules have told me of Facebook groups and Google Hangouts in addition to The Open University’s comprehensive forum and Adobe Classroom facilities.

Take a break. The Open University, and many other distance learning providers, pride themselves on being flexible, and I can tell you from personal experience that is definitely the case. I’m currently half way through my degree and I’ve taken other shorter courses with the OU – first starting in 2013. I’m currently taking a year off to enjoy the birth of our second child and will resume studies in the autumn. Breaks can also mean needing more time within a module – an assignment extension, for instance – and the key here is to remain in contact with your tutor and student support team. The important thing to note is that end-of-module assignments and exam dates are generally fixed.

I chose to study with The Open University and have found it to be a great studying experience, backed by knowledgeable tutors and flexible delivery to suit my changing requirements. It is hard work and a decision that should be taken carefully, but if you want to progress or change careers, it gives those who are unable to commit to a full- or part-time brick-and-mortar university the opportunity to study flexibly.

Thank you for reading. I hope these tips help, and please feel free to reach out to me directly or add any of your own tips and experiences in the comments below.


Architecting Microsoft Azure Solutions 70-534 Exam Tips

I recently took and passed Architecting Microsoft Azure Solutions (70-534) and earned Microsoft’s MCSE Cloud Platform and Infrastructure. In this post I have summarised my experience to help you be successful.

I used PluralSight’s 70-534 video training course taught by Orin Thomas to prepare for the exam. The course is very good, but be prepared for some detail-heavy slides; screenshots and your favourite notes tool will help! Having an AWS and Rackspace Cloud background, I found making the logical switch to Azure was like driving a new car – switching the window wipers when trying to indicate. Everything is similar but not quite the same. It took me a couple of days to get used to, so I advise building some extra time into your studies if you are making the switch from another platform.

UPDATE: On 20/02/2017 Linux Academy released their 70-534 course, which I have reviewed and highly recommend. It follows their usual high standard with hands on labs and great documentation.

The exam is structured in a familiar Microsoft format with a mixture of multiple choice, drag and drop (what is the correct order of doing x or y from a list of options), drop-down selections (complete this code snippet by choosing the correct commands etc.) and scenario-based questions, with a time allotment of 2 hours 30 minutes. I finished with about 45 minutes remaining. There were 65 questions: the exam started with a scenario, followed by 35-40 standalone questions (a mix of the aforementioned styles), and completed with two more scenarios (three scenarios in total). The scenarios ranged from 6 to 10 questions and are made up of a company use case with high-level requirements, business requirements and technical requirements, and some are accompanied by a network diagram. I personally find these the hardest questions to answer because they can be wordy and the answer to any question can come from one or more sections – make sure you take the time to read and understand the scenario.

Thoughts and guidance:

  • When you sign up for the exam I recommend taking the option to include access to the Measure Up practice test – great for getting used to the exam structure and content.
  • Understand the limitations and implementation process (and order) of ExpressRoute
  • Understand VPNs and their implementation method (tip: what services require a point-to-site VPN? – this one didn’t come naturally to me)
  • When learning about authentication, pay particular attention to the order in which things are configured (federation, OAuth, AD etc.) – I got a lot of ‘place this in order’ type questions relating to authentication
  • Understand the different queues and when to use each
  • When is DPM used, when is it not used?
  • Get familiar with the CLI and PowerShell – prepare for ‘complete this code snippet’ questions

I highly recommend reading all of the official exam guidance from Microsoft as this gives a good feel for the key focus areas. Also be aware that the exam was updated in Q4 of 2016, which leaves some training sources lagging slightly behind with their content. The training courses mentioned in this article are up to date.

In summary, the exam is difficult and covers a very wide range of topics, but I personally found there was ample time to read through and answer the questions without having to rush. There was also time to go back and review any marked questions. Be aware that with marking for review, questions have to be reviewed and submitted at the end of each section before moving on, and once a section has been submitted, it is final. Also be aware there are questions in sequence that form the basis of one overall answer, and these questions cannot be revisited or marked for review.

Good Luck and, as always, if there is anything I can do to assist – get in touch!

Some Useful Links:

Linux Academy’s course:

PluralSight’s course:


Microsoft’s Cloud Design Patterns:

AWS Professional Certification Guide

Following on from the earlier post covering my experiences taking the AWS associate level certifications, this post covers preparing for and taking the AWS Professional level certifications.

Given that I started studying for these certifications coming up to the Christmas break and there were no suitable exam slots until late January, I decided to study for both the SA and DevOps Pro before sitting either exam. It proved a good decision given the significant content overlap, and I felt more confident going in to the first pro-level exam. The task was daunting: six weeks of preparation time for both exams, and towards the end I felt the pace was pushing too hard (if I repeated it, I would give myself at least six weeks per exam). Overall preparation time is relative to the amount of previous experience you have in the subject area and the amount of time you can invest in a given period. I managed an intense amount of study per day, which was made easier by a combination of public holidays and planned time off from my day job.

It is a huge relief to be on the better side of the pro-level certifications, something that was far from certain going in to the process. My preparation focused on the same techniques as at the associate level: firstly, video training from Linux Academy; secondly, AWS documentation (some of the industry’s best documentation); and finally (most importantly) hands-on practice with AWS (lots of this at the pro level, along with some real-world experience if possible).

AWS Solution Architect – Professional

As mentioned before, I feel the SA certification has the most wide-ranging content, which makes for the most daunting preparation. That being said, I didn’t find it the more difficult of the two. Linux Academy offers the most comprehensive video training, with a high standard of content backed by labs and additional tailored documentation, which made preparation that much easier than it would have been otherwise. I also took a second course, which is great at focusing on the exam specifics (take both courses if possible).

One of the key things to be aware of when it comes to the exam is the time available – it’s tight! There are approximately 80 questions (I had 77) which have to be completed in 2 hours 50 minutes (170 minutes). The questions are wordy and have multiple theoretically correct answers; the key is to look for what the question is asking for in terms of technologies used and best practices. Reading the questions and deciding on the right answer took me a surprisingly long time. Rather than trying to keep an eye on the number of questions answered vs. the amount of time remaining, I set myself a 2-minute rule for each question, and for the questions that took too long I gave my first/best guess.

My thoughts on preparing for and taking the exam:

  • Pay particular attention to ElasticBeanstalk and OpsWorks. When do they work together? What are the different deployment types? How do you deploy and rollback? What languages are supported?
  • Get familiar with EC2 instance types. There are a lot of EC2 design related questions and knowing which instance to use in which scenario is essential.
  • Understand connectivity, how each type is setup, how it works, routing, propagation (VPNs, VPCs, VPC Peering, DirectConnect)
  • Understand how to optimise EC2 storage performance, when different instance types are beneficial and how to optimise EBS performance.
  • Do you know when to use different caching engines? When would you choose Memcache and when might you prefer Redis?
  • How do you loosely couple services? Do you know how, when, and why to use SQS and SNS? Do you know the limitations and when it is not appropriate to use one or the other of these services?
  • Understand when and how to use AssumeRole,  AssumeRoleWithSAML and AssumeRoleWithWebIdentity.
  • Make sure you understand consolidated billing, how to set it up and what it offers. I picked up a few easy marks on this topic.
  • All of the training material, documentation and hands on practice is important and this list isn’t an overall guide to the exam but some pointers that would have been useful to know prior to taking my exam.
  • I found the official practice test for the SA pro exam to be highly misleading, poorly worded and generally I felt that it did me more harm than good when preparing for this certification. AWS really need to get the practice test updated. I recommend using the tests provided by Linux Academy.
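On the loose-coupling point above: the value of SQS is that producers and consumers share only a queue, never direct references to each other. A minimal sketch using Python’s stdlib `queue` as a stand-in for SQS (the comments map each step to its SQS equivalent; none of this is the boto3 API, and all names are illustrative):

```python
import queue
import threading

# A stand-in for an SQS queue: producer and consumer share only this handle,
# not references to each other -- that indirection is the loose coupling.
task_queue = queue.Queue()
results = []

def producer(n: int) -> None:
    for i in range(n):
        task_queue.put(f"order-{i}")     # like sending a message to SQS

def consumer() -> None:
    while True:
        msg = task_queue.get()           # like receiving a message from SQS
        if msg is None:                  # sentinel: no more work
            break
        results.append(msg.upper())      # process, then "delete" the message
        task_queue.task_done()

t = threading.Thread(target=consumer)
t.start()
producer(3)
task_queue.put(None)                     # tell the consumer to stop
t.join()
print(results)                           # ['ORDER-0', 'ORDER-1', 'ORDER-2']
```

Swap the in-process queue for SQS and either side can scale, fail or be redeployed independently – the queue absorbs the difference in pace.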

AWS DevOps Engineer – Professional

The DevOps experience was the most varied for me. Studying for the exam was one of the most enjoyable experiences and I learned a lot about OpsWorks, CloudWatch, Auto Scaling (lifecycle hooks, self healing) and the various APIs; however, the exam was by far the toughest of all. Contrary to some of the articles/blogs I’d read before sitting the exam, I found it by far the most challenging on time. I had exactly 80 questions to cover in 2 hours 50 minutes. The questions felt as wordy as the SA Pro exam and took me longer to answer – often exceeding my 2-minute rule. The situation got so bad that by question 35 I had to skim-read and answer the next 5-6 questions to catch up on time. Definitely be wary of time with the DevOps exam!

In terms of my preparation, both Linux Academy and the other video training were equally valuable. The key for me, even more so than in the other certifications, was the AWS documentation and hands-on practice – a huge help in developing my understanding. I recommend researching the topics in depth; I naturally found myself doing this more than in the other certifications because I felt weaker on some of the subject areas and really wanted to familiarise myself with the CLI, API and SDKs.

My thoughts on preparing for and taking the exam:

  • CloudFormation is one of the main topics in this exam. Understand template structure, intrinsic functions, WaitConditions, Helper Scripts, Stack, Update and Deletion Policies – I strongly recommend hands on familiarity along with theory.
  • OpsWorks and ElasticBeanstalk are also covered in detail. Understand OpsWorks auto healing, stacks, layers, lifecycle events, instances, and EB ebextensions, use cases, SDKs, supported languages.
  • As in the SysOps certification, DevOps builds on CloudWatch so learn about metrics, logging and monitoring.
  • AutoScaling is also covered at an advanced level. Learn about lifecycle hooks, termination policies and API, CLI and SDK calls etc.
  • Deployment strategies. Blue/Green or A/B, All at once, Immutable, Rolling.
  • The official practice exam for the DevOps pro exam is much better than the SA pro and is relevant for exam prep.
  • In summary, study the theory, but there is serious benefit in putting it into action with hands-on practice.
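To make the CloudFormation bullet concrete, here is a minimal, hypothetical template showing the pieces mentioned above: a Parameter, a Resource with a DeletionPolicy, the !Sub and !GetAtt intrinsic functions, and an Output (bucket and parameter names are illustrative only):

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: Minimal sketch - one parameter, one resource, one output
Parameters:
  BucketPrefix:
    Type: String
    Default: demo
Resources:
  DemoBucket:
    Type: AWS::S3::Bucket
    DeletionPolicy: Retain          # bucket survives stack deletion
    Properties:
      BucketName: !Sub '${BucketPrefix}-${AWS::AccountId}'
Outputs:
  BucketArn:
    Value: !GetAtt DemoBucket.Arn
```

Deploying, updating and deleting a small stack like this by hand teaches more about stack policies and rollbacks than any amount of reading.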


The AWS Professional level certifications are a big step up from the associate level and do an excellent job of testing true understanding and hands-on abilities. Time is one of the most challenging aspects of both exams, particularly the DevOps exam. If choosing to take the official practice exams, be wary of the SA Pro exam, but I do recommend the DevOps Pro practice exam. The practice exams from Linux Academy are by far the best that I found during my pro-level studies.

Final thoughts:

  • Always read the exam blueprint and AWS exam guidance. This is easy to overlook but provides great context, detail on expectations and generally gets you into the mindset of what AWS are assessing with the certification exams.
  • Learn to read fast! What I mean here is: concentrate on exam strategy and timing. When taking the Linux Academy practice exams, allow yourself no longer than 2 minutes per question and prepare yourself for 2 hours 50 minutes of the exam’s heavy read-consider-answer cycle.
  • Practice and practice some more. At the associate level, you could get through on theory alone, but I strongly believe that is not the case at the professional level.

In addition to reading my blog, read blogs by Adrian Cantrill, Nick Triantafillou and Stephen Wilding (all linked here and below) which I found were a huge help during my preparation.

If you feel that I can help your studies in any way – get in touch! Good luck!



Official Exam Links

** Unlike with the associate certs, I won’t link specific documentation here as there is a huge amount of content and particular focus areas will depend on your current skill level and experience **

AWS Associate Level Certification Guide

I’m happy to report that I now hold all three AWS associate level certifications and have written up my experiences to help you on your AWS certification journey. This guide covers the exam topics, resources used to prepare and my experience on exam day. 

I began my certification journey taking an instructor-led, remotely delivered (WebEx) version of the official Architecting on AWS course, which is designed to prepare students for the AWS Solution Architect – Associate certification exam. The course was run by QA and served as a great introduction to AWS certification. On reflection, the course contained all of the content and labs required to pass the exam; however, wanting to be extra prepared for the first exam, I also purchased an all-five-certification video bundle.

The three associate certifications provide a solid base and I found preparing for the exams manageable. I allowed up to two weeks to prepare for each and used a combination of video training, hands-on practice with AWS using a free tier account, and time spent reading the AWS documentation – particularly for topics that require more in-depth understanding.

AWS Solutions Architect – Associate 

The instructor-led training gave me a good foundation, which gave me the confidence to book the exam immediately after taking the course – giving myself another week to review the content. I spent a lot of that time running through the basics hands on – creating VPCs, EC2 instances, Security Groups, NAT gateways, ACLs etc. using both the GUI and the CLI.

I feel that the Solution Architect certification has the most wide ranging content, which made it the most daunting to prepare for (not helped by it being my first experience of AWS certification!) 

My thoughts and guidance after taking the exam:

  • Pay particular attention to VPC, IAM, Route 53 and S3
  • Don’t skip any topics on the exam guide, but I got the highest number of questions on the areas mentioned. Particularly:
    • Route 53 record types and appropriate usage (set these up, play, create health checks, understand the different record types etc.)
    • Process to create a VPC, difference between a NAT instance and a NAT gateway etc. (again, the best way to know this is to do it a few times)
    • When to use IAM roles, users, groups etc. (tip: always use roles where possible, particularly for EC2 instances)
      • AWS recently enabled roles to be added to EC2 instances that are already online, this isn’t reflected in the exam yet
    • S3 storage types and appropriate use cases, difference between durability and availability (pay particular attention to the wording as the stats are different for each)
  • Learn how to calculate DynamoDB provisioned throughput
    • Tip: for reads, the formula is (ITEM SIZE rounded up to the next 4 KB multiple / 4 KB) * number of items per second
    • Tip: for writes, the formula is (ITEM SIZE rounded up to the next 1 KB multiple / 1 KB) * number of items per second
    • I personally got 1 or 2 of these in the SA exam but more in the Developer exam (more on that later)
  • If you only read one piece of documentation, make it the ‘AWS Well-Architected Framework’ whitepaper (link below). This document introduces the five pillars of the well-architected framework, will help develop your approach to architecting solutions, and will greatly help (exam strategy here) with eliminating the obviously incorrect answers on the certification exams
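The two throughput formulas above can be sanity-checked with a few lines of Python (a sketch of the arithmetic only – the function names are mine, and the halving rule for eventually consistent reads is the standard DynamoDB one):

```python
import math

def read_capacity_units(item_size_kb: float, reads_per_second: int,
                        eventually_consistent: bool = False) -> int:
    # One strongly consistent read unit covers up to 4 KB per item per second;
    # round the item size up to the next 4 KB boundary first.
    units = math.ceil(item_size_kb / 4) * reads_per_second
    # Eventually consistent reads cost half as much (rounded up).
    return math.ceil(units / 2) if eventually_consistent else units

def write_capacity_units(item_size_kb: float, writes_per_second: int) -> int:
    # One write unit covers up to 1 KB per item per second.
    return math.ceil(item_size_kb) * writes_per_second

# 10 reads/second of 6 KB items: ceil(6/4) = 2 units each -> 20 RCUs
print(read_capacity_units(6, 10))       # 20
# 10 writes/second of 1.5 KB items: ceil(1.5) = 2 units each -> 20 WCUs
print(write_capacity_units(1.5, 10))    # 20
```

Working a few of these by hand (odd item sizes, eventually consistent reads) is the quickest way to bank the easy marks mentioned above.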

AWS Developer – Associate

Many exam takers, blogs and even training providers indicate that the developer associate certification is the easiest of the associate level certifications; my experience was quite the opposite, and I found the developer exam the hardest of the first three. For those of you that have already taken this exam or do in the future – I am interested to hear your experiences.

To prepare for the exam I gave myself two weeks and used video training as my primary source, along with hands-on practice using my free tier account and a more-than-usual amount of time reading through the AWS documentation, which I found most important for the developer preparation. I’d recommend spinning up an Amazon Linux EC2 instance, getting the AWS CLI set up and interacting with services like S3 using roles / credentials / keys and understanding the differences.

My thoughts and guidance after taking the exam:

  • Know how to interact with the AWS CLI and API particularly common commands for interacting with S3
  • Learn about Simple Notification Service (SNS) particularly the different name/value pairs available in the message body and different notification options
  • How do you approach security in AWS (dev focus)? Important to know IAM roles, access keys, policies etc, S3 encryption, Security Token Service (at a high-level, this is covered in more detail at the professional level), VPC security (security groups (stateful), ACLs (stateless) etc.)
  • I personally had at least 4 DynamoDB provisioned throughput related questions – some easy marks to be gained here (see above)
  • What are S3’s different use cases? Particularly the different URL types for websites vs other objects as well as bucket versioning
  • What are the different deployment types? When and how to use CloudFormation, Elastic Beanstalk etc. and what can and can’t these services do? The focus was more on EB and CF in the developer exam for me (less on OpsWorks, where I saw more questions in the sysops and professional certification exams although this may not be the case for all)
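On the S3 URL-types bullet above: the virtual-hosted object endpoint and the static-website endpoint have different shapes, and the website endpoint is HTTP only. A small sketch of the two formats (region handling is simplified, and some newer regions use a dot rather than a dash before the region in the website endpoint):

```python
def object_url(bucket: str, key: str, region: str = "us-east-1") -> str:
    # Virtual-hosted-style REST endpoint -- serves a single object over HTTPS.
    return f"https://{bucket}.s3.{region}.amazonaws.com/{key}"

def website_url(bucket: str, region: str = "us-east-1") -> str:
    # Static-website endpoint -- HTTP only, honours index/error documents.
    return f"http://{bucket}.s3-website-{region}.amazonaws.com"

print(object_url("my-bucket", "index.html"))
print(website_url("my-bucket"))
```

Being able to tell the two apart at a glance is exactly the kind of detail the exam likes to probe.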

AWS SysOps Administrator – Associate

Last but by no means least, the SysOps admin certification. Out of the three, I was most nervous about the SysOps exam as it is commonly believed to be the most difficult of the associate level certification exams, but this didn’t turn out to be the case for me. I actually found a lot of overlapping content (and concepts) from the Solution Architect and Developer certifications, allowing me to score highest of the first three.

Once again, I gave myself two weeks to prepare for the exam and used video training as the primary material, along with my free tier account and the AWS documentation. The first thing that comes to mind when I think of the SysOps exam is CloudWatch: know how to use it, how to set up metrics and custom metrics, and how different AWS services interact with CloudWatch.

My thoughts and guidance after taking the exam:

  • Understand monitoring and healthchecks, monitoring EBS, RDS, ELB, EC2 etc.
  • What is consolidated billing and how do you set it up? (I got a couple of questions here – easy marks)
  • How do you make a solution elastic and scalable? RDS read replicas, auto scaling, HA for single hosts (auto scaling with min 1, max 1) etc.
  • Backup options within AWS? Snapshots, storing log files etc.
  • How do you build IAM policies and use MFA and what are the compliance options?
  • How are networks scalable? Particularly focus on Route53 (weighted, latency based, geolocation etc.), creating and scaling NAT instances, how to enable VPC flow logs etc.
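To make the weighted routing policy concrete: Route 53 returns each record with probability proportional to its weight. A quick simulation of that arithmetic (not the Route 53 API – the record names and weights here are made up):

```python
import random
from collections import Counter

# Hypothetical weighted record set: weights 3:1 should send ~75% of
# lookups to "blue" -- e.g. a blue/green split keeping 25% on the new fleet.
records = {"blue.example.com": 3, "green.example.com": 1}

def resolve(records: dict, rng: random.Random) -> str:
    # Pick one record with probability proportional to its weight --
    # Route 53 does the equivalent server-side for weighted record sets.
    names, weights = zip(*records.items())
    return rng.choices(names, weights=weights, k=1)[0]

rng = random.Random(42)                  # seeded for repeatability
counts = Counter(resolve(records, rng) for _ in range(10_000))
share = counts["blue.example.com"] / 10_000
print(f"blue share: {share:.2f}")        # close to 0.75
```

The same proportional logic underpins gradual migrations and A/B splits; latency and geolocation policies select on measurement or source location instead of weight.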


Studying for the associate level exams was enjoyable and gave me great insight into the AWS world, as well as equipping me with the knowledge and skills not only to pass the certification exams but to be effective with AWS. At the associate level, I personally found there was enough time in the exam to work through the questions without having to rush, and enough time remaining at the end to review any marked questions.

General thoughts on taking the exams:

  • Always read the exam blueprint (linked from the official AWS certification pages) as this document gives you a complete list of items to study and helps identify areas of personal strength and weakness
  • At the associate level, the video training I started with is a great resource; however, only toward the end did I discover the Linux Academy video series, which is excellent. Where I found the former great at preparing students with the knowledge required to pass the exam, Linux Academy is much better at preparing students not only to pass the exam but to really understand the topics and learn the skills to be effective beyond it. I found Linux Academy essential at the professional level (more on that later)
  • Read the sample questions (also from the official AWS pages) – I personally found that in at least two of my exams, one of these questions popped up (word for word)
  • TIP: Go to the AWS Japanese site for the Solution Architect sample questions and the PDF has the sample question answers in the bottom right of each question
  • I personally opted not to do the official practice exams at associate level so I can’t comment on those
  • Read blogs (linked below and others) – I personally found that reading about the experiences of others helped with my preparation

Finally, if there is anything that I can do to help, any insights, examples, areas you would like to discuss then please do get in touch. Good Luck!



Video Training:


Official Certification Pages:

Most important AWS Documentation (from my experience):