Azure Heavy Hitter Updates

Keeping up with Azure can be a full-time task in itself given the plethora of updates. With this in mind, I thought I would share a couple of updates which, in my opinion, are heavy hitters.

Account Failover for Azure Storage

Many of us use GRS storage as an added safety net, to ensure that data is available in a secondary paired region if the primary region has an outage. The kicker has always been that no SLA exists for this; it’s down to Microsoft to decide when to declare the primary region out and provide access to the replicated data.

Well, that is all about to change with the announcement of ‘Account Failover for Azure Storage’. This means that you are now in control of failing your data over to the secondary region.

A couple of points which are worth noting:

  1. Having data available is only a single layer; think about security, identity and access, networking, virtual machines, PaaS services etc. in your secondary region
  2. Upon failover the storage account becomes LRS; you will need to manually convert it back to RA-GRS to replicate to your original primary region
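For those who want to script this rather than use the portal, the failover is a single Azure CLI command. A minimal sketch, assuming the Azure CLI is installed and logged in to a live subscription; the account and resource group names are placeholders:

```shell
# Check the replication status and last geo-sync time before failing over
az storage account show \
  --name mystorageaccount \
  --resource-group my-rg \
  --expand geoReplicationStats \
  --query "{sku:sku.name, lastSync:geoReplicationStats.lastSyncTime}"

# Initiate the customer-managed failover; the secondary becomes the new primary
az storage account failover \
  --name mystorageaccount \
  --resource-group my-rg

# After failover the account is LRS; convert it back to RA-GRS so it
# replicates towards the original primary region again
az storage account update \
  --name mystorageaccount \
  --resource-group my-rg \
  --sku Standard_RAGRS
```

Bear in mind that anything written to the primary after the reported last sync time will not have replicated, so check that timestamp before pulling the trigger.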

Adaptive Network Hardening in Azure Security Center

‘I really enjoy updating an Access Control List’, said no one ever!

Defining Network Security Groups (NSGs) takes time and effort: either you engage multiple stakeholders to determine traffic flows, or you spend your time buried deep inside Log Analytics.

Microsoft have announced the public preview of Adaptive Network Hardening in Azure Security Center, which learns traffic flows (using machine learning) and provides hardening recommendations for internet-facing virtual machines.

A couple of points which are worth noting:

  1. This should be enabled when virtual machines are deployed, to reduce the risk of rogue traffic
  2. As it says on the tin, this is for internet-facing VMs only. However, I’m sure this will be updated in due course.

Thanks for reading, tune in for the next post.

Resizing Azure Virtual Machines


I’m regularly asked two questions when it comes to Azure virtual machines:

  1. Can I resize a VM to give it more CPU and RAM?
  2. What is the impact on the VM?

We are used to daily operations using on-premises features such as ‘hot add’, which can increase a VM’s RAM, CPU and HDD capacity without downtime, but can we do the same in Azure?

Can I Resize an Azure VM?

The answer is yes, you can, within the same series of VM, e.g. from one ‘A’ size to a larger or smaller ‘A’ size.

When it comes to resizing a VM between different series, the answer is: it depends on whether the resize is to the same hardware or to different hardware, e.g. a change in chipset.

Undertaking a resize operation is a simple procedure: select your VM and then, from the blade, select ‘Size’.

Select your desired size and finally hit ‘Select’. You will then see the notification ‘Resizing virtual machine’.


What’s the Impact to the VM When I Resize It?

The typical impact of resizing a VM is a restart, which can take up to five minutes for the end-to-end operation to complete. Therefore I would suggest using an outage window, and having a known good working backup before you start!

If you are resizing a VM onto new hardware (e.g. a change in chipset), then the VM will need to be powered off before the resize operation can begin.

It’s worth noting that if you are resizing VMs in an Availability Set onto new hardware, then all the VMs in the set need to be powered off for the operation to begin.
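For those who prefer scripting to clicking, the same resize flow can be sketched with the Azure CLI (the resource group, VM name and sizes below are placeholders, and this needs a live subscription):

```shell
# List the sizes available on the VM's current hardware cluster
az vm list-vm-resize-options --resource-group my-rg --name my-vm --output table

# Resize within the same series - this restarts the VM
az vm resize --resource-group my-rg --name my-vm --size Standard_A2

# Resizing onto different hardware (e.g. a chipset change): deallocate first,
# then resize and start the VM again
az vm deallocate --resource-group my-rg --name my-vm
az vm resize --resource-group my-rg --name my-vm --size Standard_D2_v3
az vm start --resource-group my-rg --name my-vm
```

If the size you want doesn’t appear in the resize options list, the VM is sitting on hardware that can’t host it and the deallocate route is the one you need.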

Final Thoughts

Microsoft have clearly made strides to ensure that resizing a VM within Azure is smoother and easier than ever before. However, ensure that you plan for downtime and, perhaps more importantly, that you have a known good working backup before you start resizing VMs.

Architecting Multi-Site HP Storage Solutions (HP0-J67) – Exam Experience

I had been considering taking the ‘Architecting Multi-Site HP Storage Solutions – HP0-J67’ exam for quite some time after reviewing the exam objectives.

I had cut my teeth on P2000 MSAs and StoreVirtuals and had spent quite a bit of ‘one to one’ time with StoreServs recently, so thought I was in a good position to tackle the exam.

Bart Heungens (HP Certified Instructor) over at bitcon.be had written a blog post entitled ‘Are you ready for being a Master ASE Storage Solution Architect?’, which had the download link for the exam’s resource library. This was my go-to study material in preparing for Architecting Multi-Site HP Storage Solutions HP0-J67.

My initial thought when I flicked through the resource library was: yikes! There was so much information to digest that I was never going to get through it in a timely manner. The good news was that I had previously read most of the P2000, StoreVirtual and 3PAR whitepapers when creating designs and installing the products.

So I decided to skim-read the P2000, StoreVirtual and 3PAR whitepapers, noting down information that I had forgotten or thought could be relevant. This was fairly difficult, as the newer versions of LeftHand OS and InForm OS are not covered, so I spent some time going over their release notes and then had to essentially set that newer information aside for the exam. In total I must have spent four hours brushing up on the P2000, StoreVirtual and 3PAR.

I felt pretty comfortable with the HP StoreEasy range, as I use these products in most of the Veeam Backup & Replication designs I propose.

The main area where my knowledge lacked was the HP StoreOnce range; I had literally never read up on them, used one or tried to position one with a customer. So I knew this was my ‘Achilles heel’. But I wanted to be sensible: I knew that I wouldn’t be positioning them with customers, so any information I learned for the exam would be leaving my brain shortly after exiting the exam centre!

I took a pragmatic approach, looking at items that I felt could appear in an exam (if you are an experienced exam veteran, you know what I mean by this), such as block sizes, StoreOnce deduplication sources and size limits on the models. I spent around six hours on this, reading the various whitepapers.

In total I spent around ten hours preparing for the exam. Not what I would recommend, but I was aiming to be the minimally qualified candidate! Designing storage solutions is part of my job (not all of it), so I was happy with my approach for this exam.

A new Pearson VUE testing centre had recently become available in Milton Keynes, which had (paid) car parking, which was a plus. The exam centre was bright and clean, and the receptionist was welcoming.

I’m not sure why I wasn’t nervous before stepping into the exam centre; I guess this had something to do with the amount of preparation I had done. I think that if I had invested more time, then I would have been more concerned about the outcome.

The exam consisted of 60 questions which had to be completed in two hours. The usual exam format was used, with multiple-choice and drag-and-drop questions. I thought the questions were a fair reflection of the blueprint.

I finished in an hour, clicked the ‘end exam’ button, and was pleased to see ‘You have passed the Architecting Multi-Site HP Storage Solutions – HP0-J67’ with a score of 62%.

The pass mark was 58%, so I left the exam centre with a smile on my face, thinking ‘I have done the optimal amount of studying’ and that I am the definition of the ‘minimally qualified candidate’.

One of my goals for 2015 was to achieve the HP Master ASE accreditation; I’m pleased to say that gaining the HP0-J67 has enabled me to do this.

Value of VDI Assessments

Disclaimer: This is a copy of the post that I made for TechTarget recently.

The past eighteen months have seen huge investment by VMware in the EUC space, with the arrival of Sanjay Poonen and of Horizon (with View) 6, which introduced application publishing in the Advanced edition. Finally we had an emerging contender to the heavyweight, Citrix XenApp.

With this investment from VMware, the past twelve months have seen an increased number of customers looking at virtualising desktops and applications. The first part of the engagement process is to assess whether or not a physical computer is a virtualisation candidate. To do this, we undertake a desktop assessment.

What Is a Desktop Assessment?

First of all, I want to define what is meant by a desktop assessment. For the purposes of this blog post, it is a piece of centralised software that collates information from remote agents installed on end-user devices which are perceived to be candidates for VDI.

There are plenty of tools on the market from a range of providers.

So the question is: what value do these assessments bring to a business that is contemplating a move towards VDI?

Different VDI Guest Operating System

The first question is: are we staying with the same operating system or moving to a new one?

If you perform a VDI assessment on a desktop operating system which is going to be replaced with a newer version, what value are you really obtaining? Not a lot: the applications will most likely require updating to support the new OS, and this in turn leads to different compute and storage requirements.

Same Operating System

If you are going to keep the same operating system, you will get more value from the desktop assessment. However, it’s worth bearing in mind that the results from a desktop assessment often over-inflate your compute metrics, for example:

  • Compute resources used by in-guest Anti-Virus are likely to be offloaded to a host-based alternative
  • Compute and storage resources for Windows updates will often be negated by VDI tools such as PVS, MCS and Linked Clones
  • Applications installed by the end user will most likely be removed from the ‘master image’
  • The VDI ‘master image’ will be optimised, with services, widgets and applications disabled or uninstalled

This can be viewed as a good thing, as you can often show a slightly higher consolidation ratio per physical host.
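To put a rough number on that, here is a back-of-the-envelope sketch (every figure below is invented for illustration) showing how discounting the assessed per-user RAM lifts the users-per-host figure:

```shell
#!/bin/sh
# Invented figures - substitute your own assessment data
assessed_mb_per_user=2048   # RAM per user as reported by the assessment
overhead_discount_pct=15    # estimated saving from AV offload, image optimisation etc.
host_ram_mb=262144          # a 256 GB host

# Discount the assessed figure, then see how many users fit per host
adjusted_mb=$(( assessed_mb_per_user * (100 - overhead_discount_pct) / 100 ))
users_per_host=$(( host_ram_mb / adjusted_mb ))

echo "Adjusted per-user RAM: ${adjusted_mb} MB"
echo "Users per host (RAM-bound): ${users_per_host}"
```

The same discounting applies to CPU; the point is that the assessment’s raw numbers are a ceiling, not a sizing input.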

What about Peripherals?

This is where desktop assessments come into their own. Most IT departments I have spoken to always say ‘yeah, we know what applications and devices our users use’. Yeah, right!

Desktop assessments will tell you which parallel, serial and USB devices are connected to the user’s computer. This gives you the visibility to determine whether a particular user’s device setup is appropriate for VDI.

What about Licensing?

Desktop assessments are good at capturing which applications are used by users and which devices have what software installed. However, they often fall down in a number of areas:

  • Application dependencies, e.g. determining why you have five different versions of Java installed
  • They often only look at whether an executable is launched, not whether an application is used to read or to edit a document, which can have a huge effect on license cost
  • Application readiness and/or virtualisation assessment: will the application work on operating system ‘x’, and is it capable of being virtualised?

This area is often overlooked and requires a large effort from a separate workstream outside of the desktop assessment. Use the information from any desktop assessment as a starting point.

Group Policy

Most desktop assessments rely on an in-guest agent on the end device to capture metrics and pass them back to a central collection repository. So what happens while you are waiting for that agent to start? The answer is simple: nothing. You miss collecting data on anything that happens prior to the agent starting.

When the agent does start, the metrics collected for login and logoff times can be skewed by the group policy applied to the computer object.

Ask yourself: how often is a new OU created for VDI deployments?

What about Storage?

We have already established that the in-guest agent doesn’t start until the operating system is ready, so we have missed the boot IOPS metrics.

Desktop assessments have the ability to capture steady-state information, which is fine as long as there are no other bottlenecks skewing the provided information. For example:

  • Is paging occurring, causing disk I/O to increase?
  • Is the limiting factor the hard drive itself? If unleashed from a 7.2K SATA hard drive, how many IOPS would actually be consumed?
  • Are Anti-Virus scans causing peaks in the provided disk I/O information?
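To illustrate that last point with made-up numbers: a single anti-virus spike drags the mean well above the true steady state, whereas the median is far more robust:

```shell
#!/bin/sh
# Ten invented steady-state IOPS samples for one desktop; the 90 is an AV-scan spike
samples="12 15 14 13 90 16 14 13 15 14"

# Sort the samples, then compute mean and median with awk
result=$(printf '%s\n' $samples | sort -n | awk '
  { v[NR] = $1; sum += $1 }
  END {
    mean   = sum / NR
    median = (NR % 2) ? v[(NR + 1) / 2] : (v[NR / 2] + v[NR / 2 + 1]) / 2
    printf "mean=%.1f median=%.1f", mean, median
  }')
echo "$result"
```

Sizing storage on the mean here would over-provision substantially; sizing on the median, with separate headroom for known peaks such as AV scans, is usually more honest.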

What is the value?

For me, the value in a desktop assessment for VDI is in the following items:

  • Enables you to take a ‘bird’s eye’ view of which users are virtualisation candidates once items such as peripherals are taken into consideration
  • Provides user classification into different classes of resource consumption, e.g. low, medium and high
  • Enables you to determine concurrent logins and logoffs, which can help determine storage sizing requirements
  • Gives you an insight into which applications are used by users
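As a quick worked example of the storage-sizing point above (all numbers are invented), it is often the login storm rather than the steady state that sets the IOPS requirement:

```shell
#!/bin/sh
# Invented figures - replace with your own assessment's concurrency data
concurrent_logins=150       # peak concurrent logins observed
login_iops_per_user=50      # assumed I/O burst during a login
steady_users=500
steady_iops_per_user=10     # assumed steady-state I/O per desktop

login_storm_iops=$(( concurrent_logins * login_iops_per_user ))
steady_iops=$(( steady_users * steady_iops_per_user ))

echo "Login storm: ${login_storm_iops} IOPS"
echo "Steady state: ${steady_iops} IOPS"
```

With these (made-up) inputs the login storm needs more IOPS than all 500 steady-state users combined, which is exactly why the concurrency data from an assessment is worth having.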

Final Thoughts

The desktop assessment does have some value in the VDI world, but it is not a panacea that provides everything you need to know on your journey to VDI.

Do I use desktop assessments? Yes. However, they have a limited use case. Most of the value comes from a pilot and from load testing with products such as Login VSI to determine the density of users per host.

Pre-Sales v Post-Sales

This is a post that I have been meaning to write for a while now, in fact since last year! It follows on from the topics I started in the previous blog posts ‘What’s in a Job Title’ and ‘What’s This Pre-Sales Thing All About?’.

So the question is: who is better? To answer this, I will go over a number of categories that come up during a customer engagement to determine the winner.

I will be using the following two job titles, one each for pre and post sales.

Solutions Architect – Assists sales people across a broad range of products and is a subject matter expert in a particular field. They help translate business needs into technical solutions. Commonly, Solutions Architects guide the customer towards a particular piece of software or technology to meet the business requirement. Some Solutions Architects can act as Lead Architect on a project if required.

Technical Architect – Focused on a particular discipline and often the subject matter expert in that area. These are the people who are engaged to create the ‘low level designs’ in their area of expertise, such as networking, storage, Exchange, Active Directory, System Center, Windows Desktop, vSphere, View etc.

Disclaimer: This is from my experience in which pre-sales and post-sales roles are clearly separated.  Your own experience will naturally differ depending on the size of the environment you work in and your own skill set.

1. Initial Customer Engagement

This is when the sales person engages a consultant to understand the business requirements and then translate them into a technical proposal.

The consultant will most likely be pre-sales. They will qualify the opportunity to determine whether this is something that the company they work for should spend their time on. Ultimately, even though the pre-sales person is seen as a ‘cost of sale’, they take on the responsibility of deciding which opportunities to pursue.

Responsibility: Pre-Sales 10 Post-Sales 0

Overall Score: Pre-Sales 10 Post-Sales 0

2. Customer Meeting

The opportunity is qualified and a meeting is held, attended by the pre-sales person. The purpose of this is to understand the business requirements of the technical solution in terms of Availability, Manageability, Performance, Recoverability and Security. They also gather details on the existing environment, along with any issues that the customer is experiencing.

At this stage, a number of factors come into play and I’m afraid these are all pre-sales.

  • Understanding whom you need to engage with at the customer, as what IT wants isn’t always what the business needs!
  • Rapport building with the customer. I know it sounds corny, but they have to believe in your ability to deliver the goods/services you represent.
  • Soft skills: are you able to listen and to put your point across to C-level and/or technical people?
  • Can you understand exactly what the business issue is and what the customer is asking you to solve?

Responsibility: Pre-Sales 10 Post-Sales 0

Overall Score: Pre-Sales 20 Post-Sales 0

3. Technical Proposal

The creation of a proposal to match the requirements gathered in the customer meeting.  This document dictates the hardware, software and professional services effort that will be used to deliver the solution.

The pre-sales person is responsible for putting together the proposal ensuring that everything is interoperable and supported in the proposed configuration.

The proposal should be validated by multiple post-sales individuals to ratify the proposed solution and confirm the professional services effort (this normally ends up as a tug of war, with post-sales wanting more, the sales person wanting less and pre-sales as the referee!).

The solution is then presented to the customer, usually by the pre-sales person.

Responsibility: Pre-Sales 8 Post-Sales 2

Overall Score: Pre-Sales 28 Post-Sales 2

4.  Customer Workshop

The size of the project that has been won will determine the number of workshops held with the customer. The initial workshop is usually to determine the ‘project definition’ and is attended by the Project Manager, Solution Architect, Technical Architects and the customer.

The Solution Architect takes the lead and covers items such as who the customer is, what they are trying to achieve, and the overall vision for the solution, detailing Availability, Manageability, Performance, Recoverability and Security requirements along with the existing infrastructure. It’s important to note that the post-sales people who reviewed the proposal are not usually the same ones in the workshops.

The Technical Architects will then lead their own workshops based around their subject areas, such as networking, storage, anti-virus, backups etc.

Responsibility: Pre-Sales 5 Post-Sales 5

Overall Score: Pre-Sales 33 Post-Sales 7

5. Low Level Designs

Each Technical Architect will create a low level design for the area that they are responsible for.  The document will include every aspect of the implementation such as firmware versions, diagrams and test plans.  They will also confirm exact requirements for the bill of materials.

The Solutions Architect generally reviews these documents to ensure that they are in the same format, that the customer is referred to by the same name, that the overall requirements are met and that any mistakes are rectified before customer release.

Responsibility: Pre-Sales 2 Post-Sales 8

Overall Score: Pre-Sales 35 Post-Sales 15

6. Implementation

This really is the realms of post-sales, who install and configure the solution and test it with the customer for sign off.

Not much more to say, apart from the fact that either it does what it is supposed to or it doesn’t!

Responsibility: Pre-Sales 0 Post-Sales 10

Overall Score: Pre-Sales 35 Post-Sales 25

7. Technical Ability

Expectations should be that the post-sales technical skill set is higher than that of pre-sales, although pre-sales will often hold the same level of certification with a vendor. Pre-sales often lack the implementation experience, meaning that even though they could perform the installation and configuration, it would take them a couple of days longer than their post-sales comrades.

Ability: Pre-Sales 4 Post-Sales 6

Overall Score: Pre-Sales 39 Post-Sales 31

8. Hidden Ability

I wasn’t entirely sure what to call this section, but these are the hidden things such as appearance, timekeeping, getting back to people, being able to word an email without offending the recipient and communicating to a customer when they are wrong without calling them a plonker!

This part is very subjective. In my personal experience, pre-sales dominate in this area. That’s not to say that post-sales people are not good at this; the good ones just seem to be few and far between. Overall, I have encountered far more post-sales people who are awesome technically, but whom you would only wheel out in front of the customer when the deal is done.

Ability: Pre-Sales 7 Post-Sales 3

Overall Score: Pre-Sales 46 Post-Sales 34

Final Word

So the winner is pre-sales, why is that?

Pre-sales are the key to obtaining, winning and keeping customers. Without pre-sales, we wouldn’t need post-sales. However, if we take this full circle, the actual winner is sales, as without them we don’t have a requirement for pre- or post-sales.

Have your say: who do you think is better?