Today is our 10th anniversary!

Today we are happy to announce we’ve officially been in business for 10 years! We were incorporated exactly 10 years ago today, on July 27th, 2006.

Ten years have passed by very quickly, and they have included many challenges and obstacles. We began by simply providing I.T. services billed hourly, and in 2011 transitioned into a full Managed Services Provider that designs, sells, implements, manages, and supports I.T. solutions and infrastructure.

We’ve built expertise and specializations in technologies such as Storage, SANs, Virtualization, Infrastructure, Disaster Recovery, Remote Office Connectivity, and Security, just to name a few, and have applied that expertise to specific markets such as Homebuilding, Manufacturing, Oil and Gas, Service Providers, and numerous others.

In the past 10 years we’ve provided consulting, services, and advice to over 80 companies; 5 years ago we trimmed that number down to a select group of companies requiring mission-critical infrastructure services and Managed Services.

We’ve partnered with some of the leading companies like HP, HPE, Microsoft, Sophos, IBM, Lenovo, Symantec, and Veritas. These partnerships have enabled us to provide top-notch, best-practice solutions for our clients, letting them manage and support these environments cost-effectively, contributing to their business functionality, and providing a solid foundation for them to work on their bottom line.

We cannot say THANK YOU enough to our wonderful clients, and those who have worked with us in the past. We would also like to thank our vendors and the various channel partner support teams we have worked with during solution design, technical pre-sales, and supporting the products after implementation.


Cheers to another 10 years of success, and cheers to expanding to new markets and areas!


Stephen Wagner


Digitally Accurate Inc.

Ready to Virtualize? Here’s some important considerations…

When a business makes the decision to virtualize, a whole new world opens up. Virtualization improves performance, disaster recovery, uptime, and the management of servers and infrastructure; the list goes on.

However, you need to make sure you virtualize properly!

Often businesses make ill-informed decisions based either on budget, or on misinformation provided by I.T. professionals. I want to talk briefly about a few key points that should be on your mind when designing your solution, before making the jump to a virtualized environment.

1) Performance

In virtualized environments, there are two key resources that usually bog down the infrastructure: storage and memory (RAM).


I’m sure you’re all familiar with the “thinking” light on computers; it’s actually an LED that shows hard drive access (both reads and writes). You know how much it flashes on a single computer while you’re working; now imagine 10-20 virtualized servers using the same storage system, and the LED would be on solid. Enterprise (and server) storage is designed for more throughput and higher speeds, but keep in mind that under normal conditions even enterprise storage is designed for a single system or server to access it and provide resources to the network. This is why you really need to plan out and design your storage system.

All virtualization storage systems should be designed with virtualization in mind. With numerous virtual machines accessing the storage system (or SAN), the SAN not only has to provide high speeds and throughput, but also has to process high IOPS (input/output operations per second). The storage has to respond to these I/O requests at very high speeds, all while providing high throughput of data access to each of the virtual machines. Right now for small/medium sized businesses, we recommend the HP MSA2024 SAN, which was designed for virtualization, provides extremely high IOPS and throughput, and is extremely reliable with dual controllers. The dual controllers allow for multiple links from the SAN to the physical servers, which provides higher throughput as well as redundancy in case a single link goes down.
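To put rough numbers to this, here’s a quick sizing sketch in Python. All the figures (per-VM IOPS, read/write mix, RAID write penalty, per-drive IOPS) are illustrative assumptions I’ve picked for the example, not vendor specifications:

```python
import math

# Rough SAN sizing sketch. The per-VM IOPS, read/write mix, RAID penalty,
# and per-drive IOPS below are illustrative assumptions, not vendor specs.

def backend_iops(vm_iops, read_fraction=0.7, raid_write_penalty=4):
    """Back-end IOPS the disk group must sustain (RAID 5 write penalty = 4)."""
    total = sum(vm_iops)
    reads = total * read_fraction
    writes = total * (1.0 - read_fraction)
    return reads + writes * raid_write_penalty

def drives_needed(iops, iops_per_drive=180):
    """Drive count, assuming roughly 180 IOPS per 15K SAS spindle."""
    return math.ceil(iops / iops_per_drive)

# Fifteen VMs averaging 100 front-end IOPS each:
demand = backend_iops([100] * 15)
print(demand, drives_needed(demand))  # ~2850 back-end IOPS, 16 drives
```

The point of the exercise: 1,500 front-end IOPS balloons to nearly double that at the disk group once the RAID write penalty is applied, which is exactly why a SAN designed for virtualization matters.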

There are a lot of inexpensive NAS/SAN devices out there that advertise as VMware ready, and while they are compatible, you will experience HORRIBLE issues in a production business environment with numerous VMs. I always say that these devices should only be used in labs, testing, or hobby environments. You should expect that your SAN (including drives) will cost more than 1-2 servers.


I can’t stress enough how important it is to load your physical virtualization host servers up with as much memory as possible. Thankfully in the last few years, using the latest generation of HP servers, RAM has become EXTREMELY affordable.

Don’t plan your solution with ONLY the amount of RAM you’ll need for the virtual machines running on a single host. In a 2-3 physical host environment, add up all the RAM all your virtual machines will use (across all hosts), multiply that value by 1.5, and the product is the amount of RAM you should have in each physical host. Keep in mind, if two of your physical hosts die, you’ll want to move all the VMs from the failed hosts onto the healthy host that’s still running.
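The sizing rule above can be sketched in a few lines of Python (the VM allocations in the example are hypothetical):

```python
# Host RAM sizing per the rule above: total RAM of all VMs across the
# cluster, multiplied by 1.5, gives the RAM each physical host should have.

def ram_per_host(vm_ram_gb, headroom=1.5):
    """vm_ram_gb: list of RAM allocations (GB) for every VM in the cluster."""
    return sum(vm_ram_gb) * headroom

# Example: a small cluster running six VMs (hypothetical allocations).
vms = [8, 8, 16, 4, 4, 8]   # GB per VM, all hosts combined
print(ram_per_host(vms))    # 48 GB total * 1.5 = 72.0 GB per host
```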

Further comments:

In most virtualization environments for small businesses, I always recommend loading up each physical host server with two 8- or 10-core Xeon processors. While this isn’t a rule, I always like to make sure that each virtual CPU corresponds with a physical core, and try not to have them shared (although it’s fine if you do).
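That guideline amounts to keeping the vCPU-to-physical-core ratio at or below 1.0, which is easy to sanity-check (the host configuration below matches the dual 10-core example; the vCPU count is hypothetical):

```python
# Quick check of the vCPU-to-physical-core ratio for a host, following
# the guideline of roughly one vCPU per physical core.

def vcpu_ratio(vcpus_assigned, sockets=2, cores_per_socket=10):
    """Ratio of assigned vCPUs to physical cores (1.0 = no sharing)."""
    physical_cores = sockets * cores_per_socket
    return vcpus_assigned / physical_cores

# A dual 10-core host running VMs with 18 vCPUs assigned in total:
print(vcpu_ratio(18))  # 0.9 -> under 1.0, so no core sharing needed
```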


2) Disaster Recovery

This is one of the most overlooked topics in virtualization. A lot of professionals actually believe that snapshots are backups; they are NOT. Snapshots are used for testing and for rolling back after applying patches, and they play a role in the backup process, but they are not actual backups. Snapshots are great, but they aren’t what you’re looking for. I actually prefer not to leave virtual machines in a snapshot state, for performance reasons.

It’s critical to build a disaster recovery solution that actually gets the data (or backup media) off-site. This can be achieved with a backup system that pushes backups over fiber to a remote location, or with software like Symantec Backup Exec that backs up to removable disk, or to tape storage for larger environments. You want multiple point-in-time backups so that you can restore a system or file from 3 months ago. You don’t want to be stuck with a single backup.

A good approach is to use Backup Exec to back the virtual machines up to disk, then to tape. This quickly backs your virtual machines up, then moves/replicates the backup to tapes, which in turn can be taken off-site. I know tape backup technology has been around for a while, but it’s anything but old. The latest tape backup technology can store multiple times more data than removable disks can, offers superior read/write speeds versus hard drives, and the tapes are super easy to transport.
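The retention math behind a disk-to-disk-to-tape scheme is worth working out up front. Here’s a sketch with hypothetical backup sizes: weekly fulls kept 13 weeks (roughly the 3 months of point-in-time restores mentioned above) plus two weeks of daily differentials, on LTO-6 tape (about 2.5 TB native per tape):

```python
import math

# Retention math for a disk-to-disk-to-tape scheme: weekly fulls kept
# 13 weeks plus two weeks of daily differentials. Sizes are hypothetical.

def tapes_needed(full_gb, diff_gb, fulls_kept=13, diffs_kept=12,
                 tape_capacity_gb=2500):
    """Tapes needed for the retained set (LTO-6 native capacity ~2.5 TB)."""
    retained_gb = full_gb * fulls_kept + diff_gb * diffs_kept
    return math.ceil(retained_gb / tape_capacity_gb)

# A 1.2 TB weekly full and 150 GB daily differentials:
print(tapes_needed(1200, 150))  # 7 tapes
```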


In conclusion, keeping these key points in mind will help you implement the best virtualized solution. It’ll get you the best bang for your buck, and help you avoid the problems that many companies hit when virtualizing for the first time.


Call us for more information, and how we can help you get started with virtualization.

Client Configuration Example

In an effort to better explain Digitally Accurate’s capabilities and demo some of the solutions we implement, support, and manage we’ve decided to start putting some of our client configurations online!

This specific configuration we put together for an Oil & Gas company.

Objectives and Requirements:
-Mass Storage (Client requires 5TB+ enterprise grade storage)
-Close to 100% up-time
-Disaster Recovery solution requiring full backups weekly, and daily differentials, must be taken off-site
-Security (Both network and Anti-Virus)
-Remote Access (VPN, RDP, and Mobile devices)
-24×7 365 Unlimited Technical Support (On-site and Remote)
-Remote Monitoring
-IT Management
-Pro-active Infrastructure Management and Maintenance
-24×7 Hardware Monitoring (Client requires immediate replacement)

Solution:
-Microsoft Windows Small Business Server Premium Edition
-2 X HP Proliant DL360 G6 Server
-Sophos Astaro Security Gateway 220 (Full Guard Bundle)
-Symantec Backup Exec 2012
-2 X HP MSL2024 StoreEver Tape Library (1 X LTO-4 SCSI, 1 X LTO-6 SAS)
-2 X HP SmartArray P800 Controller
-HP U320E SCSI Controller
-Dlink DGS-1210-48 SmartSwitch
-Symantec Protection Suite
-APC Smart-UPS XL 3000VA Extended Runtime Uninterrupted Power Supply
-APC Smart-UPS XL 48 Volt Extended Runtime Additional Battery Pack
-Platinum Managed Services from Digitally Accurate


Photos from this configuration:
-HP Rack Keyboard and Video Display
-100TB (compressed) of HP LTO-6 Storage
-HP MSL2024 Tape Libraries with LTO-4 and LTO-6 Tapes
-Sophos ASG220 and Tape Library
-2 X HP DL360 G6 Servers and 2 X MSA60 Storage Arrays
-2 X HP MSL2024 Libraries

Company website has been updated!

As of today, we have rolled out a new website on our corporate site. Any feedback is appreciated!

Check out the new layout and let us know what you think!

Digitally Accurate Inc. hits 6 years in business!

Digitally Accurate Inc. turns 6 years old today! Thanks goes out to our awesome clients and vendors who made this possible!

Visit our website for more information!

The importance of keeping up-to-date IT Documentation (as well as Disaster Recovery plan/policies)

When going into a new potential customer, one of the biggest problems we always run into is that the customer has absolutely no documentation for their IT environment, no disaster recovery plan or policy, and no documents covering licensing (licenses, licensing agreements, details, etc…).

In a healthy IT environment, one of the most important things you can have is documentation. Documentation provides the people who need to know with the information they require to operate, maintain, support, deliver, and protect your IT infrastructure and investment. It also allows anyone with the proper skills and knowledge to perform those services in the event your usual provider is unavailable, terminated, replaced, or otherwise unable to service you.

And No! We aren’t talking about listing usernames and passwords and calling it a day, we are talking about real documentation and a small investment in time. Below is a list of some fundamentals you should have documented:

Please Note: ALWAYS protect your documentation, as it is a key to your entire network. Very few copies should exist, and the ones that do should be kept in separate locations which are SECURE, guarded only by someone who has a vested interest in the company’s well-being and future.

Administrative Credentials – Most importantly, record the main master domain administrative account (username, password, and domain), and any other administrative accounts. If any users are given special administrative privileges, be sure to list them, as their access needs to be revoked if they are ever terminated or leave the company. If not, you are leaving a major security hole open. I’ve seen companies leave full-access administrative accounts open for years after a user left the company; this isn’t good!

Contact Information – Be sure to list contact information for various roles inside the company. Contact information for decision makers, and technical people should be recorded (along with after hours contact info) to make sure that if in the event something major occurs, a decision maker can be reached on the phone. The last thing you want is an emergency to occur, and it not being resolved since an IT person can’t get someone on the phone who can authorize a hardware purchase…

Server Configuration – Your server configuration documentation should include all types of information (no matter how trivial). This includes networking information, administrative account details, details about Active Directory (e.g. domain name, WINS configuration, DNS configuration), computer names, built-in services that may be enabled (e.g. DHCP, DNS, WINS, RDP, RRAS, RIS, etc…), and even DHCP reservations. This provides the technical picture of how things were set up, and what base services your servers provide to your network. Don’t forget your server hardware: list model numbers, serial numbers, and warranty information, along with contact information for initiating warranty cases. Disk configuration, RAID configuration, and firmware info should also be documented. This documentation should allow someone to re-set up your servers from scratch if required.

Network Documentation – Your networking documentation tells the reader how devices are dynamically configured (whether DHCP or BootP), how many devices are on the network, what networks they sit on, who has access to RDP and VPNs, how those services are accessed (IPs, hostnames), and how standard network services are delivered throughout your network. This documentation should explain how your network functions and how data travels across it. And just like with your servers, be sure to include model numbers, serial numbers, warranty information, and the location of the hardware. This is the blueprint in the event the network requires repair, or needs to be rebuilt from scratch.

Network Shares – Document network shares, their file system location on the host, permission types, and the purpose of each share. It’s best practice to present network shares to a group of users using the same network drive letter; document this as well so it can easily be managed, or set up again in Group Policy if required (automatic network drive mapping).

Workstations – Your workstations should be somewhat generalized and centrally managed. Most computers should be running the same software, based on similar hardware, and configured in the same generalized fashion. Be sure to document the process for initial configuration, required software, software configuration, and any other special options that certain users may require. It’s always good to record model numbers, serial numbers, and warranty information so it can be found quickly when needed.

Users – Be sure to keep an active list of users, along with group information, department, the types of access they require, e-mail addresses, and any e-mail distribution lists they may be a member of. NEVER record users’ passwords, as this is a security concern. Any work ever required by an IT professional should be easily accomplished with the administrative credentials, even if the administrator is required to sit down with the user.

Printers – Be sure to record all printers, MAC addresses, and DHCP reservations you have configured for your printers. As always with all hardware, record models, serials, and warranty info. It also never hurts to record driver versions you have installed just in case it may assist with troubleshooting.

Licensing Information – Be sure to keep information on any major licensing agreements you have in place. Also be sure to document URLs, usernames and passwords for any licensing systems that are managed online (such as Microsoft eOpen Online Licensing, Symantec, Astaro, etc…). It’s good practice to record all licensed products, keys, and invoice numbers for the purchase of the license. In the event you may get audited, require a license key for re-installation, etc… this information will be invaluable to get it resolved quickly. This document will also clearly let you know how many licenses you have, who is using them, and what versions of software you are running.

Firewall Configuration – Firewall setup and configuration should be documented, along with licensing keys, port forwarding configuration, and firmware versions.


A separate document that should exist is your “Disaster Recovery” documentation. It goes into detail on what your backup and restore policy is, how you back up your systems, how to restore your systems, and what to do in the event of a total failure where a recovery from scratch is required. If you can’t create a document that provides a step-by-step process to completely restore your business’s IT infrastructure, then you are in trouble! Either find someone who can create the documentation, or implement a disaster recovery solution that can be properly documented. This documentation also lets you test your disaster recovery solution, which you should be doing from time to time, to make sure you can recover from a failure and that the documentation is correct!

While everything above is a good starting point for small businesses, feel free to add any other information you feel could benefit your documentation. All information is good information as long as it is organized and easily readable! Happy documenting!


If you need help with your IT Documentation or IT Services, feel free to contact us!

Virus that only infects your RAM, uses no files, and is very hard to detect! Infects via Java Vulnerability!

We are writing you today to reiterate the importance of keeping your systems up to date. And when we say up to date, we mean all Microsoft updates, 3rd party updates, Java updates, Adobe updates, etc…

Today we became aware of a new virus that ONLY infects your computer’s memory, using absolutely NO files on your hard drive. Since the virus only lives in RAM and uses no files, it’s very hard (maybe impossible for some anti-virus scanners) to detect and remove. The virus is being distributed via an ad network: when you view a legitimate website carrying an advertisement with the malicious code, your computer gets infected.

Good news is, when you restart your computer, you are no longer infected. Bad news is, since you won’t be aware of where you got it, or even the fact you got infected, you are very likely to get re-infected by visiting the same, or similar sites.

The virus infects your computer by presenting code in an advertisement which contains Java. If your Java is not up to date, this code will exploit a vulnerability in older versions, resulting in infection.

This is why it’s always important to keep your software up to date. It also doesn’t hurt to have a firewall (like the Sophos Astaro Security Gateway) that can intercept viruses and malicious code before they cause an infection.


RDP Vulnerability – Update your Microsoft Windows Servers ASAP

Microsoft is urging companies to install a Microsoft Security Update.

MS Security Bulletin MS12-020 is marked as critical and patches a security vulnerability in Remote Desktop Services. The Remote Desktop Protocol (RDP) is used by both users and IT admins.

IT admins use RDP to connect remotely to a Windows Server. Users (you) use RDP to remotely connect to your work computer.

We recommend installing this as soon as possible. A proof-of-concept exploit has already been released, and it is expected that this vulnerability will be widely exploited by hackers and bots immediately, since such a wide range of businesses use RDP and do not actively keep their systems up to date with security updates.


You can install this patch by running Microsoft (and/or Windows) update on your workstation and server, or by visiting the MS bulletin linked above in this article.


The Cloud… Where is IT leading you?

Let’s face it, everywhere we go these days we hear the term “The Cloud”. But what does it mean? What does it do for us? Is it for home users to store their music? Or is it something viable a business can use to enable information flow and do business faster, cheaper, and ultimately better?

Well I’m hoping to shed a bit of light on this topic. Let’s start with what “The Cloud” is exactly…

Original Definition: The Cloud is a group or collection of resources which are available to users on demand. Traditionally this didn’t mean just through the internet, but rather over different types of networks. Basically it was accessing data from a pool of resources which were allocated to provide the data. From the get go you can tell it’s a very loose definition, which can cover technologies that existed before “The Cloud” even existed, odd huh?

Today’s Definition: A means to deliver a software, service, or platform from a cloud of resources available, sold by a reseller or provider. Usually sold on demand by the number of seats (or users), and by the time used.


In the beginning

In the beginning, there was a company pioneering a new technology called virtualization. Virtualization allowed a physical server to run more than one operating system simultaneously. Ultimately this technology allowed one server to actually be 10, 20, essentially numerous servers packed into one physical box. All of a sudden, we had created a pool of servers inside of one server. In a datacenter, there could be numerous physical servers, each running numerous virtual machines (virtual servers). This is where the term “The Cloud” originally started being used.

IT staff could create servers, move servers, re-organize servers in “The Cloud”, and the services and data on these servers inside of this cloud would be provided to users. Essentially you’d be using “The Cloud” of servers.

Virtualization has become an amazing technology which in our opinion is mandatory for any company with multiple servers. In a way you could say it’s a next-generation technology that’s available today!

And let it be said we fully agree with and support these technologies. Digitally Accurate specializes in virtualization, and this virtualization technology helps businesses every day perform business faster, better, and more effectively.


Then what?

Over time, as more companies adopted virtualization, the term “The Cloud” caught on more and more. People were doing more and more with making information available anywhere: whether it was music, business data, or pictures, you could access it from anywhere!

Now fast forward to the recession. Lots of large corporations minimized their IT budgets, and the majority of small to medium sized businesses virtually eliminated theirs. IT solution and support providers were no longer making sales, or selling solutions that now cost what was considered a fortune. In the interest of self-preservation, IT providers had to find a new way to make money, and the providers still doing well during the recession needed a way to target small to medium sized businesses within their new budget limitations. With their expertise in these technologies and understanding of how they worked, IT providers figured out that their potential clients were more interested in penny pinching than looking at the bigger picture. Why not develop a solution that let these companies pay small amounts monthly, secured by a long-term (inescapable) contract, while the IT companies provided the services that would normally be provided by the customer’s own infrastructure?

This allowed the IT provider to deliver services from their own servers to multiple clients, squeezing every ounce of performance possible out of their equipment. The recurring revenue secured their existence, and the length of the locked-in contracts made sure clients couldn’t leave once they were drawn in by the low setup and monthly costs of the service.

Often, clients weren’t sold everything they were looking for, and since they were locked into a contract they couldn’t change providers after being duped. Essentially, they would have to pay more and more to actually get the services they thought they were getting in the first place. This let the IT providers’ initial quotes remain low and look cost-effective, while opening the door to major profits once contracts were signed.

Over time this model caught on, a whole industry was created, from the companies that owned the servers in the data center, to the companies that setup, managed and maintained the physical servers, to the companies that applied “The Cloud” or “Software as a Service” model and provisioned the cloud, to the company that ultimately resells the services.


Fast Forward to Today

Businesses are often lured in by the cheap setup costs and cheap monthly costs thinking they are getting everything their businesses need. The moment a business signs off on a contract to implement these services, they are usually immediately passed off to another company up one tier in the Cloud industry I mentioned above. Usually all support, maintenance, and other services are provided by an entirely different company than the one which you signed the contract with.

The reseller has officially made their profit off your signature and can now walk away. In some cases you won’t hear from them unless you need to purchase more services, licenses, etc… Ask yourself, is this a business relationship? Or more like a door-to-door salesman?

Most companies aren’t told that accessing all their information from a server who-knows-where requires a lot of bandwidth. Even simple things like e-mail and Word documents, multiplied across numerous users, can grind your productivity to a standstill unless you upgrade your internet connection to handle it (wow, there’s some more additional costs).

But wait, what if the internet goes down? Well, you’ll need a redundant connection, or you won’t have access to any of your data (wow, more costs?). Make sure it’s fast too, so you don’t have people waiting 30 minutes to open an Excel spreadsheet.
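A back-of-the-envelope estimate makes the hidden bandwidth cost concrete. The per-user rate and peak factor below are assumptions chosen purely for illustration, not measurements:

```python
# Back-of-the-envelope bandwidth estimate for a cloud-hosted office. The
# per-user rate and peak factor are assumptions for illustration only.

def office_bandwidth_mbps(users, mbps_per_user=0.5, peak_factor=2.0):
    """Sustained per-user demand scaled up for peak-hour concurrency."""
    return users * mbps_per_user * peak_factor

# 25 users pulling e-mail and documents from a hosted server:
print(office_bandwidth_mbps(25))  # 25.0 Mbps at peak
```

And remember, whatever number you arrive at, you need it twice if the connection has to be redundant.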

What about disaster recovery? Do you know if the company is actually backing up your data? Do you know what their policies are on recovering that data? In this industry, it’s about signatures and profits. With clients pushed through the pay door, do you think their main focus is precisely backing up your data, and all their other clients’ data?

Where is your data being stored? Lots of cloud providers actually use data centers in the United States. Do you need to be concerned about the Patriot Act?

Have you heard all the news about the shutdown of the “MegaUpload” service? That was a cloud-based service. It got shut down; do you think paying customers were given the opportunity to retrieve their data before it was taken offline? And wait, the same server that is hosting your data might be hosting illegal content?

What about security?


What if, what if, what if…


The Cloud… Where is IT leading you?

I leave you with this thought…

Cloud based services to share music, pictures, and dumb personal stuff? Great, awesome technology, simple, etc… Nothing too important. If it goes down, who cares.

Cloud based services for your business, confidential information, intellectual property, financials, mission critical information? Wait a sec, this reminds me of Russian Roulette.


Is a SAN out of your budget? Think again!

With many businesses entertaining the idea of technologies such as virtualization and/or high availability clustering, often what holds them back is the cost of implementing a SAN.

A SAN, for those of you who don’t know, is a “Storage Area Network”. The network and the equipment in it provide the “shared storage environment” often required for SQL 2008 clustering, virtualization (for features such as High Availability and live migrations), etc…

Lately, on my personal blog I’ve been going into quite a bit of detail regarding emerging iSCSI target technologies and ways to minimize the cost of implementing a SAN. While most of my focus has been on LIO-Target, a revolutionary Linux-based iSCSI target, a new player has entered the field: the “Microsoft iSCSI Software Target”, which as of now is free!

While I’ve done extensive testing with LIO-Target and found it to be fully functional, promising, and extremely stable (and it has recently become part of the mainline Linux kernel), I’m interested in what Microsoft has to offer. For those of you looking at purchasing or building a SAN, this now offers a second choice.

This is great news that will help a lot of companies re-evaluate their decision not to implement a SAN due to cost.

More Information on Microsoft iSCSI Target for free at: