Citrix Announces GoToManage Monitoring for XenServer

Citrix Systems announced Citrix GoToManage Monitoring for XenServer. The latest capability of GoToManage solves a key challenge for SMBs and IT consultants in monitoring both physical and virtual servers distributed among dispersed offices and data centres.

GoToManage is the new integrated offering for IT professionals and SMBs who need to monitor and manage a range of devices in support of increasingly distributed, remote and mobile workforces. It builds on Citrix’s number one market position in remote support to provide a new complete remote support and monitoring solution for both attended and unattended computers.

GoToManage consists of a real-time remote support module and a monitoring module, which provides detailed real-time server monitoring, inventory and software discovery, and network tracking. The addition of GoToManage Monitoring for XenServer offers a new single view to track the health and performance of all key physical and virtual IT infrastructure. The service is cloud-based, enabling SMBs to tackle the complexity of managing their servers without the added expense of deploying expensive hardware and software probes.

What Is New for GoToManage:

  • Tracks all XenServer host and virtual machine configurations, monitoring their health and performance
  • Provides one unified view of all distributed physical and virtual machines
  • Discovers and inventories physical and virtual servers, with detailed information about their configuration, health and performance
  • Provides instant visibility into what virtual machines are running on a given host and detailed information about them

Why it matters

  • Makes what was invisible visible, by presenting detailed availability and latency information on all aspects of Windows®, Linux®, and XenServer servers
  • Eliminates downtime via automated alerting based on customer-defined critical machine and process thresholds
  • Real-time resource monitoring helps to predict capacity bottlenecks and improve overall utilization
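The customer-defined threshold alerting described above can be sketched in general terms. This is an illustrative example only, not GoToManage's actual API; the metric names and limits are assumptions:

```python
# Illustrative sketch of threshold-based alerting, not GoToManage's
# actual API. Metric names and limits below are assumptions.

def check_thresholds(metrics, thresholds):
    """Return an alert message for every metric that exceeds its limit."""
    alerts = []
    for name, value in metrics.items():
        limit = thresholds.get(name)
        if limit is not None and value > limit:
            alerts.append(f"ALERT: {name} at {value} exceeds threshold {limit}")
    return alerts

# Example: readings from a monitored host against customer-defined limits
readings = {"cpu_percent": 92, "memory_percent": 60, "disk_percent": 97}
limits = {"cpu_percent": 85, "memory_percent": 90, "disk_percent": 95}

for alert in check_thresholds(readings, limits):
    print(alert)
```

In a real monitoring service the alerts would feed a notification channel (email, SMS, dashboard) rather than being printed.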

“With the rapid growth of cloud computing and virtual infrastructure, IT professionals need a single solution to monitor both physical and virtual machines. Our new XenServer monitoring capabilities take only a few minutes to set up and reinforce our commitment to making the lives of SMBs and IT consultants easier,” said Elizabeth Cholawsky, general manager and vice president of IT Services for the Online Services Division of Citrix.

“SMBs and IT professionals increasingly need to provide support to their customers in distributed locations in an affordable and efficient way. Online remote monitoring and management tools such as Citrix GoToManage Monitoring for XenServer that can visualise, monitor and arrange physical and virtual server resources can cut costs for SMB IT professionals, service providers and their customers,” said George Hamilton, principal analyst at Yankee Group.

Pricing and Availability

The XenServer functionality is included at no additional cost for all GoToManage Monitoring customers. Also, GoToManage Monitoring has a free version that allows anyone to monitor two servers and inventory and track up to 18 other devices (e.g. computers, switches, routers).


Source: VMblog


APCS is a premier IT consulting services and solutions provider that helps clients plan, build, optimize, and support mission critical IT infrastructure.
We specialize in Microsoft, Citrix, VMware, Symantec, and other IT products and professional services.

For more information: +971 2 6350210


vSphere vs Hyper-V vs XenServer

Virtualizationmatrix has done a good job of listing vSphere vs Hyper-V vs XenServer features. You can compare them and even switch between versions.

Why the Cloud Is Actually the Safest Place for Your Data

Simon Crosby is the CTO of the datacenter and cloud division at Citrix Systems, Inc. He was founder and CTO of XenSource prior to the acquisition of XenSource by Citrix. You can read more on his blog and also follow him on Twitter @simoncrosby.

Worried about your data? If you’re not, you’re kidding yourself. It’s become clear over the past few months that the risk of security breaches has reached a new and frightening level — from sophisticated tools in the hands of national governments and organized crime to spontaneous attacks harnessing the resources of thousands of loosely connected vigilantes. Add to that the dizzying array of devices now used to access, move and store data. Security strategies that seemed airtight only a few years ago now look like so much Swiss cheese.

In this light, your first instinct might be to pull back from cloud computing, viewing it as inherently less secure than keeping data and applications locked into hardware. After all, the word “cloud” itself implies that your precious assets are out there floating around somewhere, right? It’s an understandable reaction and one that couldn’t be more wrong. In fact, the cloud is now the safest place for your data.

Think about it: Data is lost when an organization loses control over it, including how it’s stored, how it’s transmitted, and what end users do with it. Clouds, and the virtualization technologies on which they run, give you back that control, from data center to delivery to endpoint.

Deliver User Experiences, Not Vulnerable Data

A key tenet of security is making sure data doesn’t go astray when it leaves the enterprise. But what if data never left the enterprise in the first place? Desktop virtualization means that all data, applications and state remain centralized; users can access an immersive experience indistinguishable from traditional computing (actually even better in some regards, like instant-on apps) using either a hosted desktop or application experience, or a rich client experience. IT gains precise, granular control over applications and data. Everything is encrypted at rest, using keys that never leave the data center. Meanwhile, full back-end automation means less human involvement and less human involvement means less chance of things going wrong.

A locked-down data center is all well and good, but how are workers supposed to be productive if they can’t move data around? With virtualization, data is available from multiple points. Accordingly, there’s never a reason to save anything to removable media (like the kinds that seem so often to fall into the wrong hands). A good desktop virtualization solution lets you set policies as to what kinds of client-side devices can be used, from thumb drives to printers.

What about offline use? No problem. Any data delivered to the desktop cache remains encrypted at all times, and IT holds the keys. Lost laptop? Disgruntled employee? Hotel room theft? Not to worry.

A New Perspective on Endpoint Security

A moment of silence, please: Traditional endpoint security is dead. It’s simply no longer possible to detect attackers faster than they can mutate, and managing antivirus protection guest-by-guest can’t possibly scale. It’s also fundamentally incompatible with virtualization, since we can’t have every endpoint in the organization trying to update a centralized attack file and index its virtual hard disk at the same time. Symantec, it’s time to rethink your business.

What if we take the reverse perspective? If we can’t make data invulnerable, what if we make attacks less relevant by ensuring that each endpoint is in its best possible state? When a hypervisor is booted, one of the first things it does is check that it hasn’t been modified since it was last signed by its creator. The same applies for each virtual machine. After each login, each VM is returned to its original state, so attackers have no way to gain a foothold in your environment. This approach — essentially, moving from blacklisting to whitelisting — is a fundamental shift in endpoint security.
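The boot-time check described above, refusing to run any component that has been modified since it was signed, can be sketched roughly as a hash comparison. This is a simplified illustration of the whitelisting idea, not any hypervisor vendor's actual verification code:

```python
import hashlib

# Simplified sketch of boot-time integrity checking (whitelisting):
# a component only runs if its fingerprint matches the value recorded
# when it was signed. Not any vendor's actual implementation.

def fingerprint(data: bytes) -> str:
    """Compute a SHA-256 digest of a component image."""
    return hashlib.sha256(data).hexdigest()

def verify_before_boot(image: bytes, known_good_hash: str) -> bool:
    """Allow boot only if the image is byte-identical to the signed one."""
    return fingerprint(image) == known_good_hash

original = b"hypervisor-image-v1"
signed_hash = fingerprint(original)  # recorded at signing time

print(verify_before_boot(original, signed_hash))         # unmodified image
print(verify_before_boot(b"tampered-image", signed_hash))  # modified image
```

Real implementations use cryptographic signatures anchored in hardware (e.g. a TPM) rather than a bare hash, but the principle, verify against a known-good state instead of scanning for known-bad patterns, is the same.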

There’s still an important role for the security vendors to play in making virtual desktop security simpler and more scalable for large enterprise deployments, such as integrating in-hypervisor threat detection into both client-side and server-side virtualization products. Some of the top security providers are already doing exactly this, working in tandem with virtualization solution vendors. More will follow suit or find themselves stranded in an outdated and shrinking space.

Deny DoS Attackers

Even the best data security can’t protect against a denial-of-service attack. You know what can? Truly massive perimeter control. But don’t start pouring your own concrete yet. Why do you think people started keeping their money in a bank instead of at home? Because the bank has a better safe. So does Amazon. It’s even better, as we’ve seen, than PayPal and Visa. The largest cloud providers have defense resources far beyond anything you could match in your own datacenter.

Any way you look at it, the bottom line is clear: The online world may be getting more dangerous by the day — but the cloud is safer than ever.


Source: Mashable

SaaS vs. Traditional ERP: Five Key Differentiators

The beauty of modern ERP packages is that you have options to choose from during your ERP software selection. You can either deploy a solution by hosting it internally on your own servers (“traditional ERP”), or perhaps you would rather not deal with the software and have it hosted somewhere else (“Software As A Service” or “SaaS”). As part of your ERP selection process, however, you should be aware of five key variables that will ultimately factor into the decision that is right for you:

1) Simplicity. In general, SaaS is simpler to deploy from a technical perspective. Because you don’t need to purchase additional servers or install the software yourself, it can be an easy and quick means of deploying the software. On the other hand, the high level of technical ease may create additional business complexities that you may not otherwise experience with traditional ERP (see #2 below).

2) Flexibility. Because traditional ERP is installed on your servers and you actually own the software, you can do with it as you please. You may decide to customize it, integrate it with other software, etc. Although any ERP software will allow you to configure and set up the software the way you would like, SaaS is generally less flexible than traditional ERP in that you can’t completely customize or rewrite the software. Conversely, since SaaS can’t be customized, it reduces some of the technical difficulties associated with changing the software.

3) Control. Because of #2 above, many companies find that they don’t have as much control over SaaS software as they would like, relative to traditional ERP. This is especially true of mid-size or large companies with well-defined business processes that cannot be changed to fit the software. Smaller companies are generally able to adapt their business processes to the software more easily than a larger organization.

4) Accessibility. Since SaaS is entirely accessed through the web, you are in a world of hurt if the internet goes down. Alternatively, traditional ERP does not require internet reliability, provided your users are accessing the software from inside your company’s network.

5) Cost. In general, SaaS can be deployed at a much smaller initial cost, which can be attractive to smaller businesses. However, the ongoing annual payment can be higher for SaaS because you’re paying to use the software. Much like leasing vs. buying a car, that payment never goes away as long as you’re using the software and can become costly as you grow and add employees to the system.
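The lease-vs-buy tradeoff in the cost point above can be made concrete with a simple break-even calculation. All figures here are illustrative assumptions, not actual vendor pricing:

```python
# Illustrative SaaS vs. traditional-ERP break-even calculation.
# Every price below is an assumption for the sake of the example.

upfront_license = 100_000    # traditional ERP: one-time license fee
annual_maintenance = 15_000  # traditional ERP: yearly support contract
saas_per_user_year = 600     # SaaS: yearly subscription per user
users = 50

def traditional_cost(years):
    """Cumulative cost of owning the software for `years` years."""
    return upfront_license + annual_maintenance * years

def saas_cost(years):
    """Cumulative cost of subscribing for `years` years."""
    return saas_per_user_year * users * years

for years in range(1, 11):
    t, s = traditional_cost(years), saas_cost(years)
    winner = "SaaS" if s < t else "traditional"
    print(f"Year {years}: traditional={t:,} SaaS={s:,} -> {winner} cheaper")
```

With these assumed numbers SaaS stays cheaper for the first six years, after which the cumulative subscription overtakes the one-time license, exactly the "payment never goes away" dynamic the article describes.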

Clearly, there are tradeoffs between the two options. The above five factors should be thoroughly prioritized and evaluated as part of any effective ERP software selection project.


Source: Toolbox




Microsoft and HP launch a series of datacenter appliances

The partnership between HP and Microsoft to develop new appliances for the datacenter came to fruition with the introduction of the first of four appliances based on HP hardware and Microsoft software. The first appliance to ship is the HP Business Decision Appliance, which is available today.

This BI device combines Microsoft’s Windows Server, SQL Server, and SharePoint in a single box designed to be up and running in less than an hour. Focused on the analysis and decision making processes that define a business intelligence solution, this first appliance will allow for a rapid implementation of BI systems throughout an organization, with a standardized set of applications and hardware.

Due to be released within the next 6 weeks is the second appliance from this collaboration, the HP E5000 Messaging System for Microsoft Exchange Server 2010. This appliance uses the latest in Microsoft messaging software on new HP hardware with support for significant DASD capacity, allowing a plug-and-play email solution to be deployed in your datacenters. The appliance includes the fully redundant hardware and support for database availability groups that allow Exchange Server 2010 to continuously replicate data and support a highly reliable and available email installation. The appliance will be sold in pre-sized versions starting with a 500 mailbox unit and growing to 3000 mailboxes, with the ability to span across multiple appliances for larger environments.

The remaining two appliances, scheduled for 2011, are the HP Database Consolidation appliance and the Business Data Warehouse appliance; like the Business Decision Appliance, both are based on Microsoft SQL Server. Actual release dates for these devices have not yet been announced.


Source: ZDNet