Thursday, November 25, 2010

vCO - get performance data from VM & build graph

Retrieving performance data from an entity is a bit complex, not least because of the nested parameters. To keep the code short and make it easier for beginners to step in, we just grab the average CPU usage in MHz of a VM over the last hour and build a graph like this:
For this example the code is placed in one scriptable task. For production purposes it is more sensible to split it into several actions & workflows.
  • Input Parameter: VM [VcVirtualMachine]
Part I - retrieve performance data

setting interval:
var end = new Date(); // now
var start = new Date();
start.setTime(end.getTime() - 3600000); // 1h before end
System.log (end.toUTCString());
System.log (start.toUTCString());
Look at the logged time stamps. They are in UTC, and so are the time stamps later in the graph. If you want to adjust this to client time, you have to recalculate the time stamps for the CSV using Date.getTimezoneOffset().
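A minimal sketch of that recalculation in plain JavaScript (so it also works outside vCO) - getTimezoneOffset() returns the offset in minutes, so shifting the Date by offset * 60000 ms makes a UTC-formatted stamp read as client-local time:

```javascript
// Shift a Date by the client's time zone offset so that formatting it
// in UTC (as the CSV and the graph do) shows local wall-clock time.
function toLocalTimestamp(d) {
    // getTimezoneOffset() is in minutes, negative east of UTC (e.g. -60 for UTC+1)
    return new Date(d.getTime() - d.getTimezoneOffset() * 60000);
}

var end = new Date();
var localEnd = toLocalTimestamp(end);
// localEnd.toUTCString() now reads as the client's wall-clock time
```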

create querySpec (here for only one VM)
var querySpec = new Array();
querySpec.push(new VcPerfQuerySpec());
querySpec[0].entity = VM.reference;
querySpec[0].startTime = start;
querySpec[0].endTime = end;
querySpec[0].intervalId = 20; //or use 300 for 5 minute stepping
create perfMetricId for one metric (CPU average in MHz) and call perfManager

var PM = new VcPerfMetricId();
PM.counterId = 6; //6 = cpu.usagemhz.average
PM.instance = ""; // no instances
var arrPM = new Array();
arrPM.push(PM); // don't forget to add the metric to the array
querySpec[0].metricId = arrPM; //assign PerfMetric to querySpec
querySpec[0].format = "csv";

var CSVs = VM.sdkConnection.perfManager.queryPerf(querySpec);
Now the array CSVs contains one VcPerfEntityMetricCSV object - we only queried one entity - nevertheless I will iterate over CSVs so you can reuse the code.
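Note that counter ID 6 is not guaranteed to map to cpu.usagemhz.average on every vCenter; you can resolve the ID from perfManager.perfCounter instead. The lookup helper below is plain JavaScript and my own sketch - in vCO you would pass in VM.sdkConnection.perfManager.perfCounter:

```javascript
// Find a performance counter ID by group, name and rollup type.
// 'counters' is an array of counter descriptions, each shaped like a
// VcPerfCounterInfo: { key, groupInfo: { key }, nameInfo: { key }, rollupType }
function findCounterId(counters, group, name, rollup) {
    for (var i in counters) {
        var c = counters[i];
        if (c.groupInfo.key == group && c.nameInfo.key == name && String(c.rollupType) == rollup) {
            return c.key;
        }
    }
    return -1; // not found
}

// In vCO (assumed usage):
// var id = findCounterId(VM.sdkConnection.perfManager.perfCounter, "cpu", "usagemhz", "average");
```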

Part II - join data and time stamps in a CSV file
for (i in CSVs) {
    var CSV = CSVs[i];
    var Temp = CSV.sampleInfoCSV.split(",");
    var Sample = new Array();
    for (j in Temp) {
        if (j % 2 != 0) {
            //only use odd entries, they contain the sample time - even ones contain the interval
            Sample.push(Temp[j]);
        }
    }
    var Values = CSV.value[0].value.split(","); // the MHz values
    var BaseName = "C:/Test/" + VM.name; // the identifier was lost in the original post - the VM name is an assumption
    var CSVname = BaseName + ".csv";
    var ControlName = BaseName + ".control";
    var PNGname = BaseName + ".png";
    var FW = new FileWriter(CSVname);
    FW.lineEndType = 1;
    FW.open(); // open before writing
    for (j in Sample) {
        FW.writeLine(Sample[j] + "," + Values[j]);
    }
    FW.close();
}
Using a unique identifier to build the file names makes them individual, so you can run the workflow in parallel without getting duplicate file names.
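With 20-second sampling, the resulting CSV should look roughly like this (the MHz values here are made up for illustration):

```
2010-11-25T10:00:20Z,1234
2010-11-25T10:00:40Z,1250
2010-11-25T10:01:00Z,1198
```

The timestamp format is exactly what the gnuplot timefmt "%Y-%m-%dT%H:%M:%SZ" in Part III expects.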

Part III - generate graph
To do this there are some requirements:
  • enable local execution for vCO
  • download gnuplot and unzip - in this example it is unzipped to C:\test\gnuplot
First we have to build the control file that tells gnuplot how to render the graph. I've changed only some basic parameters - if you are familiar with gnuplot, just add more parameters to get a better look. At the end we simply call gnuplot with our control file.
Pay attention to the quotation marks. We have to mix single and double quotes to get double quotes into the control file.

    var FW = new FileWriter (ControlName);
    FW.lineEndType = 1;
    FW.open(); // open before writing
    FW.writeLine ('set datafile separator ","');
    FW.writeLine ('unset key');
    FW.writeLine ('set title "performance data ' + VM.name + '"'); // the original title also contained a bracketed identifier that was lost in formatting - VM.name is an assumption
    FW.writeLine ("set terminal png");
    FW.writeLine ('set output "' + PNGname + '"');
    FW.writeLine ("set xdata time");
    FW.writeLine ('set timefmt "%Y-%m-%dT%H:%M:%SZ"');
    FW.writeLine ('set format x "%H:%M:%S"');
    FW.writeLine ("set xtics rotate");
    FW.writeLine ('set ylabel "MHz"');
    FW.writeLine ('plot "' + CSVname + '" using 1:2 wi li');
    FW.writeLine ("quit");
    FW.close();
    var cmd = "cmd.exe /c C:\\Test\\gnuplot\\binary\\gnuplot.exe " + ControlName;
    System.log (cmd);
    var CMD = new Command(cmd);
    var Result = CMD.execute(true);
    System.log ("GNUplot: " + Result);

That's all - feel free to leave a comment - regards, Andreas

Tuesday, November 23, 2010

vCO - rights management for WebViews

Yesterday I tried to publish a WebView which should be used by only one user group of my Active Directory. After a few attempts I decided to describe the whole process with some pictures. So the goal for today's article is to grant access only to the workflows under "Customer2".

At first you have to define the rights at the root object (Edit access rights....):

Because of rights inheritance you have to enable at least the "View" right for all objects.

In my case I set a few more. After setting the rights for my user group "Benutzer", which is an Active Directory group, every folder in my hierarchy inherits the rights. If you log in to the WebViews portal, for example, every user in "Benutzer" can view, execute and inspect all folders.

When setting the rights at the root object is done, you have to edit the access rights for the folders you wish to hide. Similar to the steps at the root object, you select "Edit access rights..." on the folder you want to hide. As you can see, the folder has inherited its rights from the parent object (root). Now you have to set the rights, or rather the restrictions, on the folder.

Restrictions on child objects are set by deselecting the rights. So deselect all rights and choose the same user group "Benutzer" as before.

After that you can verify the settings and press "Save and Close". Now do the same step for all folders you want to hide.

In my example the "Customer2" folder is a child of "Customer". Therefore its parent folder "Customer" needs all the rights set in the root object. If you change the rights here it will affect the child folders! Next we hide the "Customer1" folder because my users should only see workflows in "Customer2". You can do this exactly the same way as for the other folders.


As before, we select "Edit access rights..." and deactivate all rights for the "Customer1" folder.

After that the child object has no rights, and these take precedence over the parent object's. On the "Customer2" folder you have nothing to change (if the parent rights in root are the right ones), because it is visible and the workflows can be executed and inspected.

Now you can logon at the WebViews portal with a user from the Active-Directory group you have enabled ("Benutzer").

In my case the user "Raketen RJ. Joe" can now create a simple virtual machine with his user rights in my vCenter Orchestrator WebViews :-)
I hope this simple instruction helps you to design rights management for your administration or user team.

Monday, November 15, 2010

ESXi - esxtop/resxtop and perfmon

Last week I had an appointment with one of our customers who wanted to know more about esxtop/resxtop. So I decided to write down some things that are, in my eyes, important to know. Because of its simple display, esxtop wasn't my tool of choice for troubleshooting in the past. But with the enhancements in vSphere 4.1 there are some useful possibilities.

With this article I want to show how to use esxtop in batch mode and how to evaluate the collected data in perfmon.

First we start with the SSH connection to the host (don't forget to enable it in the security settings of the host configuration screen).

After that we start esxtop and create a new configuration file which only collects CPU metrics. You can do this easily by pressing "W" and naming the new configuration file - //.esxtop_cpu in my case.

After the configuration file is written you can close the esxtop screen by pressing "q". Now you can start the data collector with a simple command:

esxtop -b -c //.esxtop_cpu -n 10 > testcpu_10intervals.csv

The "-b" switch enables batch mode, the "-c" switch specifies the configuration file, the "-n" switch sets the number of intervals (be careful with larger values because of the disk usage), and the ">" redirects the data into the .csv file.

After the execution has finished there should be a .csv file with the collected data. We copy this file to a Windows workstation and open perfmon (in my case on Windows 7). Inside the performance monitor there is a button called "View Log Data" which allows you to import the .csv file.

After adding the .csv file you have to add the data (data tab)...

and the display options (Graph tab) in the performance monitor.

After the performance metrics (in my case: physical CPU (0,1)) are chosen, the graph is shown in the performance monitor. Now you are able to look at the ESXi performance data in a historical graph.

Sunday, November 14, 2010

vCO - XML post with pre-authentication

If you want to post XML data to web services which need pre-authentication, you need a workaround, because vCO does not support this feature.

In my case, I had to post XML data to CA Unicenter without an activated SOAP interface; only authenticated POST was available. So I wrote a little app, XMLpost, and call it from vCO.
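The XMLpost tool itself is not shown here, but the core idea of pre-authentication is to send the credentials preemptively in a Basic Authorization header instead of waiting for a 401 challenge. A self-contained sketch of building such a header in plain JavaScript (the helper names are my own):

```javascript
// Build the value of a preemptive HTTP Basic "Authorization" header.
// Base64 is implemented inline so the sketch has no dependencies.
var B64 = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

function base64Encode(s) {
    var out = "";
    for (var i = 0; i < s.length; i += 3) {
        var c1 = s.charCodeAt(i);
        var c2 = i + 1 < s.length ? s.charCodeAt(i + 1) : NaN;
        var c3 = i + 2 < s.length ? s.charCodeAt(i + 2) : NaN;
        out += B64.charAt(c1 >> 2);
        out += B64.charAt(((c1 & 3) << 4) | (isNaN(c2) ? 0 : c2 >> 4));
        out += isNaN(c2) ? "=" : B64.charAt(((c2 & 15) << 2) | (isNaN(c3) ? 0 : c3 >> 6));
        out += isNaN(c3) ? "=" : B64.charAt(c3 & 63);
    }
    return out;
}

function basicAuthHeader(user, password) {
    return "Basic " + base64Encode(user + ":" + password);
}
```

The resulting string goes into the "Authorization" request header of the POST.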

Read more in the original post here:

Saturday, November 13, 2010

VMware VCAP-DXA exam experience

Today was the day! After travelling the long way to Frankfurt and reading every guide from VMware on the train, I arrived at the VUE test center around 1:00 pm. After the normal check-in procedure I took my seat in front of an old DELL client. After 10 survey questions the exam began...

After 3 hours and 30 minutes my eyes were dry and red and my head began to burn. I think it was one of the hardest exams I have taken in the last years. So I'll try to give you an overview of what drove me mad:

1. Building performance reports and statistics
I think nearly every question had to do with performance metrics, performance reports, utilization reports and so on. There were also several questions about building custom reports and performance graphs, like: build a DRS cluster, put the hosts into it and design a performance chart for all memory operations (average, used etc.). Sometimes the description doesn't match the actually available options...

2. vMA administration
Another strong part were the vMA questions. Several tricky things were asked: delay in the vMA input via the vSphere Client console, esxupdate with NIC drivers (did not work in my test environment!), verifying a ks.cfg file (it was not available on my hosts, but should be under /tmp on one of them!), using esxcli to create a SATP ALUA rule ("New Array" does not work as a name because of the space!)

3. Scripting
In my case there was only one script to build: find all VMs with a CD-ROM attached and write the list to a file.

4. Standard operations
Most things I had to do were standard vSphere operations: build a vDS with several port groups, change uplink orders in vDS port groups, build a vApp and a resource pool with explicit configuration tasks (start order, reservation under 25%, 5 VMs starting without expandable resources).

In my case there were no questions about vShield Zones, vCenter Orchestrator or Linked Mode, but I really missed them because the vMA questions were really hard! I think I will need a second attempt to get the certification :-(

UPDATE: Today the mail arrived and what can I say: I passed the exam!!!

vCO - dvPortGroup, uplink or not

I had to determine whether a distributedVirtualPortGroup (dvPG) is an uplink or not, e.g. to filter uplinks out of a datacenter network list.

I found nothing suitable, so I wrote this action:
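The action itself is only linked below, but one common approach is to look at the dvPortGroup's tags: uplink port groups carry the system tag "SYSTEM/DVS.UPLINKPG" in their ManagedEntity tag array. A sketch of that check (the helper name is my own; in vCO you would pass the dvPG object itself):

```javascript
// Check whether a distributed virtual port group is an uplink port group.
// Uplink dvPGs carry the system tag "SYSTEM/DVS.UPLINKPG" in their tag
// array (each entry shaped like { key: "..." }).
function isUplinkPortGroup(dvPG) {
    var tags = dvPG.tag;
    if (tags == null) return false;
    for (var i in tags) {
        if (tags[i].key == "SYSTEM/DVS.UPLINKPG") {
            return true;
        }
    }
    return false;
}

// Filtering a network list would then be e.g.:
// var nonUplinks = allPortGroups.filter(function (pg) { return !isUplinkPortGroup(pg); });
```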

You can find the original post here.

Thursday, November 11, 2010

HDS HNAS Platform 3x00 - undocumented NFS feature

If you are using an HNAS from BlueArc (HDS) and a vSphere environment which supports thin provisioning, there is a possible problem: while deploying a virtual machine with thin-provisioned disks (thin provisioning is displayed as supported for the datastore), the deployed virtual machine shows thick virtual disks after creation.

If this happens, there is an undocumented feature for the NFS server parameters:


After the counts are enabled, the reporting of the provisioned space in the vSphere environment is correct.

Saturday, November 6, 2010

white paper - vCO and secure ldap

Based on the problems at one of our enterprise customers, I decided to release a white paper which describes the possibilities for implementing vCenter Orchestrator in a secure LDAP environment.

The download is available at our Mightycare website:

Please note that the described solution is a workaround; I prefer the more secure PKI-based solution.