Since migrating my blog to the new service provider, I have run into a few problems.
WordPress.com Stats Plugin
First of all, I was unable to post from Live Writer because of an error I kept getting. The error seemed to be related to XML-RPC support, which, of course, I had enabled.
After a bit of testing, I found out that the “WordPress.com Stats” plugin caused this problem. Reading the plugin's release notes carefully, I noticed it had only been tested up to WordPress version 2.8, not 2.8.2.
After the problem above was fixed, I noticed another issue. Although posting from Live Writer now went error-free, HTML tags in the posts I published became incomplete: the < and > characters were stripped from the HTML code. This resulted in incomplete or distorted posts, and it also happened when editing existing posts in Live Writer.
In the organisation I work for, we have several locations throughout the country, with several VMware ESX Servers running at each one, which we have divided into different Datacenters in our VMware Infrastructure.
So for instance you can also use the following line to list all Snapshots in a specific Datacenter:
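A PowerCLI one-liner along these lines does the trick (the Datacenter name “Datacenter01” is just a placeholder for one of your own, and it assumes you are already connected with Connect-VIServer):

```powershell
# List all snapshots of all VMs in a specific Datacenter
# ("Datacenter01" is a placeholder for your own Datacenter name)
Get-Datacenter "Datacenter01" | Get-VM | Get-Snapshot |
    Select-Object VM, Name, Created, SizeMB
```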
When using NFS for your datastores on your SAN (e.g. NetApp) and you resize a datastore on the fly, it always takes a while before the ESX/vSphere servers refresh the datastore and show the new size of the volume.
This can be annoying when you want to create a VM on one of these volumes and ESX still thinks the volume is too small because it hasn't refreshed its datastores yet. Normally I would pick the ESX server where I want to put the VM and refresh the NFS volume I want to use:
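In PowerCLI, that manual refresh could look something like this (the hostname “esx01” is a placeholder, and a Connect-VIServer session is assumed); it grabs the host's storage system through the vSphere API and asks it to refresh its storage information:

```powershell
# Refresh the storage information of a single ESX host
# ("esx01" is a placeholder for the host you want to put the VM on)
$hostView = Get-VMHost "esx01" | Get-View
$storageSystem = Get-View $hostView.ConfigManager.StorageSystem
$storageSystem.RefreshStorageSystem()
```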
Of course I wanted to make it easier for myself, so I wrote a little PowerShell script to refresh the datastores on all ESX Servers in our cluster. I think it's also possible to turn this into a one-liner; I'll try to create one and post it if I succeed.
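A sketch of such a script, assuming the cluster is called “Cluster01” (adjust the name to your own environment): it simply loops over every host in the cluster and performs the same refresh on each.

```powershell
# Refresh the datastores on all ESX servers in a cluster
# ("Cluster01" is a placeholder for your own cluster name;
#  assumes an existing Connect-VIServer session)
foreach ($esxHost in Get-Cluster "Cluster01" | Get-VMHost) {
    $hostView = $esxHost | Get-View
    $storageSystem = Get-View $hostView.ConfigManager.StorageSystem
    $storageSystem.RefreshStorageSystem()
}
```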
This “vulnerability” is somewhat unavoidable. The tool keeps opening simple HTTP connections, which, even if the webserver limits the maximum number of connections, is really hard to stop, as the quote from the Slowloris website explains pretty clearly.
Slowloris holds connections open by sending partial HTTP requests. It continues to send subsequent headers at regular intervals to keep the sockets from closing. In this way webservers can be quickly tied up. In particular, servers that have threading will tend to be vulnerable, by virtue of the fact that they attempt to limit the amount of threading they’ll allow. Slowloris must wait for all the sockets to become available before it’s successful at consuming them, so if it’s a high traffic website, it may take a while for the site to free up it’s sockets. So while you may be unable to see the website from your vantage point, others may still be able to see it until all sockets are freed by them and consumed by Slowloris. This is because other users of the system must finish their requests before the sockets become available for Slowloris to consume. If others re-initiate their connections in that brief time-period they’ll still be able to see the site. So it’s a bit of a race condition, but one that Slowloris will eventually always win – and sooner than later.
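To make the mechanism from the quote concrete, here is a sketch of what such a partial request looks like (the hostname and the “X-a” header name are purely illustrative); the key point is that the blank line that normally terminates an HTTP request is never sent:

```powershell
# Illustration only: the shape of a Slowloris-style partial request.
# A complete HTTP request ends with an empty line ("`r`n`r`n");
# Slowloris sends the headers but never that terminating blank line,
# then trickles out one bogus header at regular intervals so the
# server thread holding the connection never times out.
$partialRequest = "GET / HTTP/1.1`r`n" +
                  "Host: www.example.com`r`n" +   # illustrative hostname
                  "User-Agent: Mozilla/4.0`r`n"   # note: no closing blank line

# Sent every few seconds to reset the server's receive timeout:
$keepAliveHeader = "X-a: b`r`n"
```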
I'm not sure there is anything you can do to prevent an attack with this tool; as far as I know, there is no solution that fully protects against it.
As you might understand, I advise you not to use this tool against public websites, but only for educational purposes, to see whether you can find a way to protect yourself from it.
Also, attacks performed with this tool can be traced back to you, because the HTTP connections the tool creates are always initiated from your own source IP address.
Here's a useful tip for anyone who enjoys the handy debugger in the PowerGUI Script Editor during their intensive PowerShell scripting activities.
The most useful feature while debugging is the “Variables” window in the editor.
It allows you to see all the variables/objects that you declared and look inside them to see what data they contain.
The only problem is that when you work on a script, making a lot of changes in the process, the Variables window keeps filling up with all the variables used in the script, even though some of them are no longer used at all.
This causes a mess in the Variables window so you kind of lose track of the things that actually matter and are worth keeping an eye on. So, I went looking in the Options (menu Tools > Options) and I found an option that resets the debug/variables windows (Runspace) every time you start a new debugging session.
Simply select the “Reset PowerShell runspace each time debugging is started.” option, press OK and you will have a nice and tidy runspace (variables/debug window) from now on.
Some time ago we agreed on putting a minimum of 15 Virtual Machines on each NFS volume on our NetApp storage. Now, almost a year later, we have about 14 NFS volumes, and the spread of the VMs across the volumes has become a bit uneven, mostly because we didn't have an easy overview of how many VMs are present on each volume, or because VMs needed to be created on short notice.
Since I've been busy with PowerShell, I decided to create a nice script/utility to create this overview and to increase our insight into our NFS volumes. At first I made a console script, but soon, while discovering the possibilities of PowerShell, I created a GUI for it using PrimalForms.
The source is no rocket science, but I found the utility very handy for monitoring our VMware storage environment. If you intend to use this source code, don't forget to adjust the VirtualCenter hostname and access credentials. Also, the names of our datastores all start with “VMWare_NFSxx”, and the script checks whether the datastore name starts with that; you might want to adjust this to your own datastore naming convention.
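The core of the console version boils down to something like this (the “VMWare_NFS” prefix matches our naming convention, so swap in your own; a Connect-VIServer session is assumed):

```powershell
# Count the VMs on each NFS datastore whose name matches our convention
Get-Datastore | Where-Object { $_.Name -like "VMWare_NFS*" } |
    Select-Object Name,
        @{ Name = "VMCount"; Expression = { (Get-VM -Datastore $_).Count } } |
    Sort-Object Name
```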
Welcome to my first post on my own blog page. This finally gives me the opportunity to write a bit more and add a bit more imagery than I can on just Twitter. Now that being said, let's get to what this post is about.
Microsoft finally released the first Beta of Visual Studio 2010. Now, I'm not a full-time developer, but I enjoy developing a thing or two every now and then. Besides that, I really love to see what's new in new versions.
So I downloaded the ISO image from Microsoft's MSDN and started installing it on my Windows 7 RC installation. The installation is no different from 2008, except that it starts by installing the Beta of the .NET 4.0 Framework, which I suppose should offer a few more features than 3.5 did.
After the installation, and the inevitable reboot, I ran Visual Studio 2010 and configured the environment for Visual C# usage, simply because that's the language I'll be developing in.