I was working with load tests in Visual Studio 2005 and wanted to create a load test that reads random pages of the web site. The model used in VS2005 load tests is scenario based, which basically means each user goes through a fixed sequence of URLs from start to end. This was not what I wanted, as this is an information system, not a process-based system like a webshop. An information system has endless usage scenarios. I wanted to create a web test where the user opens some initial pages and can go from there to a random page on the web site.

Mining the pages
First of all, I needed a set of links to use for my test. Xenu's Link Sleuth is a broken-link checker, which works great for my purpose. It basically crawls the site and lets you export a report containing all URLs.

Excluding unwanted entries
The Xenu report contains working, broken, and external links. I wanted to exclude the broken and external links, and I also wanted to remove information from the report that I had no use for. This could be done directly from code in VS2005, but I didn't want to include code for that in my test. Instead I used some Cygwin command line tools:
cat xenu_report.txt | grep -v "not found" | grep -v "skip external" | grep -v "timeout" |
grep -v "no object data" | grep -v "connection aborted" | grep -v "invalid response" |
grep "http://" | sed -r "s/([^ \t]+)\tok.*/\1/g" > urls.txt
This command removes all the unwanted information and all the broken, timed-out, or external links. I wanted to test my web application, not my web server, so I decided to also remove all CSS, JavaScript, images, and documents (Word, PDF) from the link list:
cat urls.txt | grep -v -E "\.(css|jpg|jpeg|gif|js|jar|pdf|ico)" > weblinks.txt
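To sanity-check the two filter stages, you can run them against a tiny hand-made report. The tab-separated "URL, tab, status" layout is assumed from the sed pattern above, the sample URLs and file names are made up, and the first stage is trimmed to only the filters the sample data needs:

```shell
# Build a toy Xenu-style report: URL, tab, status (made-up sample data)
printf 'http://example.com/page1\tok\t200\n'      >  sample_report.txt
printf 'http://example.com/gone\tnot found\n'     >> sample_report.txt
printf 'http://elsewhere.com/x\tskip external\n'  >> sample_report.txt
printf 'http://example.com/style.css\tok\t200\n'  >> sample_report.txt

# Stage 1: drop broken/external entries, keep only the URL column
cat sample_report.txt | grep -v "not found" | grep -v "skip external" \
  | grep "http://" | sed -r "s/([^ \t]+)\tok.*/\1/g" > sample_urls.txt

# Stage 2: drop static resources
cat sample_urls.txt | grep -v -E "\.(css|jpg|jpeg|gif|js|jar|pdf|ico)" > sample_links.txt

cat sample_links.txt   # only http://example.com/page1 should remain
```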
Coding a VS2005 web test
To create a coded web test, either add a class file to the solution, or click Generate Code on a working web test. The code below shows my test class. Notice the nice C# 2.0 "yield return" used for the enumerator. Now that's a neat language feature.
namespace LoadTestProject
{
    using System;
    using System.Collections.Generic;
    using System.IO;
    using Microsoft.VisualStudio.TestTools.WebTesting;

    public class CustomTest : WebTest
    {
        private static List<string> urlList;
        private Random rnd;

        public CustomTest()
        {
            this.PreAuthenticate = false;
            rnd = new Random();
            if (urlList == null)
            {
                urlList = new List<string>();
                using (StreamReader fileReader = new StreamReader("weblinks.txt"))
                {
                    while (!fileReader.EndOfStream)
                    {
                        string line = fileReader.ReadLine();
                        if (line.Length > 0)
                        {
                            urlList.Add(line);
                        }
                    }
                }
            }
        }

        public override IEnumerator<WebTestRequest> GetRequestEnumerator()
        {
            int numRequests = rnd.Next(3, 20); // How many requests should each user make?
            for (int i = 0; i < numRequests; i++)
            {
                int page = i;
                if (i > 2)
                {
                    // After visiting the first 3 pages, go to a random page
                    page = rnd.Next(0, urlList.Count);
                }
                WebTestRequest request = new WebTestRequest(urlList[page]);
                request.ThinkTime = rnd.Next(40, 80); // How long the user thinks before clicking a new link
                request.ParseDependentRequests = false; // Do not download css and images
                request.FollowRedirects = false;
                yield return request;
            }
        }
    }
}

Create a load test
Create a load test, set CustomTest as your scenario's Test Mix, and you're done.
timvw
Have you tried MS 'Web Application Stress Tool' already?
Erlend

Re:

Yes, and I'm not very happy about its tendency to display error messages like "Failed to load log file due to an unhandled error". I also don't like the fact that you can't import a plain text list of URLs. I tried converting the list to an IIS log format using some command line tools, but ended up with the same "Failed to load..." error message.
Noel

Monitoring

I have never tried such a method of monitoring. Thank you for the information. I have been using <a href="http://www.dotcom-monitor.com/">Dotcom-monitor</a> and <a href="http://www.site24x7.net">site24x7</a>.
BigBen

Repeatable?

...a test should always be repeatable; randomness makes it much harder to debug.

It is hard to assert anything from this test.
Erlend

Re: BigBen

Creating a repeatable load test is very hard to do, in my opinion. The random element is introduced because users will not visit the site in the same order as Xenu did. Also, if we always load test in the same order, we may never discover the real problem. If you want to debug, you can just replay the request logs from the IIS server to get the exact same order.