Monday, August 9, 2010

Faking data from log files

Our C# server team is behind schedule, and without server data it is rather hard to get much front-end work in place. I had been putting together little fake files, then opening them and reading strings to simulate the server call. Just as I was about to do that again, I thought about how stupid it was to open the log file, tear out a snippet of data, and copy it into a temp file again and again. Why not just read directly from the log file?

With our system you can turn on logging. It logs each request, with its parameters, and the server's response in plain ASCII text. You don't get the binary data for pages and thumbnails, but that is such a small part of the server calls I am dealing with right now that I can work around it. I had also written our log parser, which helps you diagnose a log file for problems, so I had code to steal.
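The post doesn't show the actual log format, but the idea is easy to sketch. Here is a minimal Java illustration, assuming a hypothetical plain-text layout where each request line carries a message tag followed by key=value parameters (the tag `OpenImage` and the parameter names are made up for the example):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LogLineParser {
    // Hypothetical line format (the real log's layout differs):
    //   REQUEST <tag> key1=val1 key2=val2 ...
    public static Map<String, String> parseParams(String line) {
        Map<String, String> params = new LinkedHashMap<>();
        // Split on whitespace and keep every token that looks like key=value.
        for (String token : line.trim().split("\\s+")) {
            int eq = token.indexOf('=');
            if (eq > 0) {
                params.put(token.substring(0, eq), token.substring(eq + 1));
            }
        }
        return params;
    }

    public static void main(String[] args) {
        Map<String, String> p = parseParams("REQUEST OpenImage docId=42 page=1");
        System.out.println(p); // {docId=42, page=1}
    }
}
```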

I wrote a Server Log Data provider that lets you "send" a message to the server by rolling through the log file looking for the same message tag with the same parameter list; it then returns you the data recorded in the file. It works really slick and is allowing me to get various singleton caches in place with some real data.
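As a sketch of that lookup, here is a small Java version under the same assumed format as above: scan for a `REQUEST` line whose tag and parameters match, and hand back the `RESPONSE` line that follows it. The class name, log layout, and tag are illustrative, not the actual provider:

```java
import java.util.List;
import java.util.Map;

public class ServerLogDataProvider {
    private final List<String> logLines;

    public ServerLogDataProvider(List<String> logLines) {
        this.logLines = logLines;
    }

    // Find the recorded call with the same tag and parameter values,
    // then return the response captured right after it in the log.
    public String send(String tag, Map<String, String> params) {
        for (int i = 0; i < logLines.size() - 1; i++) {
            String line = logLines.get(i);
            if (line.startsWith("REQUEST " + tag) && paramsMatch(line, params)) {
                return logLines.get(i + 1).replaceFirst("^RESPONSE ", "");
            }
        }
        return null; // no matching call was recorded
    }

    private boolean paramsMatch(String line, Map<String, String> params) {
        for (Map.Entry<String, String> e : params.entrySet()) {
            if (!line.contains(e.getKey() + "=" + e.getValue())) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        ServerLogDataProvider provider = new ServerLogDataProvider(List.of(
                "REQUEST OpenImage docId=42 page=1",
                "RESPONSE width=800;height=600"));
        System.out.println(provider.send("OpenImage", Map.of("docId", "42")));
    }
}
```

Because the provider only matches on tag and parameters, the front-end code can issue the calls in any order, not just the order they were logged in.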

You have to use our main client and perform all the steps you want written to the log file, such as opening an image and selecting annotations. You can then replay that session from your C# code by making the same server calls, in any order.

I say replay, but it is not really a replay of the log file like some testing tools do. I did not want that, since I have not yet set up the code to handle all the messages. Plus, this is a request/response system, so just shoving data at the application does not make much sense.

I have even found some redundant-data issues caused by our server. I am adding key=value pairs to a C# Dictionary collection, and you can't add duplicates. The collections we use in Java and C++ just replace the value and don't really care if you use a duplicate key; C# throws an exception. This issue has been reported to the server team. If they fix it, it will save bandwidth, so this tool is already starting to pay for itself.
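The semantic difference that surfaced the bug, shown from the Java side (the key name is made up for the example): `HashMap.put` silently replaces the value on a duplicate key, while C#'s `Dictionary<TKey, TValue>.Add` throws an `ArgumentException` when the key already exists (its indexer, `dict["key"] = value`, replaces like Java does).

```java
import java.util.HashMap;
import java.util.Map;

public class DuplicateKeyDemo {
    public static void main(String[] args) {
        Map<String, String> map = new HashMap<>();
        map.put("docId", "42");
        map.put("docId", "99"); // Java quietly replaces the old value
        System.out.println(map.get("docId")); // prints "99"
        // In C#, dict.Add("docId", "99") after an earlier Add with the
        // same key throws ArgumentException, which is what exposed the
        // duplicate pairs the server was sending.
    }
}
```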
