12/11/10

Authentication Issues with WebHttpBinding in WCF 3.5

If you plan to expose your WCF services via HTTP requests instead of SOAP messages, you will need to use WebHttpBinding to configure your endpoints. When you do this and host your services in IIS 7.0, you might get the following error:

IIS specified authentication schemes 'IntegratedWindowsAuthentication, Anonymous', but the binding only supports specification of exactly one authentication scheme. Valid authentication schemes are Digest, Negotiate, NTLM, Basic, or Anonymous. Change the IIS settings so that only a single authentication scheme is used.

This happens if your services are inside a web application that is configured in IIS to use both Anonymous and Integrated Windows Authentication. To get around this issue, assuming all your services live in one folder under your application root, select that folder in IIS, open the Authentication settings, and disable one of the two authentication schemes, depending on your need. This should fix the problem.
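
For reference, this is roughly what the binding side looks like when you pin it to a single credential type in code. This is only a minimal sketch to illustrate why the binding expects exactly one scheme; the contract name is made up, and the actual fix is the IIS setting described above.

using System.ServiceModel;
using System.ServiceModel.Web;

[ServiceContract]
public interface IOrderService   // hypothetical contract, for illustration only
{
    [OperationContract]
    [WebGet(UriTemplate = "orders/{id}")]
    string GetOrder(string id);
}

public static class BindingFactory
{
    public static WebHttpBinding CreateBinding()
    {
        // The binding accepts exactly one client credential type,
        // which is why IIS must also offer exactly one matching scheme.
        var binding = new WebHttpBinding();
        binding.Security.Mode = WebHttpSecurityMode.TransportCredentialOnly;
        binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Windows;
        return binding;
    }
}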

11/14/10

SQL Server 2005 Full-Text Search: Data population failure

After creating the Full-Text Catalogs and Indexes on the SQL Server, I ran a script to populate the indexes. Since population takes a while, especially if the database is large, I decided to check on the status periodically via the FullTextCatalogProperty function in SQL Server. 

One of the properties returned by the function is PopulateStatus, which has one of the following values:
0 = Idle
1 = Full population in progress
2 = Paused
3 = Throttled
4 = Recovering
5 = Shutdown
6 = Incremental population in progress
7 = Building index
8 = Disk is full. Paused.
9 = Change tracking

In my case it was showing 1, meaning the population was in progress. After a couple of hours it was still showing 1 and just stayed there. When I checked the ItemCount property returned by the same function, it was showing 0, meaning no items had been indexed so far. These were clear signs that the data population process was stuck.
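
A quick way to keep an eye on both properties is to poll them, either with a simple SELECT in Management Studio or from a small C# snippet like the sketch below. The connection string and catalog name are placeholders, not the ones from my project.

using System;
using System.Data.SqlClient;

class FullTextStatusCheck
{
    static void Main()
    {
        // Placeholder connection string and catalog name
        const string connectionString = "Data Source=.;Initial Catalog=MyDatabase;Integrated Security=True";

        const string sql =
            "SELECT FULLTEXTCATALOGPROPERTY('MyCatalog', 'PopulateStatus') AS PopulateStatus, " +
            "       FULLTEXTCATALOGPROPERTY('MyCatalog', 'ItemCount') AS ItemCount";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                if (reader.Read())
                {
                    // 1 = full population in progress; ItemCount should keep climbing if all is well
                    Console.WriteLine("PopulateStatus: {0}, ItemCount: {1}",
                        reader["PopulateStatus"], reader["ItemCount"]);
                }
            }
        }
    }
}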
 
I looked at the scripts, the tables, the catalogs, and the indexes, but didn't find anything amiss. I then checked the logs in the SQL install folder and noticed the following error related to Full-Text Search:
 
"No Mapping between account names and security id's were done"
 
I had no idea where this error was coming from and it took me some time before I hit upon the idea of checking the Full-Text Search Windows Service. It turned out that the Service was running under a user account Admin123 that no longer existed on the machine. I changed the Service to run under the Local System Account and everything started working!!!

10/31/10

System.Threading.Timer

I have a data population task inside a Windows service that needs to start on a certain date and repeat periodically at a specified interval. The start date and the repeat interval can be specified through an XML config file.

I decided to use the Timer class in the System.Threading namespace. It has a number of constructor overloads, and the following is the one I used:

public Timer(TimerCallback callback, Object state, TimeSpan dueTime, TimeSpan period);

where:

callback -> The method that performs the task.

It must match the System.Threading.TimerCallback delegate type, which has the following signature:

delegate void TimerCallback(Object state)

state -> data that you can pass to the callback method each time it is invoked.

dueTime -> the amount of time the CLR needs to wait before calling the callback method the first time.

period -> the repeat interval

The advantage of using this Timer class is that the callback method is executed on a CLR thread pool thread. The thread pool uses a single thread to keep track of all Timer instances; when a timer comes due, that thread queues the callback to the thread pool, so the work itself runs on pool threads.
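
Here is a minimal sketch of how the pieces fit together. The class and method names are illustrative, and in the real service the start date and interval come from the XML config file rather than being hard-coded.

using System;
using System.Threading;

class PopulationScheduler
{
    private Timer _timer;   // keep a reference so the timer is not garbage collected

    public void Start(DateTime startDate, TimeSpan interval)
    {
        // dueTime: how long to wait before the first callback
        TimeSpan dueTime = startDate - DateTime.Now;
        if (dueTime < TimeSpan.Zero)
            dueTime = TimeSpan.Zero;   // start immediately if the start date has already passed

        _timer = new Timer(PopulateData, null, dueTime, interval);
    }

    // Matches the TimerCallback delegate: void (Object state)
    private void PopulateData(object state)
    {
        Console.WriteLine("Running data population at {0}", DateTime.Now);
    }
}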



(Reference: CLR via C# by Jeffrey Richter)

10/24/10

Full-Text Search in SQL Server 2005

Full-Text Search is a SQL Server feature that provides powerful options for querying text information in the database. Without this feature, the querying options are pretty much limited to using the LIKE clause or functions such as PATINDEX and CHARINDEX.

Full-Text Search is implemented via a Full-Text Search Windows Service. This service is NOT a part of SQL Server itself; SQL Server interacts with this external service in order to provide Full-Text Search.

Following are the steps to enable Full-Text Search in a database:

  • Make a list of tables that need to be Full-Text search enabled.
  • In each of the tables, make a list of all the columns that need to be a part of the Full-Text Search index.
  • Create a Catalog, specifying a location for its creation. A Catalog is a container that holds Indexes. You can have as many Catalogs as you like, and each Catalog can hold zero or more Indexes. Though external to SQL Server, the Catalogs need to be on the same machine as the SQL Server instance. For performance reasons, Microsoft recommends creating the Catalogs on a drive separate from the one holding the SQL Server installation. Microsoft also recommends creating separate Catalogs for tables that have more than one million rows of data.
  • Create one Index per table. You can have only one Index per table. You also need to specify the Catalog the Index belongs to; an Index can belong to only one Catalog.
  • At this point, we only have the structure (Catalogs and Indexes) but no data. We need to populate (crawl) the Indexes. Population is the process of SQL Server passing data from the indexed columns to the Full-Text Service, which maintains a word list telling us which words can be found in which rows. It is a resource- and memory-intensive operation that should be performed during off-peak hours so as to not slow down the SQL Server. Once the population is complete, the tables are Full-Text Search enabled.
  • Next, all the database objects like stored procedures, functions, and views need to be modified to use the Full-Text syntax. The Full-Text syntax contains predicates like CONTAINS, CONTAINSTABLE, FREETEXT, FREETEXTTABLE, etc. that enable us to query for information in different ways (a small example of running a CONTAINS query from C# is sketched after this list).
  • Full-Text indexes are not kept up to date automatically like regular SQL Server indexes. So, if new data is added to the tables, that data will not be searchable until the indexes on those tables are re-populated.
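
To make the query-syntax point concrete, here is a rough sketch of running a CONTAINS query from C# with ADO.NET. The connection string, table, and column names are placeholders, not the actual schema from my project.

using System;
using System.Data.SqlClient;

class FullTextQueryExample
{
    static void Main()
    {
        // Placeholder connection string, table, and column names
        const string connectionString = "Data Source=.;Initial Catalog=MyDatabase;Integrated Security=True";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT DocumentId, Title FROM Documents WHERE CONTAINS(Body, @searchTerm)", connection))
        {
            // CONTAINS accepts a search condition; here we look for an exact word
            command.Parameters.AddWithValue("@searchTerm", "\"soccer\"");

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine("{0}: {1}", reader["DocumentId"], reader["Title"]);
            }
        }
    }
}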

10/23/10

Some useful debugging tools

TCP TRACE

I needed to grab and analyze an HTTP web request that was being made programmatically from a Windows application. My friend Chandra suggested that I look at the TCP Trace tool. It is a really cool tool that acts as a tunnel between a client and a server.

This is how it works. When you start it, it pops up a configuration dialog.

Let us assume the Windows application was making an HTTP web request to an ASP.NET web application on port 80, and we need this request to be grabbed by the trace. In that dialog, set the listening port of the trace to 80 so that it can catch the request. The request ultimately needs to be passed on to the web application, which used to listen on port 80, so we change the application's port to another available port, say 8080. We then point the trace's destination port to 8080 on the same server.

After you make these changes and click OK, the trace is ready to grab the requests coming in on port 80. Now when the Windows application makes a web request programmatically using the HttpWebRequest class, the trace catches it. The trace has a visual interface (like Fiddler) that enables you to look at the request in detail.
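
For completeness, the programmatic request looks roughly like the sketch below; anything sent to port 80 on that machine now flows through the trace. The URL is a placeholder.

using System;
using System.IO;
using System.Net;

class RequestExample
{
    static void Main()
    {
        // Placeholder URL; port 80 is where the trace is now listening
        var request = (HttpWebRequest)WebRequest.Create("http://localhost:80/MyWebApp/Default.aspx");
        request.Method = "GET";

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            // While this executes, the trace window shows the raw request and response
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}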

You can download it here.


DEPENDENCY WALKER

I was once integrating a third-party application into my existing application. My application started blowing up at the point of integration, complaining about some missing functionality. I had all the necessary third-party assemblies and was not able to make sense of the error messages. It then struck me that maybe the third-party assemblies were missing some dependency assemblies. I then came across a tool called Dependency Walker, which visually lists all the dependencies for a specified assembly. When I loaded the third-party assembly using this tool, it showed that I was missing two dependency assemblies. Once I got those assemblies, the application worked just fine.

You can download or find more information about Dependency Walker here.

9/9/10

String replace using Regex

I recently ran into a situation where I had to highlight a certain word in a paragraph of text. Let us say the word is "Soccer". All I was trying to do was replace all occurrences of "Soccer" with "<b>Soccer</b>" so that the word "Soccer" would get displayed in bold.

I initially thought of using the Replace method of the string class, but then quickly realized that the Replace method is case-sensitive, so words like "sOccer" or "SOCCER" or "soccER" were not going to be replaced.

After some research on the web, I came across the Replace method of the Regex class (namespace System.Text.RegularExpressions), which has the following signature:

public string Replace(string input, MatchEvaluator evaluator)

where input is the paragraph of text in which I wish to replace the word "Soccer".

The MatchEvaluator is a delegate with the following signature:

public delegate string MatchEvaluator(Match match)

The code is as follows:

string TextToSearch = "The Soccer world cup was held in South Africa. There were 25 SOccer teams from all over the world. It was a real treat for SOCCER fans all around the world";

//pattern to search is "Soccer"
Regex objRegex = new Regex("Soccer", RegexOptions.IgnoreCase);
 
string HighlightedText = objRegex.Replace(TextToSearch, new MatchEvaluator(HighlightWord));

/*The Replace method calls the HighlightWord method every time it finds a match for the word "Soccer" in TextToSearch, and the matching is NOT case-sensitive. The input parameter to this method is an instance of the Match class that represents a match for the word "Soccer". The value returned from this method is then used to replace the matched word in the original string.*/

/*The end result "HighlightedText" is the same as the original text "TextToSearch", but with all the occurrences of the word "Soccer" wrapped in <b> tags and therefore highlighted.*/

The method HighlightWord is as shown below:


//This method returns the matched word wrapped in <b> tags so it displays in bold
public string HighlightWord(Match objMatch)
{
   string WordToHighlight = objMatch.ToString();
   return string.Format("<b>{0}</b>", WordToHighlight);
}

8/22/10

Instant Cookies

Look at the two methods shown below:

private void ModifyCookies()
{
    HttpCookie objCookie = Request.Cookies["Cowboy"];
    if (objCookie.Values.Get("position") == null)
        objCookie.Values.Add("position", "Receiver");
    Response.Cookies.Add(objCookie);
    SetPlayerProperties();
}


private void SetPlayerProperties()
{
    HttpCookie objCookie = Request.Cookies["Cowboy"];
    Player objPlayer = new Player();
    objPlayer.Position = objCookie.Values.Get("position");
}

In the first method, ModifyCookies(), we get the cookie from the HttpRequest cookies collection. Let us assume the cookie was as follows:
//Cowboy = FirstName=Miles&LastName=Austin&Location=Dallas&Age=25

We then added another key/value pair to the cookie, and the modified cookie now looks like:
Cowboy = FirstName=Miles&LastName=Austin&Location=Dallas&Age=25&position=Receiver

We then add the cookie to the HttpResponse cookies collection.

We then call the second method SetPlayerProperties() in which we access the new key that we have added to the cookie above.

NOTE: At this point the response has not been sent to the client yet, but we are still able to access the new key from the Request.Cookies collection.

The point to remember is (reference: Microsoft MSDN):

After you add a cookie to the HttpResponse.Cookies collection, the cookie is immediately available in the HttpRequest.Cookies collection, even if the response has not been sent to the client.

8/15/10

When denormalization helps

I had developed a web page that displayed the log of all the invoices in a given project. There were about ten columns in the log, each one displaying a different piece of information about the invoice. These invoices were created against contracts, and there is a one-to-many relationship between a contract and its invoices.

Two of those columns, "WorkCompletedToDate" and "MaterialPurchasedToDate", represented the amount of work done and material purchased (in dollars) prior to the current invoice on a given contract. So if we were looking at invoice No. 3 on a contract X, "WorkCompletedToDate" and "MaterialPurchasedToDate" represent the sum of the values for those fields from invoice No. 1 and invoice No. 2 on the same contract X. The values in these two columns were not stored in the database and were always being calculated at runtime.

There were thousands of invoices in the project, and the runtime calculations for these two columns began to slow down the web page. I tried to optimize the stored procedure, the functions, and other code relating to the log, but nothing seemed to help.

I then hit upon the idea of denormalizing the invoices table by adding two columns to store these values instead of calculating them at runtime. This also required a script to populate these values for all the existing invoices in the database. After I made these changes, the page began to load much faster.

Though denormalization might not be the best approach in a lot of scenarios and must be used with caution, it does come in handy in some situations, and is definitely an option that must be kept open, especially when faced with performance issues.

8/10/10

The State Design Pattern

Some time ago I was given the task of designing a module for creating and managing work orders for one of our clients. For those of you wondering what a work order is, let me give you an example.

Let's say you live in an apartment and your air conditioning system is not working. What do you do? You immediately call up the apartment office to report it. So what you are doing is initiating a work order process. The office assistant takes your request and creates a work order for repairing the A/C. This work order is then sent to a technician, who sends an estimate to the assistant. The assistant or the assistant's supervisor approves the cost estimate and the technician gets to work. The technician completes the job and reports back to the assistant. The assistant then closes the work order (after confirming with you, obviously). This is a typical work order workflow.

I had to develop a user interface to facilitate this entire process. I decided to browse through the GoF design patterns to see if I could find some pattern that would serve the purpose. I obviously ruled out the creational and structural patterns since I was not looking along those lines. So I started looking at the behavioral patterns wherein I came across the State Design Pattern. After carefully studying the pattern I decided that I was going to go with it.

Why the State Design Pattern?
The work order process goes through different stages, and the behavior changes at every step. It is the same work order that appears to take on different forms at different stages. We can use a single object and keep altering its underlying behavior with every step in the workflow. We can tie the user interface to the underlying behavioral changes so that, as the flow changes, the UI will also change accordingly.

The UML is the standard State pattern structure: the UI talks to a State Manager (the context), the State Manager holds a reference to an abstract State class, and there is one concrete State subclass per stage of the work order.

How it works:
  • The UI interacts with the State Manager, which is the object that changes its underlying behavior.
  • The State Manager changes its underlying behavior through an instance of the abstract State class, which happens to be one of its member fields.
  • This instance points to a different concrete instance of type State at different stages of the work order process. As this happens, it appears as if the State Manager is changing its underlying behavior.
  • The UI also changes from step to step depending on the properties of the State Manager.

 Code:

public class WorkOrderUI : System.Web.UI.Page
{
    StateManager objStateManager;

    protected void Page_Load(object sender, EventArgs e)
    {
        objStateManager = new StateManager();
        DisplayUI();
    }

    /*render the UI depending on the state manager's underlying state*/
    private void DisplayUI()
    {
        // btnApprove is a Button control declared in the .aspx markup
        if (objStateManager.IsApproved)
            btnApprove.Visible = false;   // example: hide the Approve button once approved
    }

    /*save handler; calls the state manager, which will change its behavior as a result of this save*/
    protected void Save(object sender, EventArgs e)
    {
        objStateManager.SaveState();
    }
}

public class StateManager
{
    State objState;

    public StateManager()
    {
        objState = GetState();
    }

    private State GetState()
    {
        /*depending on various properties this method returns the current state*/
        return new WorkOrderRequest();   // example: the request stage
    }

    public void SaveState()
    {
        //delegates the save to the current state
        objState.SaveData();
    }
}

public abstract class State
{
    public abstract State GetData();
    public abstract void SaveData();
}

public class WorkOrderRequest : State
{
    public override State GetData()
    {
        //implement functionality specific to this state
        return this;   // placeholder
    }

    public override void SaveData()
    {
        //implement the save specific to this state
    }
}

Other State classes are implemented in a similar manner.

8/9/10

Pros and Cons of using XML data type in SQL Server 2005

Recently I developed a web-based tool using ASP.NET and C# that enables users to create custom forms in a matter of minutes. Every form can have any number of controls like textboxes, radio buttons, checkboxes, labels, etc. Since the form size can vary from, say, 5 fields to 20 fields to 50 fields, there was no way I could create a flat table structure to store this data. So I decided to use the XML data type in SQL Server 2005 to store the form field configuration.


The rendering part of the tool parses this XML configuration (using LINQ to XML) and renders the form. Let us assume that this form is going to be used for survey purposes. The survey users can fill in the data and complete the survey. All this works just fine and the survey is successfully completed.
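
To give a flavor of the parsing, here is a rough sketch of reading a form configuration with LINQ to XML. The element and attribute names are made up for illustration; the real configuration schema was more involved.

using System;
using System.Linq;
using System.Xml.Linq;

class FormConfigParser
{
    static void Main()
    {
        // Hypothetical form configuration; the real schema had more settings per field
        string configXml =
            @"<form name='Survey1'>
                <field type='TextBox'  id='FullName'  label='Full Name' />
                <field type='CheckBox' id='Subscribe' label='Subscribe to newsletter' />
              </form>";

        XElement form = XElement.Parse(configXml);

        // Project each <field> element into the data needed to render a control
        var fields = from field in form.Elements("field")
                     select new
                     {
                         Type = (string)field.Attribute("type"),
                         Id = (string)field.Attribute("id"),
                         Label = (string)field.Attribute("label")
                     };

        foreach (var field in fields)
            Console.WriteLine("{0} ({1}): {2}", field.Id, field.Type, field.Label);
    }
}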


Then comes the reporting (SSRS). The management wants to see the results grouped and sorted in a multitude of ways. SQL Server does not have much of an API for querying XML columns, so I had to use whatever was available (the query, value, exist, modify, and nodes methods of the xml data type) along with some SQL cursors to come up with the desired results in the desired format.


The problem with this approach was that this parsing (which was happening at runtime) took time, and the time taken was directly proportional to the number of survey instances. So as the data grew, the reporting became slower and slower. While XML was a very convenient way to store variable-length data, it was not very easy to work with for reporting or analysis purposes.


I managed to solve the reporting issue by creating a flat table structure for storing the survey data (since the configuration is constant once the form is created). I then created a SQL Agent job that runs as a scheduled task during off-peak hours and performs the time-consuming work of parsing the instance XML data and populating the flat table. I then pointed the reports to the flat table. This sped up the reporting process.


So while the XML data type is suitable for storing variable-length data, developers need to note that it is not very easy to work with, given the limited XML API in SQL Server. One way to solve the problem is to get the data out of SQL and use the powerful .NET API to get the desired result. The other is to create a SQL Agent job (as described above) to pre-populate flat tables and then work with those tables. But all said and done, the XML data type is a very powerful feature and provides tremendous flexibility. I would definitely recommend it.