
Sunday, April 4, 2010

LINQ to Entities unnecessary casts

Using LINQ with Entity Framework can give a big productivity boost: it removes the burden of O/R mapping and of writing SQL queries by hand. But adding this layer can come with costs.
I was writing a desktop application that uses a SQL CE database, and I used LINQ to Entities to talk to it.
One of the methods was slow, and I suspected the cause was a slow query. With normal, hand-written SQL I would inspect the query plan to find the reason for bad performance, so I started logging the queries generated by my LINQ statements using:

(myQuery as ObjectQuery).ToTraceString()
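
In code, the logging boiled down to something like the sketch below (the context, entity set, and variable names are made up for illustration):

// Rough sketch only - "BooksEntities", "Pages", and "pageNumber" are hypothetical names.
// ObjectQuery lives in the System.Data.Objects namespace.
using (var context = new BooksEntities())
{
    short pageNumber = 5;
    var myQuery = from p in context.Pages
                  where p.PageNumber == pageNumber
                  select p;

    // Cast back to ObjectQuery to see the SQL that will be sent to the provider.
    string sql = (myQuery as ObjectQuery).ToTraceString();
    Console.WriteLine(sql);   // or append it to the application's log file
}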

Then I took the SQL query from the log file into SSMS and started making some enhancements. One of them was adding an index on a smallint column used in a foreign key. But I discovered that the index was not used by SQL CE. That was strange, but looking at the generated WHERE clause I found something like:

WHERE ( CAST( [Extent1].[PageNumber] AS int)) = ( CAST( @p__linq__468 AS int))

And indeed the index on the PageNumber column (smallint) was not used: the execution plan showed that the filtering was done after a table scan rather than an index seek.
I suspected that the cast in the query was what prevented the optimizer from recognizing that the index should be used, and I verified this by removing the cast and running the query in SSMS again. This time the index was used.
I started to Google about this issue, but only found that full SQL Server is smart enough not to be tricked by the silly cast. Since I was using SQL CE, I had to find another way to fix it.
Removing the unnecessary cast was out of my hands, so I decided to change the column's data type to int, sacrificing some space to gain some speed. I did that, updated the model, and the WHERE clause became:

WHERE [Extent1].[PageNumber] = ( CAST( @p__linq__468 AS int))

No cast on the PageNumber column, and the index is used.
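
For reference, the schema side of the story amounted to something like the following (table and index names are hypothetical, and depending on the engine the index may need to be dropped and recreated around the type change):

-- Hypothetical names, for illustration only.
CREATE INDEX IX_Pages_PageNumber ON Pages (PageNumber);  -- the index SQL CE kept ignoring
ALTER TABLE Pages ALTER COLUMN PageNumber int;           -- the type change that made it usable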

When you use a high-level technology, you take it with its pros and cons. But there are no excuses: you are responsible for what you ship to your client.

Thursday, April 1, 2010

Splitting csv file based on content in one line using PowerShell

Problem:
You have a CSV file that contains department employees in a format like this:
Department,Employee
Sales,emp1
HR,emp2
Sales,emp3
Finance,emp4
Finance,emp5
Security,emp6
Security,emp7
Security,emp8
HR,emp9
You need to split this file's contents into separate files based on department name. For the above example, we should get four files, Sales.csv, HR.csv, Finance.csv, and Security.csv, each containing only its own department's employees.
The solution really shows the power of PowerShell pipelining:

Import-Csv file.csv | Group-Object -Property "department" | Foreach-Object {$path=$_.name+".csv" ; $_.group | Export-Csv -Path $path -NoTypeInformation}

Dissecting the above commands:
Import-Csv file.csv:
Parses the csv file and returns an array of objects.

| Group-Object -Property "department":
Since we need to split by department, it makes sense to group objects by the department property.

| Foreach-Object {...}:
We need to apply an action to each group (department), so we pipe the resulting groups to Foreach-Object.

$path=$_.name+".csv":
Within the foreach, we create a temporary variable ($path) to be passed to the next pipeline stage, which does the actual saving. Note that I use the semicolon ";" to separate this part from the next, and that I use the Name property of the group (which maps to the department name in our case) to build the file name.

$_.group | Export-Csv -Path $path -NoTypeInformation:
Then, for each group, we export its contents (the original CSV rows) to the file path created in the previous step by piping the Group property of the group object (an ArrayList of the original objects) to the Export-Csv cmdlet.
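
For readability, the same pipeline can also be spread over multiple lines, for example:

Import-Csv file.csv |
    Group-Object -Property "Department" |
    ForEach-Object {
        $path = $_.Name + ".csv"                           # group name = department name
        $_.Group | Export-Csv -Path $path -NoTypeInformation
    }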

And the result should be files like:
Finance.csv:
"Department","Employee"
"Finance","emp4"
"Finance","emp5"

Mounting file systems on Ubuntu

I'm not an experienced Linux user, but I use Ubuntu (after trying Red Hat and Fedora) since it's supposed to be "Linux for human beings".
I have both Vista (yes :() and Ubuntu on the same machine, and I regularly need to access the NTFS file system holding my Windows files when using Ubuntu.
When I open the file system from Nautilus, I get an authentication prompt and have to enter the root password.

This is very annoying. Although the password prompt can be suppressed (that is another story), the real issue is that the file system is not automatically mounted when the system starts up. This means that when using an application that needs to access the drive (VirtualBox, for example), I have to open the drive from Nautilus first.
So the solution is to edit the /etc/fstab file, which contains the information needed to mount volumes on startup.
That's fine: I opened a terminal, ran sudo -i to work as root, then made a folder to mount the volume under: mkdir /media/DriveName
Then I opened /etc/fstab in gedit and added:

/dev/sda1 /media/DriveName ntfs rw,nosuid,nodev,allow_other,default_permissions,blksize=4096 0 0
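
(As a side note, the new entry can be tested without rebooting by asking mount to process everything in fstab:)

sudo mount -a    # mounts every file system listed in /etc/fstab that isn't mounted yet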

I restarted, and I could access the file system without being prompted for a password. Then I started to test how I could access the file system (I'm a good developer and I do test my work). Everything looked OK, but when I tried to delete a file:
Cannot move file to trash, do you want to delete immediately?
I checked the permissions and found that root was the owner and the group that has access, and I couldn't write to .Trash-1000 (similar to $RECYCLE.BIN in Windows).
The solution was to go back to fstab and add my username as the owner:

/dev/sda1 /media/DriveName ntfs rw,nosuid,nodev,allow_other,default_permissions,blksize=4096,uid=username 0 0
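
(If the mount options expect a numeric user id rather than a name, it can be looked up first; "username" below is just a placeholder:)

id -u username    # prints the numeric uid, which can then be used as uid=<number>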

And finally, I'm happy !!

Monday, March 8, 2010

Changing AJAXtoolkit:CalendarExtender day format

The AJAX Control Toolkit has a set of awesome controls, and they are very customizable too. But one thing that is not customizable out of the box is how the day names are displayed in the CalendarExtender.
The displayed format is two letters only. But what if you want three-letter day names?
Here is a small cosmetic surgery (i.e. code changes).










[Screenshots: Before / After]



So, what does it take to do it successfully?
I opened CalendarBehavior.debug.js, which contains the debug version of the extender's JavaScript. Day names were obtained from an array named dtf.ShortestDayNames:

dayCell.appendChild(document.createTextNode(dtf.ShortestDayNames[(i + firstDayOfWeek) % 7]));

Searching the library code, I found that this array is defined in MicrosoftAjax.debug.js as:

"ShortestDayNames":["Su","Mo","Tu","We","Th","Fr","Sa"]

Fortunately, another array was defined there with three-letter names:

"AbbreviatedDayNames":["Sun","Mon","Tue","Wed","Thu","Fri","Sat"]

So all it takes is replacing ShortestDayNames with AbbreviatedDayNames in both CalendarBehavior.debug.js and CalendarBehavior.js:

dayCell.appendChild(document.createTextNode(dtf.AbbreviatedDayNames[(i + firstDayOfWeek) % 7]));

And build.
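
(If rebuilding the toolkit is not an option, an untested alternative idea is to overwrite the array from page script, since both arrays live on the current culture's date-time format object in the Microsoft AJAX library:)

// Untested sketch: run after MicrosoftAjax.js has loaded and before the calendar renders.
var dtf = Sys.CultureInfo.CurrentCulture.dateTimeFormat;
dtf.ShortestDayNames = dtf.AbbreviatedDayNames;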

It's great to have the source in your hands :)

Saturday, February 6, 2010

My Twitter account

I'm not a big fan of social networking sites, but I think Twitter can be a great information-sharing platform. Follow me; I don't intend to make much noise anyway :)

Friday, January 29, 2010

Tip: turn Auto Close off for your SQL Server databases

On a development machine, I had high CPU usage from the SQL Server service. To understand the reason, I looked at the SQL Server logs (from SQL Server Management Studio - SQL Server Logs).


As you may notice from the screenshot above, the activity of starting the reporting services database kept recurring again and again.
I checked the database options and noticed that the Auto Close property was set to True. This explains why SQL Server tries to start the database on every connection.
If the Auto Close property is set to on, then, as quoted from MSDN, "the database is closed and shut down cleanly when the last user of the database exits and all processes in the database complete."
Unfortunately, the property was set to True when the database was created during the SQL Server installation.
Turning Auto Close to False (off), and perhaps restarting the service, returned the CPU usage of the SQL Server service to a normal level.
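
The setting can be changed from the database properties dialog, or with T-SQL along these lines (the database name here is just an example):

-- Example only: substitute your actual database name.
ALTER DATABASE [ReportServer] SET AUTO_CLOSE OFF;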

Friday, January 1, 2010

Articles I read in 2009

A new year, and as is my habit, I share a list of the articles I read in the past year, 2009.
It was a tough year on the personal, career, and study levels. I hope I can contribute more to the community in the new year through blogging, article writing, and open source software.

The 2008 list can be found here
The 2007 list can be found here