Saturday, December 10, 2011

Browser cookies can also be created this way...

While inspecting some public websites I came across an interesting server-side-plus-JavaScript way of creating cookies. This must be an old technique, but nevertheless it was quite new to me.

The server-side code is the easy part, because I assume most people will have written it at some point in their software development careers. If not, just take a look at the following CodeProject article: Beginners guide to ASP.NET Cookies

On the JavaScript side, I saw code something like this:

var img = new Image();
img.src = 'http://yourdomain.com/CreateCookie.aspx';

That's it. When executed, this bit of JavaScript will create the cookie, provided the URL in the above example returns a response that sets the cookie (via a Set-Cookie header). The image object isn't even added to the DOM, yet the cookie is created and associated with the page!

The main advantage is that you avoid the hassle of creating cookies via JavaScript, especially cookies which belong to other domains (different from your website). In the websites we were supporting, the main rationale was reuse and maintainability. The team put similar code in a separate js file and added it to a page manually; when they didn't want it, they'd manually remove it and edit the page. They never needed to change any server-side code to add or remove those cookies.
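A minimal sketch of the pattern follows. The endpoint and its query parameters are hypothetical (the original CreateCookie.aspx took none); any URL whose response carries a Set-Cookie header will do:

```javascript
// Build the pixel URL; the endpoint and query parameters are illustrative.
function buildCookiePixelUrl(endpoint, name, value) {
  return endpoint + '?name=' + encodeURIComponent(name) +
         '&value=' + encodeURIComponent(value);
}

// Browser-only: the GET request fires as soon as src is assigned, and the
// browser stores any cookie the response sets. The image is never added
// to the DOM.
function dropCookiePixel(url) {
  var img = new Image();
  img.src = url;
  return img; // keep a reference so the request isn't garbage-collected early
}

if (typeof Image !== 'undefined') {
  dropCookiePixel(
    buildCookiePixelUrl('http://yourdomain.com/CreateCookie.aspx', 'visited', '1'));
}
```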

Thursday, November 17, 2011

A JavaScript-based menu system

I have been working on a JavaScript-based drop-down menu system. It's kind of like the menu system in your desktop word processor or browser. Functionality-wise, it's much more primitive than its desktop counterparts.

The original inspiration came from a web application which had the look of a Windows Forms application - the same form surface, buttons, etc., and of course even the menus to a degree. It was being used at my client's end. It is a very old application, probably dating back to the time when DHTML was hip. The problem with it, obviously, was that it contained very obsolete markup. That sort of markup only works on old versions of IE (< 5.0), and the damn thing wouldn't render on any other browser.

But I really liked the effort put into the development of the various web pages. It was developed by a different vendor; we only had to maintain the application - and, as usual, we had no idea who that vendor was or what they had done.


I am not going to explain everything in detail. The code is simple, and the design is pretty straightforward. I am just embedding the example here:
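Since the original embedded example isn't reproduced here, below is a minimal sketch of the one piece of state a drop-down menu system needs - which menu, if any, is currently open. The function names are my own, not the original code's:

```javascript
// Create the state for a menu bar: a list of menu titles plus the index
// of the currently open menu (-1 means everything is closed).
function createMenuState(items) {
  return { items: items, openIndex: -1 };
}

// Clicking an open menu closes it; clicking another menu switches to it.
function toggleMenu(state, index) {
  state.openIndex = (state.openIndex === index) ? -1 : index;
  return state;
}
```

In the browser you would call toggleMenu from each menu header's onclick handler and show or hide the corresponding list of items based on openIndex.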

Saturday, October 22, 2011

Managing your application settings in .NET programs

Let's say you are writing code for your data layer. Ideally you'd be creating all the types this layer has to deal with and, most importantly, defining how these types get instantiated with data from your database. Hence there is always a connection string involved. But have you noticed where the connection string is stored?

For n-tier applications we mostly build the data layer as a dll (a class library project) and then reference it from the presentation tier. During development you would have used the dataset designer (or a similar designer), which manages the connection string for you. If you are using dataset designers, the connection string quietly becomes an application setting. All of this is stored in your class project's app.config.

When the dll is added to another project, say an ASP.NET application, the dll will actually look for its configuration in the ASP.NET application's web.config. But it's not there. One thing to note: when you add the dll reference, Visual Studio doesn't copy its settings and merge them into the target application's config.

A strange fact to make note of again: the dll may not have its corresponding configuration, but this doesn't mean the application settings are completely absent. When we build the dll, the default values of the settings at that instant are compiled into the dll itself. Hence if we try to read the connection string, we might get the string which belonged to the test db server. In production, though, we'd still want to change the setting's value at some point or another.

This is quite possible through the target application's config file. In the sample solution shared below I have created a class library project which has one application setting, plus a simple class which reads the setting and returns it. ConsoleProjectA outputs this value directly; you'll find it displays the same string we set while making the class library project. In the second console application project, the same code produces a different value. If you study that project's app.config you'll understand how to override the application settings of the dll.
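For illustration, the override in the second project's app.config looks something like the fragment below. The sectionGroup boilerplate is what Visual Studio generates; the namespace ClassLibrary1 and the setting name ConnectionString are assumptions and must match your own library:

```xml
<configuration>
  <configSections>
    <sectionGroup name="applicationSettings"
        type="System.Configuration.ApplicationSettingsGroup, System">
      <!-- The section name must be the library's Settings class full name. -->
      <section name="ClassLibrary1.Properties.Settings"
          type="System.Configuration.ClientSettingsSection, System" />
    </sectionGroup>
  </configSections>
  <applicationSettings>
    <ClassLibrary1.Properties.Settings>
      <!-- This value overrides the default compiled into the dll. -->
      <setting name="ConnectionString" serializeAs="String">
        <value>Server=PRODDB;Database=Sales;Integrated Security=True</value>
      </setting>
    </ClassLibrary1.Properties.Settings>
  </applicationSettings>
</configuration>
```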

The zip file which has the sample solution which demonstrates this concept can be accessed here.



Monday, August 15, 2011

Writing your own Twitter share button

This is a simple exercise thanks to Twitter web intents. All you need is a browser that supports JavaScript.

The basic url looks like

https://twitter.com/intent/tweet?text=<your_text>&url=<url_of_page>

The rest is up to JavaScript. For the text parameter I simply assign the title of the page, and for the url parameter I simply assign window.location.href. Of course I have to properly escape both parameters into the URL. Feed the complete URL to a call to window.open(). Wrap all of this in an anonymous function and execute it immediately - in the browser's address bar!
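Put together, the whole thing is only a few lines. The helper name below is mine; the intent URL itself comes from the Twitter web intents documentation:

```javascript
// Build the tweet intent URL with both parameters properly escaped.
function buildTweetUrl(text, pageUrl) {
  return 'https://twitter.com/intent/tweet' +
         '?text=' + encodeURIComponent(text) +
         '&url=' + encodeURIComponent(pageUrl);
}

// Browser-only: open the share dialog for the current page.
if (typeof window !== 'undefined') {
  window.open(buildTweetUrl(document.title, window.location.href));
}
```

To run it from the address bar, wrap the body in an anonymous function prefixed with javascript: - i.e. javascript:(function(){ ... })(); - and save it as a bookmark.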

You can save this script as a bookmark and access it anytime. In fact, you can drag the link below to your favorites bar (on IE) or bookmarks bar (on Firefox or Google Chrome).

Share page

There are a host of other parameters for the various other ways in which you can tweet, but I am stopping short here. You can explore the documentation to learn more.

PS: These things won't work inside an IFRAME.

Monday, July 4, 2011

Getting headers from a web service

When we add a web service reference to a project, Visual Studio automatically generates proxy classes for us. This was something very unfortunate for me: the generated proxy gives no access to header information, so I can't see any headers sent from the server to the client. I had the idea of attaching the machine name (of the web server) to the response and reading it from the client (my ASP.NET application). This would have been a simple task if the web service were exposed as a script service and called via AJAX.

The problem at hand

The generated proxy classes derive from the SoapHttpClientProtocol class. The derived (web service) class has a lot of functionality built in, which saves us the headache of communicating with XML web services. But the web service class doesn't expose any way to obtain the response header information.

Solution

Fortunately you can derive from the web service class and obtain a reference to the WebResponse object, which encapsulates the response of the asmx web method executed over the network. All you have to do is inherit the web service class, override the GetWebResponse method, and make use of it in code.
Imports WebSvcHeaders.TimeService
Imports System.Net

Public Class TimeSvcProxy
    Inherits SimpleWebService

    ' Holds the response of the most recent web method call.
    Private _WebResponse As WebResponse

    ' Called by the base class for every request; capture the response here.
    Protected Overrides Function GetWebResponse(ByVal request As System.Net.WebRequest) As System.Net.WebResponse
        _WebResponse = MyBase.GetWebResponse(request)
        Return _WebResponse
    End Function

    ' Read a header from the last response, e.g. GetResponseHeader("Server").
    Function GetResponseHeader(ByVal HeaderKey As String) As String
        Return _WebResponse.Headers(HeaderKey)
    End Function

End Class

Why is this useful?

You can tag diagnostic information in the response headers, read it from the client, and log it on the client itself. This can be a major help when analyzing performance problems with ASP.NET websites deployed in a web farm. We don't know which server in the farm will service a given web method call; tagging, for example, the machine's name in the response headers and being able to read it from the client tells me which server to investigate.

I suppose you can do something similar with WCF services too, but I don't know whether it can be done along exactly the same lines as in this post.

Tuesday, April 26, 2011

Hosting from home: Part 2

This post is making a comeback after a long time. It is a followup of this post, where I experimented with hosting my webserver over the internet. Here I am going to explain what I did to serve a page off my computer and access it from a laptop running Windows Vista. This exercise gave me a rough idea of how web hosting providers work. I hosted a page off my Apache server - Apache was the web server of my choice, though it can be complicated to configure. You have to take care of a few things:
  1. You have to provide a server name configuration for your website:
    This is done via the ServerName directive. When a request comes in with a Host header matching this name, Apache knows which website it belongs to and can process it accordingly
  2. Apache should be listening on a particular port:
    A web server listens on port 80 by default. This is where HTTP traffic enters/exits the system. If you use any other port (8080 is a common alternative) you have to mention it explicitly in the URL you type into the browser
  3. Make note of the IP address:
    Your router will assign an IP address unique to your machine on the local network. It is this IP address you are going to make available to the other machine
That said, let's go to the Windows machine. All you have to do is add an entry to your hosts file (%SystemRoot%\System32\drivers\etc\hosts). Each entry has the form: [IP address] [web site domain name]. E.g. 192.168.2.4 deostroll.media.com. Once this is done and the file is saved, you can start up Internet Explorer, type in the website page using the domain name you've entered above, and you are good. I'll leave you to figure out how websites over the internet work.
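The Apache side of the steps above can be sketched as a couple of httpd.conf directives. The server name, port, and document root below are illustrative:

```apache
# Listen on a non-default port (remember to include :8080 in the URL).
Listen 8080

# Tell Apache which site a request with Host: deostroll.media.com belongs to.
<VirtualHost *:8080>
    ServerName deostroll.media.com
    DocumentRoot "C:/www/mysite"
</VirtualHost>
```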