Thursday, August 31, 2006

How to Cloak Page Content from Search Engine Spiders

Cloaking is the term used to describe showing different content depending on who your visitor is. Don't confuse cloaking with hidden text: hidden text is present in the page but concealed from view, while cloaked content is never even loaded into the page for certain visitors.

There are some legitimate uses of cloaking, like disabling features that might not be cross browser compatible, or showing different news headlines based on the geo-location of the visitor to your web site.

The most renowned use of cloaking, however, is in attempts to manipulate a web site's ranking in search engine results by showing the search engine spiders one thing (usually highly optimized keyword content) and showing human visitors something else. Webmasters might add or withhold content in the hopes that their site will rank better in organic results.

Here's a fairly simple way to cloak content so it gets seen by search engine spiders but not by human visitors. You might, for example, place this snippet at the bottom of the page. Or you can see my post on hiding web site content and put the cloaked content in a div that is styled to keep it hidden from view:
<%
Dim strUA, vShowIt

'grab the visitor's User Agent string, lower-cased for easy matching
strUA = LCase(Request.ServerVariables("HTTP_USER_AGENT"))

'default to 1 (human visitor); any bot match below flips it to 0
vShowIt = 1
If InStr(strUA,"google") <> 0 Then vShowIt = 0
If InStr(strUA,"msnbot") <> 0 Then vShowIt = 0
If InStr(strUA,"yahoo") <> 0 Then vShowIt = 0
If InStr(strUA,"inktomi") <> 0 Then vShowIt = 0
If InStr(strUA,"snapbot") <> 0 Then vShowIt = 0
If InStr(strUA,"irlbot") <> 0 Then vShowIt = 0
If InStr(strUA,"turnitinbot") <> 0 Then vShowIt = 0
If InStr(strUA,"cjnetwork") <> 0 Then vShowIt = 0
If InStr(strUA,"myfamilybot") <> 0 Then vShowIt = 0
If InStr(strUA,"geniebot") <> 0 Then vShowIt = 0
If InStr(strUA,"wiki") <> 0 Then vShowIt = 0

'vShowIt = 0 means the User Agent matched a bot, so serve the cloaked content
If vShowIt = 0 Then
%>
Cloaked Content Here
<%
End If
%>
So if vShowIt = 0, that means the visitor was one of the bots you checked for, and you want to show the content. You could reverse it as well: by checking whether vShowIt = 1, you could show content to humans but hide it from search engine spiders. This could be useful for putting links on a page that you want human visitors to see, but not search engines.
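
For example, the reversed check might look like this (using the same vShowIt variable set by the snippet above):

<%
If vShowIt = 1 Then
%>
Links or content for human visitors only - search engine spiders never receive this
<%
End If
%>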

A more advanced way of cloaking is to cloak based on IP address. There are services you can subscribe to that will provide you with updated lists of all the IPs search engine spiders originate from. Cloaking based on those IPs is more reliable than simply checking the User Agent name, since the User Agent string is trivial to fake.
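
Here's a minimal sketch of the IP approach, assuming you maintain your own list of spider IPs (the addresses below are documentation placeholders, not real spider IPs):

<%
Dim strIP, arrBotIPs, i, vShowIt

'the visitor's IP address
strIP = Request.ServerVariables("REMOTE_ADDR")

'placeholder list - a real implementation would load a subscribed, regularly updated list
arrBotIPs = Array("192.0.2.1", "192.0.2.2", "192.0.2.3")

'default to human; flip to bot if the IP matches
vShowIt = 1
For i = 0 To UBound(arrBotIPs)
If strIP = arrBotIPs(i) Then vShowIt = 0
Next
%>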

Now you're probably asking, "How can someone detect cloaked content?"

The search engine sees the cloaked content, so if the webmaster didn't hide it as well, you can see the cloaked content when you view a cached snapshot of the page. This is why some webmasters use the "noarchive" robots meta tag to keep search engines from showing a cached copy of the page.
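
For reference, that tag goes in the head section of the page and looks like this:

<meta name="robots" content="noarchive">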

Another way is to use a browser that lets you impersonate a different User Agent. WANNABrowser lets you do just that. Type Google or MSN or Yahoo in the HTTP User Agent field and then enter the URL of the page you want to view.

Cloaking had its heyday when page content was the main driver of search engine rank. Although it can still be useful, for both legitimate and shady purposes, it's not nearly the tool it used to be for driving major ranking changes in search engine results.

Wednesday, August 30, 2006

Concatenating Dates in SQL

I was working on a database where the original developer had stored dates in an Integer typed column in the format of YYYYMMDD, so today would look like 20060830.

This means whenever I want to filter or do something by date, I need to convert to that format. No biggie.

But I needed to write a SQL statement for a stored procedure that would go through the records in one of the tables and set an Active flag to 0 (off) for every record where the Date_End was earlier than today's date.

In SQL Server you get the current date with GetDate(). You can break out the Year, Month and Day respectively by using Year(GetDate()), Month(GetDate()), and Day(GetDate()). I found out when I tried to concatenate these that, since each function returns an integer, they end up being added together. So:
Year(GetDate())+Month(GetDate())+Day(GetDate())
Ended up as 2044 (2006 + 8 + 30).

In order to join them (not add them) I had to convert them to character strings. But that brought up another issue: if the month or the day was less than 10, it ended up as a single digit. I needed it to be preceded by a 0 if it was less than 10.

My Stored procedure ended up looking like:
UPDATE Table
SET Active = 0
WHERE
Date_End <
CAST(Year(getDate()) AS varchar(4))+
RIGHT('0' + CAST(Month(getDate()) AS varchar(3)), 2)+
RIGHT('0' + CAST(Day(getDate()) AS varchar(3)), 2)
Broken down, here's how it works:

1. CAST the Year as a 4 character varchar

2. Put a 0 in front of the month (CAST as a varchar first), then grab the rightmost 2 characters

3. Put a 0 in front of the day (CAST as a varchar first), then grab the rightmost 2 characters
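
For what it's worth, SQL Server's CONVERT function with style 112 (the ISO yyyymmdd format) builds the same YYYYMMDD string in a single call, which would shorten the WHERE clause:

UPDATE Table
SET Active = 0
WHERE
Date_End < CONVERT(varchar(8), GetDate(), 112)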

I set the stored procedure to execute every night just after midnight (server time).

Scraping Page Content from a Remote Site in ASP

Scraping page content (called consuming when you do it to an RSS feed) is a way of getting content from a site other than your own and then displaying it on your own site (typically as your own content). Since the scraping is performed by the server, not the client, it has the appearance - to both site visitors and search engine spiders alike - of having originated from YOUR site, not the one you scraped from.

Microsoft's suite of XML DOM (Document Object Model) components includes the XMLHTTP object. This object was originally designed to provide client-side access to XML documents on remote servers through the HTTP protocol. It exposes a simple API which allows you to send requests (even POSTs) and get the resultant XML, HTML or binary data.

The code below shows how to return the HTML of a remote URL. Note there are two options for instantiating the object: one uses Microsoft.XMLHTTP and the other MSXML2.ServerXMLHTTP. The second is the newer, server-oriented version of the object. Try the newer one first, and if your host doesn't support it, fall back to the older one.

Since this function returns the served content (what the server renders, not the source script), you can use it for any type of page: asp, aspx, php, html, htm... etc. Built as a function, you could go with something like:

strURL = "http://www.TheSite.com/ThePage.aspx"
strRemoteContent = GetRemoteContent(URL)

Function GetRemoteContent(TheURL)
'create an instance of the MS XMLHTTP component.
'this is the old, client-side version; use the newer one if you can
'Set xmlObj = Server.CreateObject("Microsoft.XMLHTTP")

'newer, server-side version - better
Set xmlObj = Server.CreateObject("MSXML2.ServerXMLHTTP")

'Open the connection and send the request
'Set the optional Async parameter to True
'Otherwise, the waitForResponse method used below will have no effect
xmlObj.Open "GET", TheURL, True
Call xmlObj.Send()

'Turn off error handling
On Error Resume Next

'Wait for up to 3 seconds if we've not gotten the data yet
If xmlObj.readyState <> 4 Then xmlObj.waitForResponse 3
'Did an error occur? If so, use a default value for our data
If Err.Number <> 0 Then
GetRemoteContent = "There was an error retrieving the remote page"
Else
'If we reach here, we know the server responded
'To accommodate unexpected behaviors, ensure the
'readyState property equals 4
'and the Status property, which returns the HTTP Response status,
'equals 200
If (xmlObj.readyState <> 4) Or (xmlObj.Status <> 200) Then
'Abort the request
xmlObj.Abort
GetRemoteContent = "Problem communicating with remote server..."
Else
GetRemoteContent = xmlObj.ResponseText
End If
End If
End Function
Now the thing you have to remember is that this brings back the entire page, so if there's content on the page you want to use, you should parse it out. At a minimum, you probably only want the content between the opening and closing BODY tags. You could create a function to pull that out and display it:

response.write(ParseContent(strRemoteContent))

Function ParseContent(TheContent)
'note: this assumes a plain <body> tag with no attributes
intStart = InStr(LCase(TheContent), "<body>") + 6
intEnd = InStr(LCase(TheContent), "</body>")
intLength = intEnd - intStart
ParseContent = Mid(TheContent, intStart, intLength)
End Function
You can do more parsing inside the function or extract other content by using the various string functions of VBScript. You could have different functions that pull out different sections of the remote page for display.
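
For instance, here's a hypothetical helper along the same lines that pulls the page title out of the scraped HTML with InStr and Mid:

Function ParseTitle(TheContent)
Dim intStart, intEnd
intStart = InStr(LCase(TheContent), "<title>") + 7
intEnd = InStr(LCase(TheContent), "</title>")
'only return something if both tags were actually found
If intStart > 7 And intEnd > intStart Then
ParseTitle = Mid(TheContent, intStart, intEnd - intStart)
Else
ParseTitle = ""
End If
End Function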

Although page scraping might seem to be more for content stuffing or other less than savory ends, there are many legitimate uses as well: grab the newest news headlines or stock prices, do a price check, see if a page has been updated.... Combined with an XSLT style sheet, it can be used to pull in RSS feeds too.
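
A rough sketch of the RSS idea, assuming you have an XSL style sheet on hand (the feed URL and style sheet name here are placeholders):

<%
'load the remote feed; ServerHTTPRequest lets DOMDocument fetch over HTTP from server-side code
Set xmlDoc = Server.CreateObject("MSXML2.DOMDocument")
xmlDoc.async = False
xmlDoc.setProperty "ServerHTTPRequest", True
xmlDoc.Load "http://www.TheSite.com/rss.xml"

'load the local XSL style sheet
Set xslDoc = Server.CreateObject("MSXML2.DOMDocument")
xslDoc.async = False
xslDoc.Load Server.MapPath("rss.xsl")

'transform the feed into HTML and write it to the page
Response.Write xmlDoc.transformNode(xslDoc)
%>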

Be aware that this only works on Windows servers and that some hosting companies disable the XMLHTTP object. If you're going to build a site that makes heavy use of it, check with your hosting company first to make sure they have it enabled.

Tuesday, August 29, 2006

Generating a Random Number in ASP

Here's a function to generate a Random Number. Simply feed it the top end value, so to generate a random number from 1 - 10, you call it with:

RandomNumber(10)

Function RandomNumber(intHighestNumber)
Randomize
'use Int, not Round - rounding could push the result one past the top end
RandomNumber = Int(Rnd * intHighestNumber) + 1
End Function

What if you want to generate a random number between two other numbers, like between 20 and 40?

RandomNumber(20,40)

Function RandomNumber(LowNumber, HighNumber)
Randomize
'again, Int rather than Round keeps the result within the range
RandomNumber = Int((HighNumber - LowNumber + 1) * Rnd) + LowNumber
End Function

SQL Server INSTR Equivalent

Access Database queries support the use of INSTR to search inside a datafield for a series of characters - INSTR(DataField, String):

INSTR(DataField, 'y')

In the above example, if the value in DataField was 'yellow', the returned value would be a 1, as 'y' is the first character of the datafield. Using the same value (yellow), INSTR(DataField, 'o') would return a 5.

INSTR, however, doesn't work in SQL Server. SQL Server does, however, have an equivalent - CHARINDEX - the only difference being that the order of the parameters is reversed - CHARINDEX(String, DataField):

CHARINDEX('y', DataField)
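
So a substring filter ports across like this (TableName and DataField are placeholder names):

-- Access:
SELECT * FROM TableName WHERE INSTR(DataField, 'yellow') > 0

-- SQL Server:
SELECT * FROM TableName WHERE CHARINDEX('yellow', DataField) > 0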

Monday, August 28, 2006

Returning the Length of a Text or nText Field

In SQL Server, if you try len(FieldName) to return the number of characters in a field and the field is of type Text or nText, you get an error:

Argument data type text is invalid for argument 1 of len function

The answer to this is the DataLength() function, which will return the length of any expression. This can be used on all data types including text, ntext, image and varbinary.

It returns the actual number of bytes in the field. Keep in mind that ntext stores two bytes per character, so divide by 2 to get the character count.
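
A quick illustration (the table and field names are placeholders):

SELECT DATALENGTH(TextField) AS LengthInBytes,
DATALENGTH(NTextField) / 2 AS LengthInCharacters
FROM TableName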

Retrieving a Random Record from a Database

I was working on one of my new sites over the weekend and needed to retrieve a random record from one of the tables in my database. A few minutes of searching revealed several solutions.

Here's the SQL Server 2000 solution I ended up going with. Since ORDER BY NEWID() just sorts the rows on a freshly generated GUID, it works regardless of what you use for the record ID:

SELECT TOP 1 *
FROM YourTable
ORDER BY NEWID()


I found one to use if you're using IDENTITY as the unique identifier for the record ID, but I didn't test it since I'm using an auto-incrementing integer:

SELECT TOP 1 *
FROM YourTable
ORDER BY RAND((1000*IDColumn)*DATEPART(millisecond, GETDATE()))


If you're using Access, here's one (although it's not nearly so elegant). This one requires some VBScript to generate the random number seed:

<%
Randomize()
randNum = (CInt(1000 * Rnd) + 1) * -1

set conn = CreateObject("ADODB.Connection")

sql = "SELECT TOP 1 cols," & _
"r = Rnd(" & randNum & ")" & _
"FROM TableName " & _
"ORDER BY r"

set rs = conn.execute(sql)

response.write rs(0)

' ...
rs.close: set rs = nothing
conn.close: set conn = nothing
%>

SEO and the Title Tag

The Title Tag not only communicates the theme of your web page to human visitors but is also considered very important by the search engine crawlers. The Title Tag is the most important of all tags. Almost all crawler-based search engines use the Title Tag to gather information about the page, weighing it alongside the page's content, inbound links, outbound links, alt tags, and a host of other factors when evaluating relevance. A carefully constructed Title Tag can have a large positive impact on your page's ranking with the search engines.

In addition, the Title Tag is often the hyperlinked text title that is displayed in the search engine results page. This is the hyperlink a user clicks on to go to your web site. The Title Tag is also used as the text when you 'bookmark' a page or add a certain web page to your 'favorites' list in your browser.

Since the Title Tag plays a vital role in your site's ranking, you need to pay a lot of attention to the words that appear in the Title Tag and the order in which they appear. Put your important keywords at the beginning of the Title Tag. This can have the added benefit of making those words appear in bold in the search engines result pages. Develop a crisp Title Tag that includes your most relevant keyword phrases for that page. The keywords in the Title Tag are given a high value when it comes to the search engine trying to figure out what your page is about.

It's important to be highly focused. Use the same keywords not just in your Title Tag, but also in your page content, Meta Description, and Meta Keywords Tags. If a keyword doesn't actually appear in the page content, avoid using it in the Title Tag.

Specific Resources

From Google's Guidelines for Webmasters:

Design and Content Guidelines:

  • Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.

  • Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.

  • Create a useful, information-rich site, and write pages that clearly and accurately describe your content.

  • Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.

  • Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images.

  • Make sure that your TITLE and ALT tags are descriptive and accurate.

  • Check for broken links and correct HTML.

  • If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.

  • Keep the links on a given page to a reasonable number (fewer than 100).



From MSN Webmaster Help

About your site description

As the MSN Search web crawler MSNBot crawls your site, it analyzes the content on indexed pages and generates keywords to associate with each page. Then MSNBot extracts page content that is highly relevant to the keywords (often sentence segments that contain keywords or information in the description meta tag) and constructs the site description displayed in search results. The page title and URL are also extracted and displayed in search results.

From Yahoo Search Help

Yahoo! Search ranks results according to their relevance to a particular query by analyzing the web page text, title and description accuracy as well as its source, associated links, and other unique document characteristics.


What Is a Title Tag?

The title tag is one of the most important factors in achieving high search engine rankings.

A title tag is essentially an HTML code snippet that creates the words that appear in the top bar of your Web browser.

The title tag belongs in the <head> section of your source code, and is generally followed by your Meta description and Meta keywords tags. The order of these tags is not critical, so don't worry if your HTML editor places them in a different position.
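
Laid out in source, the arrangement looks something like this (the text values are placeholders):

<head>
<title>Your Most Important Keywords Here</title>
<meta name="description" content="A crisp description of this page">
<meta name="keywords" content="keyword1, keyword2, keyword3">
</head>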

Some Web site design tools and content management systems (CMS) automatically generate the title tag from information you provide. You may have noticed Web pages that are labeled "Page 1," "Page 2," or "Home Page" in the browser title bar. You'll often see titles like these being used by beginning Web site designers who simply don't know how to use their software or their title tag for maximum benefit.

Sunday, August 27, 2006

Alternating Row Colors

When displaying tabular data, it's often useful to alternate row colors to help differentiate the different rows.

So you've opened your recordset and have retrieved an unknown number of rows, and now you want to display them. The following ASP code will build a table with alternating row colors:

<table>
<%
'set your counter to 0
i = 0

'loop through the record set
Do While Not rs.EOF
'increment the counter; I always do this 1st; no reason though
i = i + 1

'set background color to nothing
vBGColor = "white"

'check your counter; if alternate row set background color to gray
If i mod 2 = 1 then vBGColor = "gray"

'write the row, td and data
response.write("<tr bgcolor=""" & vBGColor & """>")
response.write("<td>")
response.write(rs("YourColumn")) 'your data goes here - column name is a placeholder
response.write("</td>")
response.write("</tr>")

'move to the next record
rs.movenext
Loop
%>
</table>
You could easily use more than 2 colors by changing and adding to the mod statement:
vBGColor="white"
If i mod 4 = 1 then vBGColor = "gray"
If i mod 4 = 2 then vBGColor = "yellow"
If i mod 4 = 2 then vBGColor = "beige"
You can use hex codes for the colors - you don't have to use names.

Here's a handy color chart with both names and hex codes.

Hiding Page Content

Suppose a circumstance arises where - for whatever reason - you want to hide page content from human users but not search engine spiders.

The following code uses CSS to create two divs and then puts one div behind the other by using z-index. Human visitors to your site see the top div; search engine spiders see both. Be warned: users surfing in Firefox with CSS turned off will see all the content as well.


<html>
<head>
<style type="text/css">
body
{
margin: 0px 0px 0px 0px;
}

#absdiv1
{
z-index: 5;
position: absolute;
width: 100%;
height: 75px;
background-color: #ffffff;
}

#reldiv2
{
z-index: 1;
position: relative;
width: 300px;
height: 75px;
}
</style>
</head>

<body>
<TABLE cellpadding=5 cellspacing=0 width="100%">
<TR>
<TD>
<div id="absdiv1">
This could be the div with your header stuff in it,
or your logo in the upper left
</div>

<div id="reldiv2">
You can put all kinds of stuff in this div and no
one can see it; links, keywords, even images
</div>
</TD>
</TR>
</TABLE>
</body>
</html>

Friday, August 25, 2006

Redirects in ASP

Redirecting a user to a new web page or web site is a common task many webmasters will need to accomplish sooner or later. Some redirects work better than others, however, when it comes to maintaining search engine position.

I don't recommend using JavaScript or Meta Tag refresh redirects, as you can't send a 301 status code with either of these methods. JavaScript and Meta Tag redirects have also long been exploited by less than honest webmasters trying to game the system. Instead, use a server-side redirect programmed in ASP.

Most ASP programmers are familiar with:

<%
strURL = "http://www.Google.com"
Response.Redirect(strURL)
%>

Where strURL is a variable containing the location of the new page or site.

This command performs a server-side redirect to the new page, but it returns a status code of 302 ("Object Moved"), which search engines treat as a temporary move. This tells the search engine that although it is being directed to a new location, the move is only temporary. Hence the search engine will not remove the old page and replace it with the new one.

You need to tell the search engine spider that the page has moved permanently:

<%
strURL = "http://www.Google.com"
Response.Status="301 Moved
Permanently"
Response.AddHeader "Location", strURL"
%>

This would tell the search engine spider that the original page has been "moved permanently" to the new location. Your old page will drop out of the results and be replaced with the new. Eventually....

I've never used redirects in other development languages, but here they are if you need them. I've included the JavaScript and Meta Tag refresh methods as well, although I recommend against using them:

HTTP 301 Redirect in PHP
<?php
// Permanent redirection
header("HTTP/1.1 301 Moved Permanently");
header("Location: http://www.somacon.com/");
exit();
?>

HTTP 301 Redirect in ColdFusion
<CFHEADER statuscode="301" statustext="Moved Permanently">
<CFHEADER name="Location" value="http://www.somacon.com/">

JavaScript
<html>
<head>
<script type="text/javascript">
window.location.href='http://www.somacon.com/';
</script>
</head>
<body>
This page has moved to
<a href="http://somacon.com/">http://somacon.com/</a>
</body>
</html>

Redirection with META Refresh
<html>
<head>
<meta http-equiv="refresh" content="0;url=http://www.somacon.com/">
</head>
<body>
This page has moved to
<a href="http://somacon.com/">http://somacon.com/</a>
</body>
</html>

Thursday, August 24, 2006

What is Affiliate Marketing?

What is affiliate marketing? I get this question a lot. Simply put, affiliate marketing is selling someone else's stuff and then getting a cut. It's a pay-per-action model, where the affiliate marketer gets a percentage of each sale (pay-per-sale) or a fee for every sign up (pay-per-registration) or lead (pay-per-lead) he or she generates.

So www.Widgets.com sells widgets. You happen to collect widgets as a hobby. You build a site about your widget collection and on that site you have an affiliate link to www.Widgets.com. Whenever anyone clicks through that link and buys a Widget, you get a percentage of the sale. Sounds easy, doesn't it?

Some companies run their affiliate programs in-house (eVitamins for example). Others use affiliate management companies. The big four affiliate management companies are:
There are numerous smaller companies as well, but I haven't worked with any of them.

I'm also an affiliate of Affiliate Fuel. Affiliate Fuel is a pay-per-lead and pay-per-registration outfit.

The nice thing about affiliate marketing? Anyone can do it. All you need is a computer. Depending on the product or service you are marketing, you might not even need a web site. The lead and registration generation I do for Affiliate Fuel is exclusively through Pay-Per-Click (PPC) marketing, mainly Google AdWords, and to a lesser extent Yahoo Search Marketing (previously known as Overture).

Personally, and based on my experience since launching my first affiliate marketing effort in July of 2000, I think it has gotten much harder to make money in affiliate marketing. There's more competition, the search engines are much less tolerant of affiliate marketing sites and links, new rules prevent bidding on trademarks in PPC, advances in spyware and malware, a general decrease in commissions... all of this and more has made it difficult to start a new site, drive traffic to it, and sell stuff for someone else.

Nevertheless, if you are looking for a way to potentially make some money on the Internet in your spare time, affiliate marketing remains an option worth investigating.

Meet the Retro Web Dev Guy

Welcome to Retro Web Dev, where I'll be posting about the things I learn as I move through various web development projects, some personal and some work related.

I'm employed as a Lead Software Engineer, but I'll readily admit that modern web development technology has left me in the dust. I'm a self-taught programmer (with a Liberal Arts degree earned in 1987 no less) who started with HTML back in 1999. I taught myself ASP and database programming by building my first web site, SFReader, a science fiction, fantasy, and horror book review site with an MS Access backend (since converted to SQL Server). SFReader is still going strong, now boasting an average of 500 visitors and 3,000 page views a day. There are over 700 book reviews and an active forum with 600+ members and almost 20,000 posts as of this writing.

So far, I've yet to move into .NET. I've never dealt with PHP, Apache, or Linux, so my knowledge there doesn't extend past knowing what LAMP stands for (Linux, Apache, MySQL, PHP). I know enough about JavaScript to download something from Dynamic Drive and lightly customize it if need be.

As the 'Lead Software Engineer', I manage eight .NET web and database application developers (also called Software Engineers) and three webmasters (who work in HTML, CSS, and keep content up to date). All of the developers have been programmers longer than I have and are better at it. Much. We work in a Windows 2003 environment in .NET (both VB and C#).

So how did a retro dev guy with limited skills get the lead slot? Have you ever had to work with programmers? Programmers are an odd sort. Most (not all, but most) have what might be termed 'limited social skills'. They don't relate well to people (and might not even particularly like people). They don't tolerate sitting around in meetings 'wasting time' when they could be programming. They don't like explaining things to customers or Project Managers. They aren't good at putting together PowerPoint slides and standing in front of a group of people and making presentations.

This is where I come in. See, I'm good at all that stuff. I even kind of like it. I'm the smooth operator. I unruffle feathers. I translate geek-speak into English and vice versa. I understand development theory, processes and capability and am good at explaining them. So you can kind of consider me the 'human interface' to the programmers.

I spend about 75% - 80% of my time engaged in 'lead' type stuff: getting beat up in meetings, getting beat up about project plans, getting beat up over level of effort estimates, getting beat up by customers, getting beat up by Project Managers, getting beat up over release dates... you get the picture. Yet despite all the blows aimed my way, I'm remarkably good at coming away unscathed and with the development section looking good. My bosses like that.

What little development I do at work is limited to legacy applications coded in classic ASP (bug fixes, enhancements, and the occasional new project) and some SQL Server 2000 stuff: views, triggers, stored procedures, database design, etc.

As a side business, I do affiliate marketing. I have a server at my house (Windows 2003 running on a dual processor Xeon with 3 gigs and 2 WD Raptors in a RAID 1) with SQL Server 2000 Standard installed, connected to the Internet by a business class cable connection (2M down and 768K up). On that box I'm currently hosting 50+ web sites. A very few are sites for friends, but most are my own. I develop and host niche price comparison sites, with 1.2 million products coming from the 40+ datafeeds I receive from numerous vendors like Overstock.com, SmartBargains, Shop NBC, Macy's, Walmart, eBags, Sports Authority and many more.

I'm constantly tinkering and trying new things. At their peak (last December), I earned over $10,000 in one month. Since then, various changes to search engine algorithms have caused my sites to fall off the face of search engine results. I'm not doing any sneaky stuff, but when you are pushing products it's gotten very hard to compete with the Targets, Amazons, Walmarts, and eBays of the world. This month, I'll be lucky to clear $400.

As a result, I've been trying more and varied types of sites. SFReader continues to grow. I've recently been putting more work into SFWatcher, SFReader's long-neglected sister site dedicated to science fiction, fantasy and horror movies.

In August of last year, I launched Free Article Headquarters, a site for people to publish topical articles for other people to use as web content.

I'm about 90% through with my newest site, but I'll refrain from writing about that until it's live.

Over the years, I've learned a lot about ASP, CSS, HTML, Search Engine Optimization, and Internet Marketing. I'll be posting useful bits and pieces about said topics on this blog, as well as any other interesting tidbits I discover.

Wednesday, August 23, 2006

RetroWebDev SiteMap

Add Widget to Header | Add Widget to Post in Blogger
Send Email using ASP from a Web Form in GoDaddy
RetroWebDev's No Joke Guide to Making Money on the Internet - Part 1: Introduction
Turn Off Comment Section in WordPress Pages
Taking the eCommerce Plunge
Operation must use an updateable query
Installing Visual Web Developer 2008
Showing the Date and Time in .Net
Visual Web Developer 2005 and ASPNETDB.DBF
RetroWebDev Blog Update & Social Bookmarking Tool
The Inevitable Move to .Net
Kittens Born
LinkSys Routers, PASV FTP, and Directory Listing Problems
LinkRotatr - Show Random Links on your Site
The Development Lab
Customers Don't Care
War on Drugs
Cool T-Shirts
JavaScript - Close Window and Refresh Parent
JavaScript Open New Window
New Stuff
Update After Long Time, No Post
Life Code
RetroWebDev Update
GoDaddy Coupon Codes and Updates
Make a DIV Transparent
Selecting by Date in SQL Server
New Web Endeavor: Bloggertizer.com
Handy-Dandy Full Text Index Stored Procedures
How to be Successful in Business and Life
Welcome to the Dev Shop
Muay Thai Developer
So we're using Agile Development, right?
My Newest Web Site: Tizags, a Social Bookmarking site
How You Know When You're Dealing with Technically Ignorant People
How to Position a DIV at the Bottom of the Page
How to Stop a Framebreaker Script
I Got the Monday Working Song Blues, Vol. 4
Retro Sneakers
I Got the Monday Working Song Blues, Vol. 3
SQL Server 2000 - Get Most Recent Record ID
ASP "Save As" Dialog
I Got the Monday Working Song Blues, Vol. 2
Control your tard!
Generate a Random Letter in ASP
I Got the Monday Working Song Blues, Vol. 1
More on Microsoft and Web Development
AddCaption Update
Microsoft and Web Development: In Decline and Staying that Way
Add Text to Image
Rename a SQL Server Database
Changing Table Owner in SQL Server
Money Making on the Web - Affiliate Marketing
Some of My Sites
Form Field Focus On Page Load
Free Online Keyword Research Tools
Geek Babe Monday - Jessica Alba
Counting Active Users in ASP
Errors When Trying to do Response.Redirect
Geek Babe Monday - Erin Gray
Using ASP to put the Current Date on a Page
Internet FTP - Web Based FTP Clients
How to Filter out Bad Bots in ASP with Global.asa
The Importance of Fast Loading Web Pages
Geek Babe Monday - Jolene Blalock
Changing the Way Links Look
How to Hide Email Addresses from Spammers
Geek Babe Monday - Alexa Davalos
ASP Database Connection Strings: Access and SQL Server
Sending Email in ASP Using CDONTS
Using LCase and UCase in ASP
Server.Transfer versus Response.Redirect in ASP
Geek Babe Monday - Xenia Seeberg
Improving ASP Performance Tip #1
Geek's Approach to working out Q&A
Passing Values out of a Subroutine in ASP
A Geek's Approach to Working Out
JavaScript Back Button or Link
Using the VBScript Replace() Function in ASP
Using Request.ServerVariables in ASP
Setting and Using Cookies in ASP
SQL: Select Distinct with Most Recent Date
Geek Babe Monday
Manipulating Strings in Classic ASP with Left, Mid, and Right
How to Send Email in ASP from Windows Server 2003
Black Hat SEO: How to get Inbound Links from Authority Sites
Removing SQL Server Full Text Noise Words from a Search String
Geek Babe Monday
ASP Function to Capitalize the First Letter of All Words in a Sentence
Archive Calendar for Blogger Blogs
ASP Includes Explained
SQL to get Records for Last 24 Hours using DateAdd
WWW Versus Non-WWW URLs: Dupe Content and Redirecting
Scraping Search Engines for Page Content
CSS Overflow Property
How to Strip out HTML with an ASP Reg Exp Function
How to Cloak Page Content from Search Engine Spiders
Concatenating Dates in SQL
Scraping Page Content from a Remote Site in ASP
Generating a Random Number in ASP
SQL Server INSTR Equivalent
Returning the Length of a Text or nText Field
Retrieving a Random Record from a Database
SEO and the Title Tag
Alternating Row Colors
Hiding Page Content
Redirects in ASP
What is Affiliate Marketing?
Meet the Retro Web Dev Guy