Thursday, August 28, 2008

There's a new initiative being pushed down by the Big Heads, but first some background:
In our development shop, we each have our own development box. All the developers have local admin rights so they can install software and do the other things developers need to do in order to develop. The development machines are on the Intranet, as are our test servers, both web and database. Some of the applications we develop depend on connecting to existing applications for their data sources.
We develop, test, and deploy sandbox installs on our development servers for customers to review, repeating this process as best we are able until we're done with whatever it is we're building.
The completed (and development-tested) application then goes to Test and Integration, where the eventual users perform their test cases (which the developers wrote, BTW) and the network guys make sure it will integrate into "production", i.e., that it won't screw up anything already out there.
Once they rubber-stamp it, it goes to Configuration Management, who versions it and releases it to the production folks, who install it on the "live" Intranet.
Barring the typical development stupidity and ignorance one runs into almost anywhere, this process (eventually) works well enough for us to complete applications and get them deployed and in use.
Well, now someone has decided it's a "risk" to do development on machines that are connected to the "live" Intranet. The proposed solution is an isolated Development Lab with a single development "server" hosting a virtual machine for each developer to work on. The developers will have admin permissions on their VMs so they can install/uninstall software and change settings, just like now. There's a single development database server we are all supposed to use.
So far, no adequate answer to my questions:
How are we supposed to connect to existing web services on the Intranet if the app we're building requires it? This is the biggest concern, as many of the apps we build connect to existing services for data. The floated solution so far is to buy additional servers and stand up copies of the production systems in the lab environment that we then develop against. Brother, I'm here to tell you I can't get $3 for a ream of paper, and now they're going to buy servers for standing up production copies? I wonder what the vendors of these systems will think of us running additional instances of their software?
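To make the dependency concrete, here's a minimal sketch of the kind of Intranet service call our apps lean on. The config key, endpoint, and class names are hypothetical, not our actual code; the point is that the endpoint lives in configuration, so an isolated lab has nothing for it to point at unless a copy of the service exists inside the lab.

```csharp
// Sketch of an Intranet service dependency (hypothetical names throughout).
using System;
using System.Configuration;
using System.IO;
using System.Net;

class PersonnelLookup
{
    // The endpoint URL comes from web.config so it can be repointed
    // per environment -- which is exactly what an isolated lab breaks.
    static string EndpointUrl
    {
        get { return ConfigurationManager.AppSettings["PersonnelServiceUrl"]; }
    }

    public static string FetchEmployeeXml(string employeeId)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(
            EndpointUrl + "?id=" + employeeId);
        using (WebResponse response = request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            // Raw XML back from the existing production service.
            return reader.ReadToEnd();
        }
    }
}
```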
How are geographically separated customers supposed to review milestone installs of work in progress? Right now, customers can log into the Intranet from anywhere and view apps in progress. In the Big Heads' Dev Lab, they won't be able to do that, so customers will have to physically travel to the Dev Lab to view them.
Lastly, and this sort of ties in with the whole $3-ream-of-paper thing, we've been given a VM to do a "proof of concept" against. We're supposed to set it up with a baseline install of all the stuff a developer needs. The VM they gave us has 18 gigs of disk space. Visual Studio 2005 needs 6 gigs right from the get-go. VS 2008 needs even more. Plus whatever apps we're working on. Plus helper apps like Infragistics. And SQL Server Developer Edition. And the OS. And, and, and... We need 80 gigs, I told them. Sorry, don't have the resources. So you're going to stand up duplicate servers of all the apps we develop against, but you can't buy a few more disks so the developers have enough space to set up their dev environments? Yeah, that'll fly.
Anyway, I just needed to vent. Processes, over time, are supposed to get simpler and more streamlined. Why is it that software development always seems to go in the other direction and become more complicated?
Thursday, August 14, 2008
Customers Don't Care
The development shop I lead is firmly shackled to Microsoft. We write applications using .Net, mostly in C# (occasionally in VB) on the .Net 2.0 framework with Visual Studio 2005 as our IDE. Our apps are hosted on Windows Server 2003 boxes, and our backend is SQL Server 2000.
Most programmers have a language of choice, along with the tendency to denigrate the languages they didn't choose. It's no secret that certain languages are better at certain things, but mostly people tend to stick with what they know. That's why I still write some ASP after all these years.
We have a running joke around the office. Web development is easy, all you're doing is putting stuff into a database and getting stuff out. Put data in, get data out. In, out. Easy-peasy. Cake.
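The joke holds up in code, too. Here's a minimal sketch of the whole "in, out" cycle in plain ADO.NET on our stack; the connection string, table, and column names are hypothetical:

```csharp
// "Put data in, get data out" -- the whole job, in miniature.
// Server, database, table, and column names are hypothetical.
using System;
using System.Data.SqlClient;

class InOut
{
    static void Main()
    {
        string connStr = "Server=devdb;Database=Widgets;Integrated Security=SSPI;";
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();

            // Put data in...
            SqlCommand insert = new SqlCommand(
                "INSERT INTO Widget (Name) VALUES (@name)", conn);
            insert.Parameters.AddWithValue("@name", "sprocket");
            insert.ExecuteNonQuery();

            // ...get data out. Easy-peasy. Cake.
            SqlCommand select = new SqlCommand("SELECT Name FROM Widget", conn);
            using (SqlDataReader reader = select.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader.GetString(0));
            }
        }
    }
}
```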
The company I work for went through a big push about 4 years ago to move to .Net, but when asked why, one got the typical weasel-like management answers:
- That's the way the industry is moving
- It's more powerful
- It's faster
- It's easier
- More value added
- We have a Microsoft Enterprise license and we need to get our money's worth
Shortly after management announced all new development had to be done in .Net, they also decided some of the 'critical' applications should be redone in .Net as well. I fought that, because, well, it was stupid. But mostly I lost. So we spent a good amount of time rebuilding things in .Net that previous applications not written in .Net already did perfectly fine. Which pretty much boiled down to putting stuff into a database and getting stuff out.
While everyone associated with application development has an opinion on which language to use, guess what? The customer couldn't care less. Oh, there are some customers who want you to use the latest and greatest so they can brag to their competition that they are using the latest and greatest (or most obscure, or most complex, or most [whatever]), but mostly, when it comes to web application development, the customer just doesn't care what language you code in.
They want:
- certain information
- to appear on certain web pages
- in response to certain actions
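In ASP.NET terms, that's nothing more exotic than an event handler that runs a query and binds the results. A hedged sketch, assuming a hypothetical page with a TextBox (CustomerIdBox), a GridView (OrdersGrid), and an Orders table; none of these names come from a real app:

```csharp
// Code-behind sketch: certain information (order rows), on a certain
// page (Orders.aspx), in response to a certain action (a button click).
// All names are hypothetical.
using System;
using System.Data.SqlClient;

public partial class Orders : System.Web.UI.Page
{
    protected void SearchButton_Click(object sender, EventArgs e)
    {
        string connStr = "Server=devdb;Database=Sales;Integrated Security=SSPI;";
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            SqlCommand cmd = new SqlCommand(
                "SELECT OrderId, Placed, Total FROM [Order] WHERE CustomerId = @id",
                conn);
            cmd.Parameters.AddWithValue("@id", CustomerIdBox.Text);
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                // Certain information appears on a certain web page.
                OrdersGrid.DataSource = reader;
                OrdersGrid.DataBind();
            }
        }
    }
}
```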