Do projects matter for IT?

(Originally posted on Eric Brown’s Technology, Strategy, People & Projects blog on June 7, 2011, with some edits. Still highly relevant at the end of 2014!)

Short answer: not as much as many believe.

Information technology (IT) focus is shifting from classical projects to agile services. Here’s why.

Reason 1: Much of IT defies project definition

A classical project has a predetermined start, end, work breakdown, and result. When it’s done, the result goes into “maintenance mode”, and you jump to the next project.

But what if something never has a “maintenance mode”? What if a work breakdown is impossible to know?

For example, the web is never done. A university’s web site must be exciting and work quite well; the key audiences are technologically progressive prospective and current students. What university wants technophobic students? Relevant university sites must keep up with rapidly evolving consumer technologies.

A university’s web site is a good example of an agile service: an adaptive mix of agile applications and expertise. These are where a lot of IT’s attention is going.

Agile services don’t end. They are not classical projects.

Reason 2: Small projects don’t matter much

Isn’t a service just a lot of mini-projects? And isn’t the latest trend to make projects smaller?

Neither matters much. Small projects are really large tasks or iterations in an agile service.

By themselves, small projects don’t tell the value of IT. Agile services do.

Reason 3: Virtualization and clouding

The largest classical IT projects are implementations. Virtualization and especially clouding make it easier to create new things, sometimes reducing implementation projects to simple tasks.

Without huge implementations, the focus shifts to maximizing the value of existing investments. Again, the emphasis moves to agile services at the expense of classical projects.

Reason 4: Agile is where it’s at

Classical projects use waterfall, a prescriptive method from the manufacturing and construction industries. It’s from a time when the pace was steady, change was resisted, and decisions came top-down.

Relevant IT is the opposite: fast-paced, adaptive, and responsive. That’s why agile management is natural for IT: it encourages adaptation, continual reassessment, early problem discovery, and faster completion.

I’m not the only one seeing this. Look at Google search trends for agile project (blue) versus waterfall project (red):

[Chart: Google search trends for “agile project” vs. “waterfall project”]

But this isn’t just about improving how projects are done. Agile does something that waterfall can’t: manage services.

Paraphrasing Men In Black II, “Waterfall projects: old and busted. Agile services: new hotness.”

Do classical projects belong in IT?

Classical projects still have a place in relevant IT. We will still have cookie-cutter projects with well-understood paths and vanilla outcomes.

However, “well-understood” and “vanilla” are being outsourced, such as email, web systems, ERP systems, and more. If not outsourced, they may be “keep the lights on,” undifferentiated from plant operations. Or their business value is not intrinsic; the value is in what others—users, innovators, developers—can wring from them.

Agile services are the future of IT. It’s how relevant IT works, it’s how relevant IT provides business value, and it’s how relevant IT communicates what it does.

Heartbleed = overcomplexity + input validation failure

The Heartbleed vulnerability exists because the OpenSSL code didn’t validate an input. It also exists because OpenSSL had unnecessary complexity.

OpenSSL has a heartbeat feature that allows clients to send up to 64 kilobytes of arbitrary data along with a field telling the server how much data was sent. The server then sends that same data back to confirm that the TLS/SSL connection is still alive. (Creating a new TLS/SSL connection can take significant effort.)

The problem: if the client claimed it sent more data than it actually did, the server would send back the original data plus some of its RAM. For example, suppose the client sent a 1 KB message but said it was 64 KB. In response, the server would send a 64 KB message back: the original 1 KB message plus 63 KB of data from the server process’s memory, which could include sensitive, unencrypted data such as private keys and other users’ session data.

How this could have been prevented:

  1. Avoid pointless complexity: don’t require the client to also send the length of the arbitrary data. The server could have determined the length on its own.
  2. Validate all input. The server failed to ensure that the client’s description of the text length matched its actual length. (The fact that the server could detect the message’s actual length further validates my view on #1.) A sketch of the missing check is below.
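
To make the failure concrete, here’s a minimal sketch of that bounds check in Python. This is illustrative only, not OpenSSL’s actual C code; the function and variable names are invented.

    # Fixed version: reject any message whose claimed length doesn't match
    # the payload actually received.
    def handle_heartbeat(payload: bytes, claimed_length: int) -> bytes:
        if claimed_length != len(payload):
            raise ValueError("heartbeat length mismatch; discarding message")
        return payload  # echo the payload back to confirm the connection

    # Vulnerable version: trusts claimed_length and reads past the payload,
    # leaking up to ~63 KB of adjacent process memory.
    def handle_heartbeat_vulnerable(payload: bytes, claimed_length: int,
                                    adjacent_memory: bytes) -> bytes:
        return (payload + adjacent_memory)[:claimed_length]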

Keep it simple! In addition to driving up creation and maintenance costs, needless complexity creates more opportunities for things to break.

TxDOT Dallas District to keep speed trap speed limits?

The Dallas County Sheriff’s Office recently made hay over massive noncompliance with freeway speed limits. Drivers aren’t the problem. The problem is almost all our freeway speed limits are speed traps: they are way too low.

TxDOT recently proposed 5 mph speed limit increases on a few outlying highways. This isn’t enough.

TxDOT did a lot of speed studies in 2012. If TxDOT followed its own speed zoning procedure, then these studies require a 10 mph increase on almost all Dallas-area freeways, even inside the loop. All those 60 mph limits? Almost all should be 70 mph, some 75 mph!

Why be concerned about only 5 or 10 mph? It makes a huge difference in the percent of drivers who are criminalized. Take TxDOT’s study of I-30 at Hampton Rd. The speed limit is 60, but it should be 70. Look at the difference only 10 mph makes:

Speed limit     Percentage of drivers criminalized
60 (current)    83%
65              51%
70              14%

(Aside: It’s a myth that raised speed limits mean everyone goes that much faster. Raised speed limits just mean fewer safe drivers play a reverse lottery.)
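
For intuition on where those percentages come from: the “criminalized” share is simply the fraction of observed speeds above the posted limit. Here’s a small Python sketch; the speed sample is made up, not TxDOT’s actual study data.

    # The "criminalized" share is the fraction of drivers above the limit.
    def percent_over(speeds_mph, limit_mph):
        over = sum(1 for s in speeds_mph if s > limit_mph)
        return 100 * over / len(speeds_mph)

    speeds = [58, 61, 63, 64, 65, 66, 67, 68, 69, 70, 71, 74]  # hypothetical
    for limit in (60, 65, 70):
        print(f"{limit} mph limit: {percent_over(speeds, limit):.0f}% over")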

Here are the current speed limits at all speed checks. Red is 60 mph, orange is 65 mph.
[Map: current speed limits in the Dallas area]

Here’s what the speed limit should be, if TxDOT follows its own rules. Orange is 65 mph, green is 70 mph, and blue is 75 mph.
[Map: new speed limits]
(I think the 65 mph speeds on I-30 are anomalous and should be rechecked.)

Here’s some background:

Environmental speed limits (ESLs)

In the early 2000s, mendacious bureaucrats in the North Central Texas Council of Governments inflicted environmental speed limits on the Dallas/Ft. Worth area as part of a Clean Air Act compliance plan. Already too-low speed limits were made even lower: all area 70 and 65 mph limits within roughly 50 miles of downtown Dallas and Fort Worth were reduced by 5 mph.

[Map of environmental speed limits (ESLs) for the Dallas-Fort Worth area]

I wrote “mendacious” because even before ESLs were imposed, these bureaucrats knew that they were ineffective. Here’s why:

In the late 1990s, based on modeling done in EPA’s MOBILE5 software, it was believed that capping all area speed limits at 55 mph would get the area 1.5% closer to needed emissions reductions. Yes, that’s right: a whopping, practically unenforceable 10-15 mph speed limit reduction just buys us 1.5%. That was hugely unpopular in Houston and was replaced with a 5 mph reduction scheme, the same scheme that DFW got. So OK, this 5 mph reduction scheme may be more on the order of 0.5% of the emissions goal (assuming a linear relationship). Further, they found that almost all the emissions benefit was from heavy trucks, not cars.

It gets better. Before the ESLs even went into effect, they reran the models using newer software, EPA MOBILE6. Well, lo and behold, the newer software found that, at best, the emissions reduction was so small, it was a rounding error! Even newer software, EPA MOVES, now finds that there is probably no emissions benefit from lowered speed limits. Despite all this, the EPA-approved smog reduction plan had ESLs baked into it, and NCTCOG bureaucrats lacked the spines to do anything about it until over a decade later! Only in 2013 did we start to get public reports of ESLs going away.

To be clear: if ESLs go away, then TxDOT can freely raise speed limits on ESL roads. It doesn’t need to revert to the limit in effect before the ESLs. When the ESLs took effect, Texas law did not permit speed limits over 70 mph; since then, the law has been changed to allow limits up to 85 mph. Even then, many 65 mph limits were themselves posted too low.

TxDOT practices

Why is TxDOT proposing only a 5 mph speed limit increase? I don’t know, and it contradicts TxDOT’s own rules.

TxDOT’s Procedures for Establishing Speed Zones require that the speed limit be the 85th percentile speed rounded to the nearest 5 mph increment (reference). If the 85th percentile speed is, say, 73 mph, the speed limit must be 75 mph. You may ask, “What is the 85th percentile speed?” It’s the upper end of the “flow of traffic”: 85% of drivers’ speeds are at or below it.
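
As a rough Python sketch of that rule as I read it (the observed speeds below are hypothetical, not from an actual study):

    # Post the 85th percentile speed rounded to the nearest 5 mph.
    import statistics

    def posted_limit(speeds_mph):
        # quantiles(n=20) returns the 5%, 10%, ..., 95% cut points;
        # index 16 is the 85th percentile.
        p85 = statistics.quantiles(speeds_mph, n=20)[16]
        return 5 * round(p85 / 5)

    speeds = [61, 64, 66, 68, 69, 70, 71, 72, 73, 74, 75, 77]  # hypothetical
    print(posted_limit(speeds))  # 85th percentile ~75.1 mph, posts as 75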

TxDOT’s manual allows some deviation from this, such as higher-than-average crash rates or roadway design factors. However, by definition, most roads will not have higher-than-average crashes, and freeways are generally built to the highest design standards, so these cannot justify TxDOT’s failure to use the 85th percentile speeds.

So what gives? I don’t know. This is only a guess, but I think TxDOT’s Dallas District may no longer care about serving the public, preferring to maintain speed trap speed limits. I say this for two reasons.

First, the existence of ESLs doesn’t explain all our low speed limits. ESLs never applied to any roads within I-635 or I-20! ESLs only applied to roads that used to have 65 or 70 mph limits.

That means TxDOT has always been free to raise non-ESL road speed limits. I am sure that, for years, there has been a good case to raise almost every inner-loop freeway limit, none of which was bound by ESLs, by at least 10 mph. Why has TxDOT not bothered to do it, and why does it still drag its feet?

Second, TxDOT has previously shown active disregard for the motoring public. For example, in the late ’90s, Dallas District staff tried to impose arbitrary speed limits on Farm to Market roads that were 10-15 mph too low. While TxDOT’s Austin office rebuked them, the result was still a uniform speed limit that remained 5-10 mph too low.

I want TxDOT to stop using arbitrary, speed trap speed limits. Making all of us play a reverse lottery doesn’t make roads safer, doesn’t clean the air, and benefits nobody. If TxDOT wants to serve the motoring public and is interested in following its own policy, it will raise almost every Dallas-area freeway speed limit to 70 or 75 mph.

Want to see the actual speed studies? They’re all here: http://arencambre.com/txdotSpeedStudies/

Google is not linking to HTTPS versions of everyone’s sites

In the University Web Developers (UWEBD) listserv today, a conversation took off about how Google was linking to the HTTPS version of Florida Gulf Coast University’s web site. It was a problem because FGCU’s HTTPS channel was broken.

I was surprised at the misconceptions that circulated on a technically astute email group. Here’s my statement:

Two inaccurate things have been said about Google.

Inaccurate statement 1: Google is securing others’ sites. Dangerous misconception! Google cannot “secure” your site. If Google’s link to you uses HTTPS, that does not “secure” your site. It just means Google is linking to your site’s secure channel. “Securing” a site includes transport security (HTTPS channel) among many other things. Most importantly, YOU, the site owner, do the “securing”, not Google.

Inaccurate statement 2: Google is en masse sending users to HTTPS channels on web sites. Nope. For example, Southern Methodist University has had both HTTP and HTTPS channels for www.smu.edu for over a decade. Google links to the HTTP version.

Starting late last year, Google began encrypting traffic between the user and its search site. If you visit http://google.com, Google redirects you to https://google.com. That has no bearing on whether Google’s search results link to HTTPS or HTTP channels. It may limit site owners’ view of search keywords (reference), but that isn’t related to this inaccurate statement.

You can still get unsecured Google search using http://www.google.com/webhp?nord=1 (note the nord=1 parameter), but only if you’re not signed in. A search on Florida Gulf Coast University on the unsecured version still links to the HTTPS channel.

There are many possible reasons why Google is linking to FGCU’s secure channel, but it’s almost certainly not because of Google’s own change.

Sitecore blogging song

This is a song about Sitecore blogging woes, with the answers to the woes at http://mikael.com/2013/11/sitecore-mvp-summit-team-7/. Sung to the tune of the Scout camp song “I met a bear”.

I have a blog
About Sitecore
I can’t find time
To write the blog

I like to write
About Sitecore
I don’t know what
To write about

I have a blog
About Sitecore
Nobody wants
To look at it

I like my blog
About Sitecore
MVPs want
To talk to me

I’ve got a blog
About Sitecore
It wastes my time
I need it short
