The Internet and the Office Roof

Reading Time: 6 minutes

I thought I’d got away without any ill effects from the “St Jude” storm on Monday – not so. My Internet failed yesterday morning; I felt like I’d had an arm cut off.

I’m a remote worker – have been for 2 years (a fact that I really should blog more about). I knew the lack of Internet was going to cause problems but I didn’t realise exactly how much.

Tom on the office roof
Some Internet, at least!

The first thing I had to do was call the ISP to find out what was up. What’s their phone number? Well, it used to be on a piece of paper under the router along with all the technical (but not security) details of the connection. The paper was missing. There’s no mobile phone signal at my address, so the next image I would like to present to you is me standing on the roof of the office at the end of the garden, waving my phone in the air, because if you’re really lucky on a clear day you can get some mobile signal there. Eventually I got enough mobile Internet packets through to find their phone number.
Apparently the Internet problem is due to local power problems and they can’t give me a fix time because the power company won’t give them one.

No problem, I say to myself – my team know what they’re doing; they can survive without me as long as it’s not days. I’ll just let them know. Precisely how am I going to do this, then? I’m usually pretty easy to contact – email, Skype, MS Lync, Google Talk, SIP phone – none of which are going to work. I don’t even have any of the team’s mobile numbers. Telephone again, then. The next problem is that I never actually call the Departmental Office, so I don’t know the number.
That’s me on the roof of my office again, waving a mobile phone around for 10 minutes trying to find the number, before I realise that although Outlook won’t connect I still have my locally cached email, and the office number will be in someone’s signature – sorted.

Now down to business, and for the first time I’m ahead of the game. I know I won’t be able to connect to the Team Foundation Server, so I take a back-up of the source tree and then open the solution offline. The problem I’m working on is a complex one and the code didn’t make much sense – I mean, I understood what it did, but what it did didn’t seem logical. I needed some context, so I wondered if it had always been done like this. Usually I’d look at the module history, but that’s all in the Team Foundation Server that I don’t currently have access to.

Never mind, I think, I’ll tackle a different problem, one that’s SQL Server based. Quickly I realise that I can’t remember the syntax for the OVER (PARTITION BY …) aggregate functionality. So I have a go, get it wrong and then hit F1 for some help. Which is online, so that’s a non-starter.
But I do have a book on SQL Server that I can refer to. In Safari, the online book system. That’s not going to work either.

Clearly this lack of Internet is becoming quite a problem. I am inventive though, and I’ve got a spare satellite TV dish. I wonder to myself whether, if I mount my mobile phone at the focal point of the dish and point it at the nearest line-of-sight phone mast, I might just get enough signal strength to tether the phone. I suspect that the answer is a strong no, but maybe there is something sensible and practical I can do to improve the signal. I’ll just Google… oh wait, no I won’t.

Small-scale disaster planning is something I blogged about a while ago though, so I should be on top of this, right? The primary backup plan for power / Internet loss is to go to my in-laws’ house in the nearby town. The problem is that they’re at work. My wife has a key, but she’s away on business. To be honest this kind of suits me because I find laptop screens rather limiting – I far prefer the large pair of monitors I have in my office – so if I can manage without Internet, that’s better for me.

Nevertheless it would be good for me to get some Internet at some point – check emails, make sure the team are OK and so on. The secondary backup plan is to find some public WiFi. There’s none in the village (and it wouldn’t work even if there was), so I ponder popping into the local town for lunch – the car is also away with my wife on business, but the weather looks OK and I have a bicycle and a laptop backpack. I wonder what the weather forecast is for later though – I’ll just look that up on the Internet… erm, no. Oh well, I can risk it, I have wet weather gear. I’m certainly not going to risk it, though, unless I can guarantee a cafe or coffee shop that has WiFi, and I don’t remember seeing any adverts when I was last there. I’ll just Google… oh, wait, no I won’t.

Just as I’m thinking this, something pops into the back of my mind – I’m sure I remember reading, when I signed up for my broadband Internet access, that they had a backup dial-in option. The details of that will also be on that missing piece of paper, but I remember printing it out from an email. So it’s back onto the office roof for me, searching through my email history to find the details. They do indeed offer one, but sadly it’s a national call rate number, which means it would get expensive very quickly – still, it was enough to let me check my email.

At about 9:30pm the situation got even worse: I lost electricity as well. I live in the country, where there is no street lighting. I was on the stairs and it suddenly went dark. I was prepared for this – there were torches at strategic points around the house and I’d got the candles and lighter out so I could find them in the dark.

What I wasn’t quite prepared for was how dark it was: pitch black. So I sat down with a view to waiting until my eyes picked up some light. Unusually for me when in the house, however, I had my mobile phone in my pocket, so a few seconds later I was carefully making my way down the stairs to one of the torches.

Electricity is another strange thing – it’s easy to overlook how important it has become to our lives.

The freezer was defrosting. As I had no idea how long I’d be without power the only thing I could do was put some towels under it and hope it wasn’t too long.

The central heating is gas, but of course the pump and all the controls are electric. Fortunately I have a multi-fuel stove, which is definitely not electric in any way at all, and being honest it wasn’t that cold.

Fortunately the cooker is dual-fuel, which means that the gas rings still work. It does leave me without a grill or oven though which limits what I can cook.

The house phone is a cordless model – the handset might work but the base station won’t. I keep an old-fashioned corded handset just for this, so I was able to plug that in. It’s worth noting that you shouldn’t assume the landline will be dead just because the power is off. In reality it usually still works – in the UK at least.

My alarm clock is a Lumie Bodyclock – it’s great most of the time but it is very definitely mains powered. I don’t have a mechanical alarm clock, but I do have a mobile phone so there was a solution there.

The weirdest thing though was this – it was night and I didn’t know which of the house lights were turned on and which off. This meant that if the power came back on at say 3am (which it did) some of the house lights would turn on, possibly without me knowing.

This morning I had power but still no Internet, so I’m currently sat at my in-laws’ dining room table, borrowing their Internet and just having to deal with the tiny laptop screen and horrid keyboard.

Being well prepared was definitely worth it – clearly I knew that losing electricity would be a serious inconvenience but just how much I rely on the Internet surprised me considerably.

C#’s IEnumerable – Two Simple Gotchas

Reading Time: 5 minutes
Dummies in suits…

The other day I saw someone do something very much like this and it made me cringe a little. There’s no problem with this when the number of items that are being dealt with is small, but as the number of items increases performance will take a spectacular nosedive.

        static string[] composers = {"Pyotr Ilyich Tchaikovsky",
                                    "Gabriel Fauré",
                                    "Sergei Sergeyevich Prokofiev",
                                    "Jean Sibelius",
                                    "Ludwig van Beethoven",
                                    "Benjamin Britten",
                                    "Erik Satie"};


        static void Main(string[] args)
        {
            Console.WriteLine(" L I S T I N G   S H O R T   N A M E D   C O M P O S E R S");
            Console.WriteLine(" =========================================================");
            // Where() builds a lazy query here – the filter only runs when the sequence is enumerated
            IEnumerable<string> shortComposers = composers.Where(x => x.Length < 15);
            for (int i = 0; i < shortComposers.Count(); i++)
            {
                Console.WriteLine("Composer {0} => {1}", i, shortComposers.ElementAt(i));
            }


            Console.Write(" - PRESS ANY KEY TO EXIT - ");
            Console.ReadKey();
        }

What’s the problem? Well in this trivial example nothing at all, but if the list of names were a bit longer then we could find that this code runs very slowly indeed. There are two problems.

C#’s for loop combined with IEnumerable’s .Count() method is somewhat toxic.
In order to understand why, we need to look at what .NET is doing behind the scenes. Originally I used the rather excellent ILSpy to decompile part of the .NET Framework itself.
Since then Microsoft have actually published the source – so let’s take a look at what .Count() really does.

public static int Count<TSource>(this IEnumerable<TSource> source) 
{
    if (source == null) 
        throw Error.ArgumentNull("source");

    ICollection<TSource> collectionoft = source as ICollection<TSource>;
    if (collectionoft != null) 
        return collectionoft.Count;

    ICollection collection = source as ICollection;
    if (collection != null) 
        return collection.Count;

    int count = 0;
    using (IEnumerator<TSource> e = source.GetEnumerator()) 
    {
        checked 
        {
            while (e.MoveNext()) count++;
        }
    }
    return count;
}

(original here)

So if the source is an ICollection<T> or a plain ICollection it simply returns the object’s Count property; if not, it gets the enumerator and counts up every item in the IEnumerable.
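
To see the difference in practice, here’s a minimal sketch (the class and variable names are mine, not from the original post): counting a List<string> hits the ICollection<T> fast path, while counting a lazy Where() query walks – and re-filters – the whole sequence every single time.

using System;
using System.Collections.Generic;
using System.Linq;

class CountPathsDemo
{
    static void Main()
    {
        string[] composers = { "Jean Sibelius", "Erik Satie", "Benjamin Britten" };

        // A List<T> implements ICollection<T>, so Count() just reads the stored Count property.
        List<string> asList = composers.ToList();
        Console.WriteLine(asList.Count());      // O(1) – no enumeration needed

        // A Where() query is lazy and implements neither ICollection<T> nor ICollection,
        // so Count() has to enumerate (and re-run the filter) from scratch on every call.
        IEnumerable<string> lazyQuery = composers.Where(x => x.Length < 15);
        Console.WriteLine(lazyQuery.Count());   // O(n) – walks the whole sequence
    }
}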

Now we have to look at C#’s for loop and how it’s evaluated. If we look at the C# 4.0 language specification, it says:

A for statement is executed as follows:

  • If a for-initializer is present, the variable initializers or statement expressions are executed in the order they are written. This step is only performed once.
  • If a for-condition is present, it is evaluated.
  • If the for-condition is not present or if the evaluation yields true, control is transferred to the embedded statement. When and if control reaches the end point of the embedded statement (possibly from execution of a continue statement), the expressions of the for-iterator, if any, are evaluated in sequence, and then another iteration is performed, starting with evaluation of the for-condition in the step above.
  • If the for-condition is present and the evaluation yields false, control is transferred to the end point of the for statement.
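
To make the spec concrete: assuming the same shortComposers query as above, the for loop behaves roughly like this hand-expanded version (a sketch that glosses over continue and variable scoping), which shows that shortComposers.Count() is re-evaluated before every single pass:

            int i = 0;                                  // for-initializer – runs once
            while (i < shortComposers.Count())          // for-condition – checked before every iteration
            {
                Console.WriteLine("Composer {0} => {1}", i, shortComposers.ElementAt(i));
                i++;                                    // for-iterator – runs after each iteration
            }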

So every time the loop goes round, the Count() method is called. If there are 1000 items in the list then the Count() method will be called 1000 times, and each time it is called it will get the enumerator and count through all 1000 items. That’s more than 1,000,000 operations performed when only around 1000 should have been needed.
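
If you want to see this happening for yourself, one quick (and admittedly crude) way is to put a counter inside the Where() predicate – because the query is lazy, the predicate runs again every time Count() or ElementAt() re-enumerates the sequence. The following is just a sketch; predicateCalls and LazyCountDemo are my names, not something from the original code.

using System;
using System.Collections.Generic;
using System.Linq;

class LazyCountDemo
{
    static string[] composers = {"Pyotr Ilyich Tchaikovsky",
                                "Gabriel Fauré",
                                "Sergei Sergeyevich Prokofiev",
                                "Jean Sibelius",
                                "Ludwig van Beethoven",
                                "Benjamin Britten",
                                "Erik Satie"};

    static void Main()
    {
        int predicateCalls = 0;

        // Same query as before, but the predicate now counts how often it is invoked.
        IEnumerable<string> shortComposers = composers.Where(x =>
        {
            predicateCalls++;
            return x.Length < 15;
        });

        for (int i = 0; i < shortComposers.Count(); i++)
        {
            Console.WriteLine("Composer {0} => {1}", i, shortComposers.ElementAt(i));
        }

        // Far more than 7 – every Count() and ElementAt() call re-enumerated the query.
        Console.WriteLine("Predicate was called {0} times for {1} composers",
                          predicateCalls, composers.Length);
    }
}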

It’s easy to fix…

            int shortComposersCount = shortComposers.Count();
            for (int i = 0; i < shortComposersCount; i++)
            {
                Console.WriteLine("Composer {0} => {1}", i, shortComposers.ElementAt(i));
            }

Now the Count() part of the picture is down to around 2000 operations – one pass of roughly 1000 to count, plus the 1000 loop iterations – because the Count() method is called just once. (ElementAt() still has a cost of its own – more on that in a moment.) Note that there is a tiny overhead in creating the shortComposersCount variable, so for very small collections you may find it makes no measurable difference. If you do leave the Count() where it was, though, be aware that performance will degrade quadratically as the count increases.

.ElementAt() is inefficient
To look at the second problem, let’s fire up the .NET Reference Source again and take a look at what .ElementAt() does.

public static TSource ElementAt<TSource>(this IEnumerable<TSource> source, int index) 
{
    if (source == null) 
        throw Error.ArgumentNull("source");

    IList<TSource> list = source as IList<TSource>;
    if (list != null) 
        return list[index];

    if (index < 0) 
        throw Error.ArgumentOutOfRange("index");

    using (IEnumerator<TSource> e = source.GetEnumerator()) 
    {
        while (true) 
        {
            if (!e.MoveNext()) 
                 throw Error.ArgumentOutOfRange("index");
            if (index == 0) 
                return e.Current;
            index--;
        }
    }
}

(original here)

Having seen what Count() does, that won’t surprise you. If the IEnumerable doesn’t implement IList<T>, ElementAt() gets the enumerator and calls MoveNext() the requisite number of times to reach the specified position, then returns the current item. This is not good – fetching element i costs roughly i + 1 MoveNext() calls, so even with the Count() hoisted out of the loop the loop as a whole is still doing on the order of n²/2 operations, and the performance drop-off as the number of items increases will be horrid.

So we need a bit of a rethink – in this case simply using a foreach instead of a for fixes everything neatly.

            int i = 0;
            foreach (string shortComposer in shortComposers)
            {
                Console.WriteLine("Composer {0} => {1}", i++, shortComposer);
            }

This way the enumerator is obtained and enumerated only once – the performance improvement of this approach over the original for loop implementation will be an eyebrow-raiser.
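
If you want to put a rough number on that, something like the following Stopwatch sketch makes the difference obvious (this is my own illustration, not from the original post – the RoughTiming class, the 5,000-item sequence and the generated names are all made up, and exact timings will vary from machine to machine).

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;

class RoughTiming
{
    static void Main()
    {
        // A lazy query over a few thousand generated names – every one passes the filter.
        IEnumerable<string> shortComposers =
            Enumerable.Range(0, 5000).Select(n => "Composer " + n).Where(x => x.Length < 15);

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < shortComposers.Count(); i++)
        {
            shortComposers.ElementAt(i);        // Count() and ElementAt() re-enumerate on every pass
        }
        sw.Stop();
        Console.WriteLine("for + Count() + ElementAt(): {0} ms", sw.ElapsedMilliseconds);

        sw.Restart();
        foreach (string shortComposer in shortComposers)
        {
            // one single enumeration, start to finish
        }
        sw.Stop();
        Console.WriteLine("foreach:                     {0} ms", sw.ElapsedMilliseconds);
    }
}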

Comparing this with an array

But what if we have to use a for loop (for some other reason)?

I won’t go into the details of the implementation of the C# array but…

  • It has a Length property (not a method) that directly returns the size of the array – it doesn’t do any counting.
  • It (obviously) has an indexer that directly returns an item from the array – again, no counting involved.

So an array also solves both of our problems…

            string[] shortComposers = composers.Where(x => x.Length < 15).ToArray();            
            for (int i = 0; i < shortComposers.Length; i++)
            {
                Console.WriteLine("Composer {0} => {1}", i, shortComposers[i]);
            }

This isn’t bad, but you must be aware that the instantiation of a new object – the array – introduces some overhead, so you may find that this construct is slower than the original in some surprising circumstances. For instance, if the items are value types (structs) then the overhead of copying every element into the new array can easily be more than the cost of doing all those extra operations. Reference types (which includes string, despite it sometimes feeling like a value) are less prone to this, because only the references are copied.

It’s not safe!
Conclusion
Be aware that a lot of LINQ operations end up iterating through the sequence item by item, performing some operation or other. This fact can be hidden in innocuous-looking method calls such as ElementAt(). The number of unnecessary operations can very quickly get out of hand, especially when these calls are combined with other iterative operations.

If in doubt, use the source. If you haven’t got the source for whatever you’re using then, if it’s legal to do so where you live, crank up your favourite .NET decompiler and find out what’s going on underneath.