Why doesn’t window.something throw an error?

In this StackOverflow question, I asked why window.something doesn't throw an error.

I’m a big fan of Stack Overflow and I tend to contribute regularly (I’m currently in the top 0.X%). In this category (StackOverflow) of posts, I will be posting my top-rated questions and answers. This, btw, is allowed, as explained in the meta thread here.

As you may know, I’m really into the Ionic framework lately and am helping out on StackOverflow with the knowledge I’ve gained so far with the framework. I’m currently #3 on the all-time top answerers list.

I actually asked this question myself:

I’m using an AdMob plugin in my Ionic application, and one way, as you can see in the documentation, to test if the plugin is properly loaded is to do:

if (AdMob) {
    // other config code...
}

Now, this works perfectly fine on a device. However, it doesn’t work in the browser; it throws an error in the console log: AdMob is not defined.

I have found a solution to test the existence of the plugin like this (without it throwing an error in the console):

if (window.AdMob){...}
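
For example, a guard along these lines (just a sketch; the actual config code depends on your plugin setup) lets the same code run both on a device and in a desktop browser:

if (window.AdMob) {
    // plugin is present (e.g. running on a device with the plugin installed)
    // ...your actual AdMob config code goes here...
} else {
    // running in a desktop browser: the plugin is missing, but no error is thrown
    console.log("AdMob plugin not found, skipping ad setup");
}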

And I have seen this usage in multiple questions here on StackOverflow. However, I wasn’t able to find an explanation as to why this doesn’t throw an error.

I have a vague idea as to why this is so, but I would really appreciate it if someone experienced could explain it in more detail.

Edit: I ran some additional tests like this:

var a = "hi";
console.log(a); //shows "hi"
console.log(b); //throws an error that b is not defined

var c = {};
c.b = "hi again";
console.log(c.b); //shows "hi again" as expected

//and now for the grand finale
console.log(c.something);//doesn't throw an error, please explain to me in more detail why?

The answer by T.J. Crowder was:

I wasn’t able to find an explanation as to why this doesn’t throw an error.

In the first example, you’re trying to read the value of a completely undefined identifier. In the second example, you’re trying to read a property from an object that the object may not have.

Trying to read the value of an undefined identifier is a ReferenceError; the JavaScript engine has no idea what that identifier is. In contrast, trying to read the value of a property that the object doesn’t have yields the value undefined.

It’s just how the language is designed, and it’s where Brendan Eich drew the line: it’s okay to read the value of a non-existent property from an object, but not okay to read the value of an undeclared identifier.
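
This is also why the window.AdMob check from the question works: in a browser, global variables declared with var become properties of the window object, so window.AdMob is a property read (which may simply yield undefined) rather than a bare identifier lookup (which may throw). A quick sketch you can run in a browser console:

var a = "hi";          // a global var also becomes a property of window (in browsers)
console.log(a);        // "hi"
console.log(window.a); // "hi" as well

console.log(window.b); // undefined: reading a missing property is fine
console.log(b);        // ReferenceError: b is not defined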

I should point out a third option: typeof. You’re allowed to provide an undefined identifier as the operand to typeof:

if (typeof AdMob === "undefined")

That won’t throw a ReferenceError even if AdMob is undeclared; instead, typeof will yield "undefined". (It will also yield "undefined" if AdMob is a declared identifier with the value undefined in it.)
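
So a third, browser-safe way to write the plugin check could look like this (a sketch following the answer’s typeof suggestion):

if (typeof AdMob === "undefined") {
    // AdMob is undeclared (e.g. running in the browser): skip the plugin setup
    console.log("AdMob is not available here");
} else {
    // AdMob exists as an identifier: safe to use it here
}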

In a comment on another answer, you said:

…it would just indeed help to see the exact official specification which confirms this.

That would be the ECMAScript specification, specifically §6.2.3.1 for throwing a ReferenceError on an unresolvable symbol, and §9.1.8 for returning undefined for a property that doesn’t exist. But I should warn you that the spec, especially this 6th edition spec, is very heavy going. 🙂

Written by Nikola Brežnjak