
Understanding JavaScript immutable variables

I am trying to understand what it means for a variable to be immutable in JavaScript. If I can do:

var x = "astring";
x = "str";
console.log(x); // logs "str"

then why is it immutable?

The only answer I can think of (from the little bit of C I know) is that var x is a pointer to a memory block holding the value "astring", and after the second statement it points to another block holding the value "str". Is that the case?

And a bonus question: I was confused by the value types of JavaScript. Are all variables objects under the hood? Even numbers and strings?

Values are immutable; variables are not; they hold a reference to their (primitive) values.

The three primitive types string, number and boolean have corresponding types whose instances are objects: String, Number and Boolean.
They are sometimes called wrapper types.

The following values are primitive:

  • Strings: "hello"
  • Numbers: 6, 3.14 (all numbers in JavaScript are floating point)
  • Booleans: true, false
  • null: usually explicitly assigned
  • undefined: usually the default (automatically assigned) value

All other values are objects, including wrappers for primitives.

So:

  • Objects are mutable by default
  • Objects have unique identities and are compared by reference
  • Variables hold references to objects
  • Primitives are immutable
  • Primitives are compared by value, they don't have individual identities
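A minimal sketch of these points (the variable names are just illustrative):

```javascript
// Objects are mutable and compared by reference (identity)
var a = { n: 1 };
var b = { n: 1 };
console.log(a === b); // false: equal contents, but distinct identities

var c = a;        // c references the same object as a
c.n = 2;          // mutating through c is visible through a
console.log(a.n); // 2

// Primitives are compared by value and have no individual identity
console.log("hi" === "hi"); // true
```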

You might find The Secret Life of JavaScript Primitives a good explanation.

Also, in ES6 there is a new const keyword that creates a read-only named constant, one that cannot change value through assignment or be re-declared while the script is running.

Hope this helps!

First, in C a string is an array of characters whose last element is '\0', and C strings are mutable.
If you declare and initialize a string in C like this:

char str[] = "Foo";

What you are basically doing is reserving 4 bytes (assuming 8-bit bytes). The name str acts as a pointer to the first element of this array. So, if you do this:

str[0] = 'G';  /* or equivalently: *str = 'G'; */

then it will mutate the value at that address instead of creating a new array. You can verify this by printing out the address of str before and after: it will be the same in both cases.

Now, in JavaScript, a string is a primitive type. All operations on strings are done by value instead of by reference. So, the following comparison produces true:

var str1 = "foo";
var str2 = "foo";
str1 === str2; // true

The initialization of the string asks for a buffer to fit "foo" and binds the name str1 to it. What makes strings immutable is that you can't change that buffer. So, you can't do this:

str1[0] = 'G'

Executing this statement will produce no warning or error in non-strict mode, but it will not change str1. You can verify it with:

console.log(str1); // "foo"
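In strict mode the same assignment fails loudly instead of silently; a quick sketch:

```javascript
"use strict";

var str1 = "foo";
try {
  str1[0] = "G"; // TypeError in strict mode: string elements are read-only
} catch (e) {
  console.log(e instanceof TypeError); // true
}
console.log(str1); // "foo" (unchanged either way)
```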

But if you do this:

str1 = "goo"

what you are actually doing is asking for a new buffer to fit "goo" and binding the identifier str1 to it. There is no change to the old buffer containing "foo".
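You can see that rebinding leaves the old value untouched by keeping a second reference to it (a small sketch):

```javascript
var a = "foo";
var b = a;      // b now holds the same value "foo"
a = "goo";      // rebinds a to a new string; "foo" itself is untouched
console.log(a); // "goo"
console.log(b); // "foo"
```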

So, what happens to "foo"?

JavaScript has an automatic garbage collector. When it sees a chunk of memory that can no longer be reached through any identifier or other reference, it considers that memory free.

The same happens with numbers and booleans. Now, about wrapper objects! Whenever you try to access a property on a string like this:

str1.length;

JavaScript creates a new object using the String class and invokes the property or method on that object. As soon as the call returns, the object is destroyed. The code below explains it further:

var str = "nature";
str.does = "nurtures"; // defining a new property
console.log(str.does); // undefined

because the object has been destroyed. Try this!

var str = new String("Nature");
str.does = "nurtures";
console.log(str); // ??

this str is really an object...
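You can observe the difference between the primitive and the wrapper object directly with typeof (a minimal sketch):

```javascript
var prim = "Nature";                // primitive string
var wrapped = new String("Nature"); // wrapper object

console.log(typeof prim);    // "string"
console.log(typeof wrapped); // "object"

wrapped.does = "nurtures";   // a real object keeps new properties
console.log(wrapped.does);   // "nurtures"
```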

Conclusion: In C, within a single scope, the variable name serves as a pointer to a fixed memory location, so int, float and string values are all mutable in place. But in JavaScript, a variable holding a primitive type behaves as a value, not as a reference to a mutable buffer.

References: C++ Primer Plus; JavaScript: The Definitive Guide; C by Stephen Kochan

You are correct. Strings (and numbers) are immutable in JavaScript (and many other languages). Variables are references to them. When you "change the value of a variable" you are changing which value the variable references, not the value itself.

I think many new programmers believe immutability to mean that primitive values cannot be changed by reassignment.

var str = "testing";
var str = "testing,testing";
console.log(str); // testing,testing

var fruits = ["apple", "banana", "orange"];
fruits[0] = "mango";
console.log(fruits); // ["mango", "banana", "orange"]

The values associated with both mutable and immutable types can be changed through reassignment, as the above examples with strings and arrays show. But these data types also have associated functions (methods) that are used to manipulate their values, and this is where mutability/immutability is seen. Since arrays are mutable, any manipulation by an array method affects the array directly. For example:

var fruits = ["mango", "banana", "orange"];
fruits.pop();
console.log(fruits); // ["mango", "banana"]

The pop() method deleted "orange" from the original fruits array. But with strings, for example:

var name = "Donald Trump";
name.replace("Donald", "President");
console.log(name); // Donald Trump

the original string remains intact!

Immutability disallows any altering of the original string by a string method. Instead, the method produces a new string, which you can capture by assigning the result to a variable like so:

var name = "Donald Trump";
var newName = name.replace("Donald", "President");
console.log(newName); // President Trump
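The same contrast exists among the array methods themselves: some mutate the array in place, while others leave it alone and return a new one (a small sketch):

```javascript
var fruits = ["apple", "banana"];

// push() mutates the original array in place
fruits.push("orange");
console.log(fruits); // ["apple", "banana", "orange"]

// concat() returns a new array and leaves the original untouched
var more = fruits.concat(["mango"]);
console.log(fruits.length); // 3 (unchanged)
console.log(more.length);   // 4
```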

Let's look at an example first:

let firstString = "Tap";
console.log(firstString); // Output: Tap

firstString[0] = "N";
console.log(firstString); // Output: Tap

This is where we can see the effect of immutability!

Immutability in this definition is historic. It's attached to what could be done in OTHER programming languages.

I think that is the first thing to understand. And to programmers who have only used JavaScript the question may seem nonsensical or needlessly pedantic. Describing primitives as immutable is like describing ice cream as not being able to jump. Why would I think it could? It is only in relation to other historic programming languages that the lack of mutability is apparent when dealing with primitive types.
