
Upgrading Node.js on Ubuntu 14.04

My machine had Node.js 5.x installed, and I had a lot of trouble updating it to 8.x. Below are the steps I followed to upgrade Node.js from 5.x to 8.x.

# Add the NodeSource apt key and source list
sudo apt-key adv --keyserver keyserver.ubuntu.com --recv 68576280
sudo apt-add-repository "deb https://deb.nodesource.com/node_8.x $(lsb_release -sc) main"

sudo apt-get update
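
As an alternative to adding the key and source list by hand, NodeSource also provides a setup script that does both in one go. Assuming curl is installed, something like this should be equivalent to the two commands above:

# Alternative: let NodeSource's setup script add the key and source list
curl -sL https://deb.nodesource.com/setup_8.x | sudo -E bash -
sudo apt-get update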

# Remove the previous installation

sudo apt-get purge nodejs npm
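
Before installing the new package, it may be worth confirming that the old binaries are actually gone (this only checks what apt managed; a Node.js built from source under /usr/local would need to be removed separately):

# Optional: check that no old node/npm binaries are left on the PATH
command -v node || echo "node removed"
command -v npm  || echo "npm removed"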
# Verify that the correct version is going to be installed
apt-cache policy nodejs

# Install the new version

sudo apt-get install -y nodejs
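
Once the install finishes, check that the new versions are picked up. The NodeSource nodejs 8.x package bundles npm, so no separate npm package is needed:

# Verify the upgrade
node -v
npm -v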
