Comments on "[Solved] Problem: Installing Hadoop on Ubuntu (Linux) - single node - Problems you may face"

karteek (2013-10-20):
After running the start script, I see only the DataNode and SecondaryNameNode running; the rest of the services are not. When I look at the logs, I get the issue below. Please help!

2013-10-20 16:22:41,193 WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already exists!
2013-10-20 16:22:41,260 ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker because java.lang.IllegalArgumentException: Does not contain a valid host:port authority: local
    at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
    at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:130)
    at org.apache.hadoop.mapred.JobTracker.getAddress(JobTracker.java:2131)
    at org.apache.hadoop.mapred.TaskTracker.(TaskTracker.java:1540)
    at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3937)
2013-10-20 16:22:41,261 INFO org.apache.hadoop.mapred.TaskTracker: SHUTDOWN_MSG: Shutting down TaskTracker at ubuntu.ubuntu-domain/127.0.1.1

Anonymous (2013-10-12):
The program 'jps' can be found in the following packages:
 * openjdk-7-jdk
 * openjdk-6-jdk
Try: sudo apt-get install

Please help me out.
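On the TaskTracker error above: "Does not contain a valid host:port authority: local" usually means `mapred.job.tracker` was never configured, so it defaults to the value "local", which the TaskTracker cannot parse as a host:port pair. A minimal sketch of the fix follows; it writes to a throwaway /tmp path for demonstration, while the real file is `conf/mapred-site.xml` under your Hadoop directory, and 54311 is merely the conventional port:

```shell
# Sketch: give mapred.job.tracker a real host:port so the TaskTracker can start.
# The /tmp path is illustrative; the real file is $HADOOP_HOME/conf/mapred-site.xml.
CONF="/tmp/mapred-site-demo.xml"
cat > "$CONF" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:54311</value>
  </property>
</configuration>
EOF
# Sanity check: the value is now a host:port pair, not "local".
grep 'localhost:54311' "$CONF"
```

After editing the real file, restart the daemons with stop-all.sh and start-all.sh so the TaskTracker rereads the configuration.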
Anishjeet (2013-09-22):
Hi, thanks for sharing this. I'm facing a problem: when I try to execute /usr/local/hadoop/bin/start-all.sh it says the Java path is not configured, but when I change the directory name to jdk1.7.0 everything works fine - except that ant then doesn't execute properly, and vice versa. Please help. Thanks.

shashi (2013-09-15):
Hello Yahia Zakaria,
This is the error I get when I run a MapReduce application:

Exception in thread "main" java.lang.UnsupportedClassVersionError: WordCount100cls : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:266)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:149)

Background: a jar that came with hadoop-1.0.3 runs fine, but when I compile the code, create a jar, and run it, it fails. I have the same Java version on Hadoop and Windows (I compile and run with the same JDK 1.7.25). Using hadoop-1.0.3 on Ubuntu 12.10. Thanks to anyone who reads this, and thanks in advance for helping.
Shashi
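On the error above: "Unsupported major.minor version 51.0" means the class file contains Java 7 bytecode (major version 51) but the JVM Hadoop launches with is older, so despite compiling and running "with the same JDK", Hadoop itself is picking up a different Java. The fix is either to compile with `-source 1.6 -target 1.6` or to point JAVA_HOME in hadoop-env.sh at the same JDK 7 used for compiling. A small sketch of the version mapping (the function name is ours, for illustration):

```shell
# Map a class-file major version to the JDK release that produces it.
class_major_to_jdk() {
  case "$1" in
    49) echo "Java 5" ;;
    50) echo "Java 6" ;;
    51) echo "Java 7" ;;
    52) echo "Java 8" ;;
    *)  echo "unknown" ;;
  esac
}
class_major_to_jdk 51   # the 51.0 in the stack trace is Java 7 bytecode
# Compile-time fix for an older runtime, e.g.:
#   javac -source 1.6 -target 1.6 WordCount.java
```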
Umaima Kaderi (2013-09-10):
Hi Yahia,
I am installing hadoop-0.20.2-cdh3u4 on Ubuntu 12.04 and have created a user "hadoop" for it. When I try to run start-all.sh it gives me this error:

hadoop@umaima-Dell-500:~$ start-all.sh
bash: /home/hadoop/Desktop/Cloudera/hadoop-0.20.2-cdh3u4/bin/start-all.sh: Permission denied

I even tried chown:

hadoop@umaima-Dell-500:~$ sudo chown -R hadoop:hadoop /home/hadoop
chown: cannot access `/home/hadoop/.gvfs': Permission denied

Can you please help me solve this?
Regards, Umaima B
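Two notes on the report above, offered as a sketch. The "Permission denied" on start-all.sh usually means the scripts lost their execute bit (common after extracting or copying the tarball as another user), and the `.gvfs` failure is harmless: `~/.gvfs` is a FUSE mountpoint that even root cannot access while it is mounted, so that one chown error can be ignored. A throwaway demonstration of the execute-bit fix:

```shell
# Demo: a script without the execute bit fails exactly like start-all.sh did.
DEMO="/tmp/start-all-demo.sh"
printf '#!/bin/sh\necho scripts are executable again\n' > "$DEMO"
chmod -x "$DEMO"                         # reproduce the broken state
"$DEMO" 2>/dev/null || echo "Permission denied (as reported)"
chmod +x "$DEMO"                         # the fix: restore the execute bit
"$DEMO"
# On the real install the equivalent would be something like:
#   chmod +x /home/hadoop/Desktop/Cloudera/hadoop-0.20.2-cdh3u4/bin/*.sh
```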
madhav (2013-09-07):
Thanks, all. I have re-configured Hadoop and it is working fine now.

madhav (2013-09-05):
Hi Yahia,
I get the exception below when I start Hadoop:

hduser@Ishaanth:~$ start-all.sh
Warning: $HADOOP_HOME is deprecated.
mkdir: cannot create directory `/var/run/hadoop': Permission denied
starting namenode, logging to /var/log/hadoop/hduser/hadoop-hduser-namenode-Ishaanth.out
/usr/sbin/hadoop-daemon.sh: line 138: /var/run/hadoop/hadoop-hduser-namenode.pid: No such file or directory
localhost: mkdir: cannot create directory `/var/run/hadoop': Permission denied
localhost: starting datanode, logging to /var/log/hadoop/hduser/hadoop-hduser-datanode-Ishaanth.out
localhost: /usr/sbin/hadoop-daemon.sh: line 138: /var/run/hadoop/hadoop-hduser-datanode.pid: No such file or directory
localhost: mkdir: cannot create directory `/var/run/hadoop': Permission denied
localhost: starting secondarynamenode, logging to /var/log/hadoop/hduser/hadoop-hduser-secondarynamenode-Ishaanth.out
localhost: /usr/sbin/hadoop-daemon.sh: line 138: /var/run/hadoop/hadoop-hduser-secondarynamenode.pid: No such file or directory
mkdir: cannot create directory `/var/run/hadoop': Permission denied
starting jobtracker, logging to /var/log/hadoop/hduser/hadoop-hduser-jobtracker-Ishaanth.out
/usr/sbin/hadoop-daemon.sh: line 138: /var/run/hadoop/hadoop-hduser-jobtracker.pid: No such file or directory
localhost: mkdir: cannot create directory `/var/run/hadoop': Permission denied
localhost: starting tasktracker, logging to /var/log/hadoop/hduser/hadoop-hduser-tasktracker-Ishaanth.out
localhost: /usr/sbin/hadoop-daemon.sh: line 138: /var/run/hadoop/hadoop-hduser-tasktracker.pid: No such file or directory

I tried chmod:
chmod 755 /home/hduser/
chmod -r 7777 /home/hduser/hadoop
but it is still not working.
Regards, Madhavan.TR
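The mkdir failures above come from hadoop-daemon.sh trying to create /var/run/hadoop, which only root may do; chmod on the home directory cannot help with that. One fix, sketched below with an illustrative /tmp path, is to point HADOOP_PID_DIR at a directory the hadoop user can write, normally by setting it in conf/hadoop-env.sh (the alternative is to pre-create /var/run/hadoop as root and chown it to hduser):

```shell
# Use a user-writable pid directory instead of /var/run/hadoop.
PID_DIR="/tmp/hadoop-pids-demo"     # a real choice might be /home/hduser/hadoop/pids
mkdir -p "$PID_DIR"
export HADOOP_PID_DIR="$PID_DIR"    # put this line in conf/hadoop-env.sh
# Verify before starting the daemons: pid files will land here.
[ -w "$HADOOP_PID_DIR" ] && echo "pid dir writable: $HADOOP_PID_DIR"
```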
Anonymous (2013-08-21):
I have installed hadoop 1.1.2 on Ubuntu 12.04. When I run start-all.sh:

hduser@debashis-desktop:/usr/local/hadoop/bin$ start-all.sh
Warning: $HADOOP_HOME is deprecated.
starting namenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-namenode-debashis-desktop.out
localhost: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-datanode-debashis-desktop.out
localhost: starting secondarynamenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-secondarynamenode-debashis-desktop.out
starting jobtracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-jobtracker-debashis-desktop.out
localhost: starting tasktracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-tasktracker-debashis-desktop.out
hduser@debashis-desktop:/usr/local/hadoop/bin$ jps
7310 Jps

jps does not show the NameNode, JobTracker, or TaskTracker, although it did a few days ago, and I have not made any configuration changes since then. When I run stop-all.sh I get:

hduser@debashis-desktop:/usr/local/hadoop/bin$ stop-all.sh
Warning: $HADOOP_HOME is deprecated.
no jobtracker to stop
localhost: no tasktracker to stop
no namenode to stop
localhost: no datanode to stop
localhost: no secondarynamenode to stop

Any solutions, please?

Altan Koray Aydemir (2013-08-14):
I have the same problem; can somebody help? I created a root user with "sudo passwd root", and even so I get an error that says: "Cannot open display: Run 'gedit --help' to see a full list of available command line options."

kaushal (2013-08-05):
"/slave directory not found" when I run start-all.sh. What should I do? I'm trying to install hadoop-1.1.2 on Ubuntu 10.10 Maverick. Please help me.
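When jps shows only "Jps", the daemons started and then exited immediately; the ".out" files that start-all.sh mentions rarely say why, but the matching ".log" files in the logs directory almost always contain a FATAL or ERROR line naming the cause (on 1.x, a common one is HDFS storage that needs re-formatting or a permissions change under hadoop.tmp.dir). A sketch of scanning for that line; the demo writes a fake log into an illustrative /tmp directory so the pipeline runs anywhere, while a real install would use $HADOOP_HOME/logs (e.g. /usr/local/hadoop/logs):

```shell
# Scan daemon logs for the first FATAL/ERROR line - usually the real reason
# a daemon is missing from jps output.
LOG_DIR="/tmp/hadoop-logs-demo"
mkdir -p "$LOG_DIR"
printf '2013-08-21 00:30:00 FATAL org.example.NameNode: demo failure line\n' \
  > "$LOG_DIR/hadoop-demo-namenode.log"
grep -h -m1 -E 'FATAL|ERROR' "$LOG_DIR"/*.log
```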
probablyHD (2013-08-02):
Hi Yahia,
My single node works properly, but on my multi-node cluster the master node does not: two daemons (TaskTracker and DataNode) are not starting up. I've checked the log files and they show an IOException. The slave node is working fine with its three daemons. Help would be appreciated.
Thanks, Hardik

Anonymous (2013-07-31):
Hi,
$ sudo gedit /home/hduser/hadoop/conf/hadoop-env.sh
export JAVA_HOME=/usr/lib/jdk1.7.0/

After applying the step above, formatting the namenode still gives me the Java error again and again:

hduser@ash-virtual-machine:/$ sudo -u hdfs hdfs namenode -format
Error: JAVA_HOME is not set and could not be found.

And running ssh gives the following error:

hduser@ash-virtual-machine:/$ ssh localhost
The authenticity of host 'localhost (127.0.0.1)' can't be established.
ECDSA key fingerprint is ae:e9:38:d5:c4:96:0e:64:14:28:f3:d9:65:4f:aa:c0.
Are you sure you want to continue connecting (yes/no)?
Host key verification failed.

Please help.
Thanks, Ashwini
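There are two separate problems in the report above. First, `sudo -u hdfs` strips the caller's environment, so JAVA_HOME has to be set inside conf/hadoop-env.sh rather than the interactive shell, and the path must actually exist (on Ubuntu, JDKs normally live under /usr/lib/jvm/, not /usr/lib/). Second, the ssh failure means localhost's host key was never accepted, which also blocks the start scripts. A sketch, with illustrative file paths; the ssh-keyscan line is commented out because it modifies real ssh configuration:

```shell
# (1) Set JAVA_HOME in hadoop-env.sh, not in the interactive shell.
ENV_FILE="/tmp/hadoop-env-demo.sh"      # stands in for conf/hadoop-env.sh
echo 'export JAVA_HOME=/usr/lib/jvm/jdk1.7.0' > "$ENV_FILE"   # adjust to your JDK path
. "$ENV_FILE"
echo "JAVA_HOME=$JAVA_HOME"
# (2) Pre-accept the localhost host key so ssh stops prompting:
#   ssh-keyscan -t ecdsa localhost >> ~/.ssh/known_hosts
```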
pankaj (2013-07-30):
Hi Yahia,
Thanks for your support; I am now done with the Hadoop installation. What is the next step? Could you please help me with a sample project so I can get some practice?

Anonymous (2013-07-25):
It seems the core-site.xml file is not well-formed. Can you please open that file, go to line 8, and check the problem? Please post back if you solve it.
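On the "what is the next step" question: the usual first exercise is the WordCount example that ships with Hadoop. The pipeline below computes the same word-count result in plain shell, so you can see what the job should produce; the commented line shows the corresponding cluster invocation, with the jar name varying by Hadoop version, so treat it as a sketch:

```shell
# Local equivalent of WordCount: split lines into words, count duplicates.
printf 'hello world\nhello hadoop\n' > /tmp/wc-demo.txt
tr -s ' ' '\n' < /tmp/wc-demo.txt | sort | uniq -c | sort -rn
# On the cluster, after putting some input into HDFS, the equivalent is roughly:
#   hadoop jar $HADOOP_HOME/hadoop-examples-*.jar wordcount input output
```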
pankaj (2013-07-23):
Hi Yahia,
Here are the logs from all the daemons.

datanode:
2013-07-22 08:57:09,895 FATAL org.apache.hadoop.conf.Configuration: error parsing conf file: org.xml.sax.SAXParseException; systemId: file:/home/hduser/hadoop/conf/core-site.xml; lineNumber: 8; columnNumber: 2; The markup in the document following the root element must be well-formed.
2013-07-22 08:57:09,923 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.lang.RuntimeException caused by the same SAXParseException.

The jobtracker (08:57:27 and 09:28:03), namenode (08:57:07), tasktracker (09:28:07), and secondarynamenode (08:57:25) logs all show the same FATAL error: the SAXParseException for core-site.xml, line 8, column 2, "The markup in the document following the root element must be well-formed." The namenode and tasktracker additionally fail with a java.lang.RuntimeException wrapping it.

Anonymous (2013-07-23):
Can you please send the logs you have?
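The SAXParseException above pins the problem precisely: core-site.xml has extra markup after the closing `</configuration>` at line 8, column 2 (typically a duplicated tag left over from editing). A minimal well-formed core-site.xml for Hadoop 1.x looks like the sketch below, with illustrative values; the python3 line is one way to verify well-formedness without starting Hadoop, assuming python3 is available:

```shell
CONF="/tmp/core-site-demo.xml"
cat > "$CONF" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:54310</value>
  </property>
</configuration>
EOF
# Parse it as Hadoop's Configuration class would; any stray markup after
# </configuration> raises the same "following the root element" error.
python3 -c "import xml.dom.minidom; xml.dom.minidom.parse('$CONF'); print('well-formed')"
```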
pankaj (2013-07-21):
Hi Yahia,
I have now fixed the problem and run ./start-all.sh successfully. But when I run jps:
hduser@ubuntu:~/hadoop/bin$ jps
9968 Jps
only the line above appears; it does not show NameNode, JobTracker, TaskTracker, etc. like your last screenshot.

pankaj (2013-07-21):
Yes, I applied the chmod command:
hduser@ubuntu:~$ chmod 755 /home/hduser/
and then tried to access the location below, but I cannot:
hduser@ubuntu:~$ cd /home/hduser/hadoop
bash: cd: /home/hduser/hadoop: Permission denied
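The exchange above fails because `chmod 755 /home/hduser` changes only that one directory: the hadoop subdirectory keeps its old restrictive mode, and a directory needs the execute bit before anyone can cd into it. A sketch on a throwaway tree (the real path would be /home/hduser):

```shell
BASE="/tmp/hduser-demo"
mkdir -p "$BASE/hadoop"
chmod 600 "$BASE/hadoop"            # directory without x: cd fails
( cd "$BASE/hadoop" 2>/dev/null ) || echo "Permission denied (as reported)"
chmod 755 "$BASE"                   # what was tried: parent only, hadoop/ still blocked
chmod -R u+rwX,go+rX "$BASE"        # the fix: recursive; X adds execute only on directories
[ -x "$BASE/hadoop" ] && echo "can cd into $BASE/hadoop now"
```

The capital `X` is the useful detail: unlike `chmod -R 755`, it grants execute on directories (so they can be entered) without marking every data file executable.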
Anonymous (2013-07-21):
Hi Pankaj,
There is no need to make hduser root or a sudoer; hduser has full permissions on its home folder. Did you execute the chmod command?

pankaj (2013-07-21):
Hi Yahia,
Yes, I logged in as hduser but it is still not working. This time I tried to format again and saw this error:
$ /home/hduser/hadoop/bin/hadoop namenode -format
bash: /home/hduser/hadoop/bin/hadoop: Permission denied
And if I try:
sudo /home/hduser/hadoop/bin/hadoop namenode -format
it shows: "hduser is not in the sudoers file. This incident will be reported."
Does hduser need sudoer capability, and if so, how can I make it a sudoer?
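On the last question: as the reply above says, hduser should not need sudo to format the namenode; the "Permission denied" on the hadoop binary is the same missing execute bit as before, fixable by hduser itself once it owns the tree. If sudo access is genuinely wanted anyway, membership of Ubuntu's sudo group grants it. A sketch; the privileged lines are commented out because they require root:

```shell
# Confirm which user you are before blaming sudoers:
id -un
# Restore execute bits on the hadoop tree (run as hduser, no sudo needed
# once hduser owns it):
#   chmod -R u+rwX /home/hduser/hadoop
# Only if sudo is really required - run this from an account that already has it:
#   sudo usermod -aG sudo hduser   # Ubuntu 12.04+ uses group "sudo"; older releases used "admin"
```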