The following Java program is an example of how you can programmatically read a file in HDFS using the HDFS APIs bundled with Hadoop.
1. Open the file Cat.java and paste the following code
2. Compile the code
3. Create jar
4. Run
1. Open the file Cat.java and paste the following code
package org.myorg;

import java.io.BufferedReader;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Cat {
    public static void main(String[] args) throws Exception {
        // Fully qualified HDFS path to the file we want to read.
        Path pt = new Path("hdfs://jp.seka.com:9000/user/john/abc.txt");
        // Obtain a FileSystem for that URI; core-site.xml/hdfs-site.xml on the
        // classpath supply the remaining configuration.
        FileSystem fs = FileSystem.get(pt.toUri(), new Configuration());
        BufferedReader br = new BufferedReader(new InputStreamReader(fs.open(pt)));
        try {
            // Print the file line by line to standard output.
            String line = br.readLine();
            while (line != null) {
                System.out.println(line);
                line = br.readLine();
            }
        } finally {
            br.close();
        }
    }
}
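If you only need to dump the file to standard output, Hadoop's own IOUtils class can replace the BufferedReader loop. The variant below is a minimal sketch of that approach; the class name Cat2 and the 4 KB buffer size are arbitrary choices of this sketch, and it assumes the same host and file path as above.

package org.myorg;

import java.io.InputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class Cat2 {
    public static void main(String[] args) throws Exception {
        Path pt = new Path("hdfs://jp.seka.com:9000/user/john/abc.txt");
        FileSystem fs = FileSystem.get(pt.toUri(), new Configuration());
        InputStream in = null;
        try {
            in = fs.open(pt);
            // Stream the file to stdout in 4 KB chunks; 'false' leaves the
            // stream open so we can close it ourselves in the finally block.
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}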
2. Compile the code
mkdir Cat
javac -classpath hadoop-0.20.1-dev-core.jar -d Cat/ Cat.java
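The jar name above matches Hadoop 0.20.1. Other releases name the core jar differently (and Hadoop 2+ splits it across several jars), so on newer installations a safer compile command, assuming the hadoop command is on your PATH, is javac -classpath $(hadoop classpath) -d Cat/ Cat.java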
3. Create jar
jar -cvf Cat.jar -C Cat/ .
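The -C Cat/ . arguments pack the compiled org/myorg/Cat.class tree from the Cat/ output directory into Cat.jar. No main-class entry in the manifest is needed, because the class is named explicitly when the jar is run in the next step.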
4. Run
hadoop jar Cat.jar org.myorg.Cat
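The program prints the contents of /user/john/abc.txt to standard output, so the result should match what hadoop fs -cat /user/john/abc.txt shows for the same file.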