Why are the field names of the ORC file I read back in lowercase?

The code that writes the ORC file is:

import java.util.List;
import java.util.Properties;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat;
import org.apache.hadoop.hive.ql.io.orc.OrcSerde;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.OutputFormat;
import org.apache.hadoop.mapred.RecordWriter;
import org.apache.hadoop.mapred.Reporter;

import com.alibaba.fastjson.JSON;

public static <T> void write(Class<T> cls, List<T> datas, Properties props) throws Exception {
    String path = props.getProperty("localTempFile");

    JobConf conf = new JobConf();
    FileSystem fs = FileSystem.get(conf);
    Path outputPath = new Path(path);

    // Remove any previous output so the writer can create a fresh file.
    fs.delete(outputPath, true);

    OrcSerde serde = new OrcSerde();

    // Build a struct inspector from the Java class via reflection.
    StructObjectInspector inspector =
            (StructObjectInspector) ObjectInspectorFactory
                    .getReflectionObjectInspector(cls,
                            ObjectInspectorFactory.ObjectInspectorOptions.JAVA);
    OutputFormat outFormat = new OrcOutputFormat();

    RecordWriter writer = outFormat.getRecordWriter(fs, conf, outputPath.toString(),
            Reporter.NULL);

    for (int i = 0; i < datas.size(); i++) {
        T data = datas.get(i);
        System.out.println(String.format("data:%s, JSON:%s", data, JSON.toJSONString(data)));
        writer.write(NullWritable.get(), serde.serialize(data, inspector));
    }
    writer.close(Reporter.NULL);

    // Drop the checksum file the local FileSystem creates alongside the output.
    System.out.println(fs.deleteOnExit(new Path("." + path + ".crc")));
    fs.close();
    System.out.println("write success.");
}

The test class is:

static class Student {
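    // Field names are intentionally upper-case here; the question is why
    // they come back lower-case when the file is read.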
    private String NAME;
    private int AGE;

    public Student() {
    }

    public String getNAME() {
        return NAME;
    }

    public void setNAME(String NAME) {
        this.NAME = NAME;
    }

    public int getAGE() {
        return AGE;
    }

    public void setAGE(int AGE) {
        this.AGE = AGE;
    }
}
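For reference, this is roughly how the two methods are invoked (the property key localTempFile matches what write() reads; the output path and sample values are illustrative):

public static void main(String[] args) throws Exception {
    Properties props = new Properties();
    // write() reads the output path from this property; the path is illustrative.
    props.setProperty("localTempFile", "/tmp/students.orc");

    Student student = new Student();
    student.setNAME("Tom");  // stored via upper-case field NAME
    student.setAGE(20);      // stored via upper-case field AGE

    write(Student.class, java.util.Collections.singletonList(student), props);

    // Reading the file back prints FieldName "name" and "age" in lower case.
    read("/tmp/students.orc");
}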

The code that reads the ORC file is:

import java.io.IOException;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hive.ql.io.orc.OrcFile;
import org.apache.hadoop.hive.ql.io.orc.Reader;
import org.apache.hadoop.hive.ql.io.orc.RecordReader;
import org.apache.hadoop.hive.serde2.objectinspector.StructField;
import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;

public static void read(String path) throws IOException {
    Configuration conf = new Configuration();
    conf.set("mapreduce.framework.name", "local");
    conf.set("fs.defaultFS", "file:///");

    Reader reader = OrcFile.createReader(
            new Path(path),
            OrcFile.readerOptions(conf));

    RecordReader records = reader.rows();

    Object row = null;

    StructObjectInspector inspector
            = (StructObjectInspector) reader.getObjectInspector();

    // Print the schema as the reader sees it.
    List<? extends StructField> fields = inspector.getAllStructFieldRefs();

    for (int i = 0; i < fields.size(); i++) {
        StructField field = fields.get(i);
        System.out.println("FieldName:" + field.getFieldName() + "\t");
        System.out.println("FieldID:" + field.getFieldID() + "\t");
        System.out.println("FieldComment:" + field.getFieldComment() + "\t");
        System.out.println("TypeName:" + field.getFieldObjectInspector().getTypeName() + "\t");
        System.out.println("name:" + field.getFieldObjectInspector().getCategory().name() + "\t");
        System.out.println("ordinal:" + field.getFieldObjectInspector().getCategory().ordinal() + "\t");
        System.out.println("\n");
    }

    while (records.hasNext()) {
        row = records.next(row);
        List<Object> value_lst = inspector.getStructFieldsDataAsList(row);
        StringBuilder builder = new StringBuilder();
        // Iterate over the fields; a field can be null if a null was passed
        // as the input value when this file was written.
        for (Object field : value_lst) {
            if (field != null) {
                builder.append(field.toString());
            }
            builder.append("\t");
        }
        // This writes out the row as if it were a tab-separated text file.
        System.out.println(builder.toString() + "\n");
    }
}
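To narrow down where the lower-casing happens, the reflection inspector used by write() can be checked on its own, before anything is written. A minimal sketch, assuming the same Hive serde2 classes as above; if the names already come out lower-case here, the behavior comes from building the inspector by reflection, not from the ORC reader:

StructObjectInspector inspector =
        (StructObjectInspector) ObjectInspectorFactory.getReflectionObjectInspector(
                Student.class, ObjectInspectorFactory.ObjectInspectorOptions.JAVA);

for (StructField field : inspector.getAllStructFieldRefs()) {
    // If this already prints "name" and "age", the field names were
    // lower-cased before the file was ever written.
    System.out.println(field.getFieldName());
}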
