
Hadoop Learning -- Mapper and Reducer -- Day 08

Published: 2020-06-12 17:11:09 · Author: zhicx · Category: Big Data

This walkthrough builds the classic NCDC max-temperature job in three pieces: a Mapper that parses fixed-width weather records, a Reducer that keeps the per-year maximum, and a driver that wires them together.

The Mapper class, extending Mapper and overriding map():

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MyMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    // sentinel value the NCDC data uses for a missing temperature reading
    private static final int MISSING = 9999;

    // called once per input record (one line of the fixed-width NCDC file)
    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String line = value.toString();

        // the four-digit year lives at offsets 15-18
        String year = line.substring(15, 19);

        // the signed air temperature (tenths of a degree Celsius) starts at offset 87;
        // Integer.parseInt() accepts a leading '-' but not a leading '+', so strip the plus
        int airTemperature;
        if (line.charAt(87) == '+') {
            airTemperature = Integer.parseInt(line.substring(88, 92));
        } else {
            airTemperature = Integer.parseInt(line.substring(87, 92));
        }

        // offset 92 is the quality code; emit only readings that are present and pass it
        String quality = line.substring(92, 93);
        if (airTemperature != MISSING && quality.matches("[01459]")) {
            context.write(new Text(year), new IntWritable(airTemperature));
        }
    }
}
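To see what the fixed-width parsing does, here is a minimal standalone sketch (not from the original post; the record is synthetic, padded so the year sits at offset 15, the signed temperature at offset 87, and the quality code at offset 92):

public class MapperLogicDemo {
    // helper: a run of n filler characters standing in for fields the mapper ignores
    private static String filler(int n) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) sb.append('0');
        return sb.toString();
    }

    public static void main(String[] args) {
        // synthetic NCDC-style record: offsets 15-18 = year, 87-91 = signed temp, 92 = quality
        String line = filler(15) + "1950" + filler(68) + "+0022" + "1";

        String year = line.substring(15, 19);                  // "1950"
        int airTemperature = (line.charAt(87) == '+')
                ? Integer.parseInt(line.substring(88, 92))     // skip the '+'
                : Integer.parseInt(line.substring(87, 92));    // keep the '-'
        String quality = line.substring(92, 93);

        System.out.println(year + "\t" + airTemperature + "\t(quality " + quality + ")");
        // prints: 1950	22	(quality 1)
    }
}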

The Reducer class, extending Reducer and overriding reduce():

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class MyReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

    // called once per key (year), with every temperature the mappers emitted for it
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        // track the highest temperature seen for this year
        int maxValue = Integer.MIN_VALUE;
        for (IntWritable value : values) {
            maxValue = Math.max(maxValue, value.get());
        }
        // emit one (year, max temperature) pair
        context.write(key, new IntWritable(maxValue));
    }
}
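After the shuffle, the framework hands reduce() every value emitted under one key. A minimal standalone sketch (synthetic numbers, not from the post) of the same max-tracking loop over one grouped key:

import java.util.Arrays;
import java.util.List;

public class ReducerLogicDemo {
    public static void main(String[] args) {
        // hypothetical grouped input for key "1950", as the shuffle would deliver it
        List<Integer> temps = Arrays.asList(0, 22, -11);
        int maxValue = Integer.MIN_VALUE;
        for (int t : temps) {
            maxValue = Math.max(maxValue, t);
        }
        System.out.println("1950\t" + maxValue); // prints: 1950	22
    }
}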


The driver (main method):

import mapper類實現.MyMapper;
import reducer類實現.MyReducer;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MyMapperApp {

    public static void main(String[] args) throws Exception {
        // create the job
        Job job = Job.getInstance();
        // locate the job jar from the class it contains
        job.setJarByClass(MyMapper.class);
        // job name shown in logs and the web UI
        job.setJobName("Max temperature");
        // input: the raw NCDC files; output: a directory that must not already exist
        FileInputFormat.addInputPath(job, new Path("file:///mnt/hgfs/test-ncdc-data"));
        FileOutputFormat.setOutputPath(job, new Path("file:///home/hadoop/mr/"));
        // wire up the mapper and reducer
        job.setMapperClass(MyMapper.class);
        job.setReducerClass(MyReducer.class);
        // output key/value types (also used for the map output here)
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        // run the job and exit with 0 on success, 1 on failure
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
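One common refinement, not shown in the original code: because taking a maximum is associative and commutative, the same MyReducer class can also run as a combiner on each map task's local output, shrinking the data shuffled across the network. A sketch of the one extra driver line, to be added in main() before waitForCompletion():

// optional: run the reduce logic locally on each map task's output
job.setCombinerClass(MyReducer.class);

Once packaged into a jar, the job can be launched with, e.g., hadoop jar maxtemp.jar MyMapperApp (the jar name here is hypothetical).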

