CSci415 Assignment 3
Due: Nov 7th, 11:59pm. You will also have to give a presentation/demo in class.
In this programming assignment, you will practice parallel programming. You are encouraged
to form a group of three, although individual work is allowed. Late submissions will not be accepted.
Objectives
The goal of this programming assignment is to enable you to gain experience in:
1. Basic features of Hadoop distributed file system and MapReduce
2. Writing a program for finding common friends between all pairs of nodes in a large Network.
Language:
You are required to use Java as the implementation language.
Final Submission and Demo:
1. Submit a report + code to Blackboard.
You should hand in (1) a report which presents your implementation, (2) a README file
explaining how to run your program, and of course (3) your source code and the
executable files.
2. You need to demonstrate your working implementation to the class on Demo Day.
The grading will be based on all of these parts.
Total Marks: 100
---------------------------------------------------------------------------
Description
In large-scale social networks, there are normally tens of millions of users. The task of this
assignment is to implement a MapReduce program to identify common friends among all pairs
of users. Let U be a set of all users: {U1, U2, ..., Un}. Then the goal is to find common friends
for every pair of (Ui, Uj) where i ≠ j.
The input files have the following format:
A B C D E
B A C D
C A B
D A B E
E A D
where the first token in the line is the user and the remaining tokens in the line are the friends.
So, for line 1, A has four friends: B, C, D, and E. For example, A and B have C and D as their
common friends. Also, A and E have only D as their common friend.
Use the social network profiles attached with this assignment as input to your
program. The network has been partitioned into three files: file01, file02, file03.
The output will be stored in a file in the output directory.
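The standard MapReduce approach to this problem is: for each user U with friend list F, the mapper emits the sorted pair (U, V) as the key and F as the value, once for every friend V in F; the reducer then receives both users' friend lists under the same pair key and intersects them. Below is a minimal plain-Java sketch of that map/reduce logic on the sample network above, without the Hadoop classes, just to illustrate the algorithm; it is not the required Hadoop implementation, and the method names are my own.

```java
import java.util.*;

public class CommonFriends {
    // "Map" phase: for each user U with friend set F, emit the sorted pair
    // key "min,max" for every friend V in F, with F as the value. The
    // TreeMap here stands in for Hadoop's shuffle-and-group step.
    static Map<String, List<Set<String>>> mapPhase(Map<String, Set<String>> network) {
        Map<String, List<Set<String>>> grouped = new TreeMap<>();
        for (Map.Entry<String, Set<String>> e : network.entrySet()) {
            String u = e.getKey();
            for (String v : e.getValue()) {
                String key = u.compareTo(v) < 0 ? u + "," + v : v + "," + u;
                grouped.computeIfAbsent(key, k -> new ArrayList<>()).add(e.getValue());
            }
        }
        return grouped;
    }

    // "Reduce" phase: each pair key received two friend sets (one from each
    // endpoint); their intersection is exactly the common friends. The two
    // users themselves drop out automatically, since no user lists itself.
    static Map<String, Set<String>> reducePhase(Map<String, List<Set<String>>> grouped) {
        Map<String, Set<String>> result = new TreeMap<>();
        for (Map.Entry<String, List<Set<String>>> e : grouped.entrySet()) {
            Set<String> common = new TreeSet<>(e.getValue().get(0));
            for (Set<String> s : e.getValue()) common.retainAll(s);
            result.put(e.getKey(), common);
        }
        return result;
    }

    public static void main(String[] args) {
        // The sample network from the assignment's input format.
        Map<String, Set<String>> network = new LinkedHashMap<>();
        network.put("A", new TreeSet<>(Arrays.asList("B", "C", "D", "E")));
        network.put("B", new TreeSet<>(Arrays.asList("A", "C", "D")));
        network.put("C", new TreeSet<>(Arrays.asList("A", "B")));
        network.put("D", new TreeSet<>(Arrays.asList("A", "B", "E")));
        network.put("E", new TreeSet<>(Arrays.asList("A", "D")));
        Map<String, Set<String>> common = reducePhase(mapPhase(network));
        for (Map.Entry<String, Set<String>> e : common.entrySet())
            System.out.println(e.getKey() + " -> " + e.getValue());
    }
}
```

In the real Hadoop job, mapPhase becomes a Mapper whose map() emits (Text pairKey, Text friendList) and reducePhase becomes a Reducer that intersects the values grouped under each key.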
Below is a tutorial to help you get started with the Hadoop installation on the department
cluster.
Getting Started with Hadoop
Logging In
First, make sure you can log in to the head node with SSH, currently at zoidberg.cs.ndsu.nodak.edu.
You can log in to this server with your CS Domain password or your Blackboard password.
Example: WordCount
Before we jump into the details, let's walk through an example MapReduce application to get a flavor
for how these programs work.
Source Code
import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
public class WordCount {
public static class TokenizerMapper
extends Mapper<Object, Text, Text, IntWritable>
{
private final static IntWritable one = new IntWritable(1);
private Text word = new Text();
public void map(Object key, Text value, Context context
) throws IOException, InterruptedException {
StringTokenizer itr = new StringTokenizer(value.toString());
while (itr.hasMoreTokens()) {
word.set(itr.nextToken());
context.write(word, one);
}
}
}
public static class IntSumReducer
extends Reducer<Text, IntWritable, Text, IntWritable>
{
private IntWritable result = new IntWritable();
public void reduce(Text key, Iterable<IntWritable> values,
Context context
) throws IOException, InterruptedException {
int sum = 0;
for (IntWritable val : values) {
sum += val.get();
}
result.set(sum);
context.write(key, result);
}
}
public static void main(String[] args) throws Exception {
Configuration conf = new Configuration();
Job job = Job.getInstance(conf, "word count");
job.setJarByClass(WordCount.class);
job.setMapperClass(TokenizerMapper.class);
job.setCombinerClass(IntSumReducer.class);
job.setReducerClass(IntSumReducer.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);
FileInputFormat.addInputPath(job, new Path(args[0]));
FileOutputFormat.setOutputPath(job, new Path(args[1]));
System.exit(job.waitForCompletion(true) ? 0 : 1);
}
}
Usage
Set environment variables:
export JAVA_HOME=/path/to/your/jdk (path where java jdk is installed, e.g.,
export JAVA_HOME="/usr/lib/jvm/java-8-openjdk-amd64")
export HADOOP_CLASSPATH=${JAVA_HOME}/lib/tools.jar
Compile WordCount.java and create a jar:
$ bin/hadoop com.sun.tools.javac.Main WordCount.java
(or hadoop com.sun.tools.javac.Main WordCount.java)
$ jar cf wc.jar WordCount*.class
Setting Up Input Files
This program can use the Hadoop Distributed File System (HDFS) that is set up in the CS
department. This file system spans all the Linux lab machines and provides distributed storage for
use specifically with Hadoop.
You can work with HDFS with UNIX-like file commands. The list of file commands can be
found here.
First, make a directory to store the input for the program (use your username).
yourName@zoidberg:~$ hadoop fs -mkdir /user/yourName/wordcount
yourName@zoidberg:~$ hadoop fs -mkdir /user/yourName/wordcount/input
To set up input for the WordCount program, create two files as follows:
file01:
Hello World Bye World
file02:
Hello Hadoop Goodbye Hadoop
Save these to your home folder on the head node. To move them into HDFS, use the following
commands:
yourName@zoidberg:~$ hadoop fs -copyFromLocal /home/yourName/file01
/user/yourName/wordcount/input/file01
yourName@zoidberg:~$ hadoop fs -copyFromLocal /home/yourName/file02
/user/yourName/wordcount/input/file02
Again, use your username where applicable.
The syntax here is "hadoop fs -copyFromLocal <localsrc> <dst>"; in this case we're
going to copy file01 from the local system into HDFS under our HDFS user directory, into the
wordcount/input/ directory.
Running the WordCount Program
You can now run the WordCount program using the following command:
hadoop jar wc.jar WordCount /user/yourName/wordcount/input
/user/yourName/wordcount/output
The command syntax is: "hadoop jar <jar file> <class name> <args>"
In this case, we use the wc.jar JAR file, running the class 'WordCount' with two parameters: an input
directory and an output directory. The output directory must not already exist in HDFS; it will be
created by the program.
View Output
You can check the output directory with:
hadoop fs -ls /user/yourName/wordcount/output/
You should then see something similar to:
yourName@zoidberg:~$ hadoop fs -ls /user/yourName/wordcount/output
Found 2 items
-rw-r--r-- 3 yourName nogroup 41 2011-11-08 11:23
/user/yourName/wordcount/output/part-r-00000
The 'part-r-00000' file contains the results of the word counting. You can look at the file using the
'cat' command.
yourName@zoidberg:~$ hadoop fs -cat /user/yourName/wordcount/output/part-r-00000
Bye 1
Goodbye 1
Hadoop 2
Hello 2
World 2
Another Example: Bigram
The following example may help you write your assignment program.
Bigrams are simply sequences of two consecutive words. For example, the previous sentence contains the
following bigrams: "Bigrams are", "are simply", "simply sequences", "sequences of", etc.
A bigram-counting application can be composed as a two-stage MapReduce job:
o The first stage counts bigrams.
o The second stage MapReduce job takes the output of the first stage (the bigram counts) as its input.
You can read and run the attached Bigram source code to learn from it. (To run the program, you need to create
input file(s).)
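The first stage's counting logic can be sketched in plain Java, independent of Hadoop. This is only an illustration of what a bigram-count mapper and reducer compute together (each consecutive word pair mapped to 1, then summed per pair); it is an assumption about the attached code's behavior, not a copy of it.

```java
import java.util.*;

public class BigramCount {
    // Combines the map step (emit each consecutive word pair with count 1)
    // and the reduce step (sum the counts per pair) of the first stage.
    static Map<String, Integer> countBigrams(String text) {
        String[] words = text.trim().split("\\s+");
        Map<String, Integer> counts = new TreeMap<>();
        for (int i = 0; i + 1 < words.length; i++) {
            String bigram = words[i] + " " + words[i + 1];
            counts.merge(bigram, 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(countBigrams("Hello World Bye World Hello World"));
    }
}
```

In the Hadoop version, the mapper emits (Text bigram, IntWritable 1) for each pair and an IntSumReducer-style reducer sums them, exactly as in the WordCount example above but with two-word keys.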