1. Flume overview
Flume is a distributed system for large-scale log collection, aggregation, and transmission. Flume's main function is to read data from a server's local disk in real time and write it to HDFS.
Agent: sends data from a Source to a destination in the form of events. It comprises a Source, ...
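The Source → Channel → Sink pipeline described above is wired together in an agent configuration file. A minimal sketch, where the agent/component names (a1, r1, c1, k1) and all paths are illustrative placeholders:

```properties
# name the components of agent a1
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# source: tail a local log file in real time
a1.sources.r1.type = TAILDIR
a1.sources.r1.filegroups = f1
a1.sources.r1.filegroups.f1 = /var/log/app/app.log

# channel: buffer events in memory between source and sink
a1.channels.c1.type = memory

# sink: write the events to HDFS, bucketed by date
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://namenode:8020/flume/%Y-%m-%d
a1.sinks.k1.hdfs.useLocalTimeStamp = true

# wire source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
```

An agent configured this way is typically launched with `flume-ng agent --name a1 --conf-file <path-to-this-file>`.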
Posted on Thu, 04 Jun 2020 10:59:09 -0700 by jamz310
Spark on MaxCompute can access instances (e.g. ECS, HBase, RDS) inside a VPC on Alibaba Cloud. By default the underlying MaxCompute network is isolated from external networks, so Spark on MaxCompute provides a solution: configuring spark.hadoop.odps.cupid.vpc.domain.list allows access to HBase in an Alibaba Cloud VPC network e ...
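As a sketch, the value of that property is a JSON description of the VPC endpoints the job may reach. The exact schema should be checked against the Spark on MaxCompute documentation, and every region ID, VPC ID, domain, and port below is a placeholder:

```properties
spark.hadoop.odps.cupid.vpc.domain.list={"regionId":"cn-hangzhou","vpcs":[{"vpcId":"vpc-xxxxxxxx","zones":[{"urls":[{"domain":"hb-xxxxxxxx.hbase.rds.aliyuncs.com","port":2181}]}]}]}
```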
Posted on Mon, 01 Jun 2020 23:53:10 -0700 by Assorro
1. Start services
a) start-all.sh starts the Hadoop services (verify with jps)
b) zkServer.sh start starts ZooKeeper (these two must be running before the HBase database service)
c) start-hbase.sh starts HBase
e) systemctl stop firewalld.service
f) hbase she ...
Posted on Sun, 01 Mar 2020 04:57:07 -0800 by php_coder_dvo
from pyspark import SparkContext
from pyspark import SparkConf
The former function passed to aggregateByKey is applied within each partition, and the latter fun ...
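To make the distinction concrete, here is a pure-Python model of aggregateByKey's two functions (an illustrative sketch, not Spark itself — the partition layout is fixed by hand):

```python
def aggregate_by_key(partitions, zero_value, seq_func, comb_func):
    """Pure-Python model of Spark's aggregateByKey.

    seq_func folds each value into a per-key accumulator *within* a
    partition (starting from zero_value); comb_func then merges the
    per-partition accumulators *across* partitions.
    """
    # phase 1: seq_func runs independently inside every partition
    per_partition = []
    for partition in partitions:
        acc = {}
        for key, value in partition:
            acc[key] = seq_func(acc.get(key, zero_value), value)
        per_partition.append(acc)

    # phase 2: comb_func merges the partition-local results
    merged = {}
    for acc in per_partition:
        for key, value in acc.items():
            merged[key] = comb_func(merged[key], value) if key in merged else value
    return merged

# per-partition max of each key, then the maxima summed across partitions
partitions = [[("a", 1), ("a", 5), ("b", 2)],
              [("a", 3), ("b", 7)]]
print(aggregate_by_key(partitions, 0, max, lambda x, y: x + y))
# → {'a': 8, 'b': 9}
```

Note how changing only seq_func (e.g. to addition) changes the per-partition step while comb_func still governs the cross-partition merge.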
Posted on Sat, 22 Feb 2020 01:35:39 -0800 by YOUAREtehSCENE
scala> var textFile = sc.textFile("file:///root/1.txt")
textFile: org.apache.spark.rdd.RDD[String] = file:///root/1.txt MapPartitionsRDD at textFile at <console>:24
Posted on Wed, 29 Jan 2020 05:34:55 -0800 by dizel247
1. HBase Connection
1. What kind of connection is created by
org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(Configuration, ExecutorService, User)
//Initialized by ConnectionImplementation by default
String className = conf.get(ClusterConnection.HBASE_CLIENT_CONNECTION_I ...
Posted on Mon, 20 Jan 2020 07:51:08 -0800 by manianprasanna
1. Introduce the Maven dependency
<!-- https://mvnrepository.com/artifact/org.apache.phoenix/phoenix-core -->
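The mvnrepository link above points at the phoenix-core artifact; its coordinates look like this (the version property is a placeholder to be matched to your HBase/Phoenix release):

```xml
<dependency>
    <groupId>org.apache.phoenix</groupId>
    <artifactId>phoenix-core</artifactId>
    <version>${phoenix.version}</version>
</dependency>
```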
2. Establish J ...
Posted on Fri, 13 Dec 2019 11:21:25 -0800 by god_zun
What needs can the paging filter address?
For example: split the table into pages of 5 rows each; now you need to query all the information on page 3.
1. Goal: determine the content to query from the page number and the number of rows per page
2. The starting output position can be controlled by the setStartRow met ...
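The arithmetic behind steps 1–2 can be modeled in plain Python (an illustrative sketch, not the HBase API; in a real scan you would combine a PageFilter with setStartRow seeded by the last row key of the previous page):

```python
def page_slice(row_keys, page, page_size):
    """Return the row keys on a given 1-based page.

    The start position is (page - 1) * page_size, which is what
    setStartRow achieves in a scan by skipping past earlier pages.
    """
    start = (page - 1) * page_size
    return row_keys[start:start + page_size]

# 17 sorted row keys: row001 .. row017
rows = [f"row{i:03d}" for i in range(1, 18)]
print(page_slice(rows, 3, 5))
# → ['row011', 'row012', 'row013', 'row014', 'row015']
```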
Posted on Tue, 10 Dec 2019 13:31:58 -0800 by Braveheart
1. shell operation
[root@hadoop01 ~]# hbase shell  # enter the HBase shell client
hbase(main):001:0> help "dml"  # get the prompts for a group of commands
hbase(main):002:0> help "put"  # get the prompt for a single command
hbase(main):003:0> exit  # exit the client
#View all tables in hbase
Posted on Mon, 09 Dec 2019 10:46:36 -0800 by corsc
Origin: after Pinpoint was connected for service monitoring, the data volume soared, with an average daily HBase data increment of about 20 GB. The volume is too large and the data must be cleaned up regularly, otherwise monitoring availability degrades. Because the previous e ...
Posted on Wed, 20 Nov 2019 12:13:25 -0800 by landung