from pyspark.sql import SparkSession

_SPARK_HOST = "spark://spark-master:7077"
_APP_NAME = "test"

# Hive support must be enabled on the session so that "create table" goes to the Hive metastore.
spark = (SparkSession.builder
         .master(_SPARK_HOST)
         .appName(_APP_NAME)
         .enableHiveSupport()
         .getOrCreate())

data = [
    (1, "3", "145"), (1, "4", "146"), (1, "5", "25"), (1, "6", "26"),
    (2, "32", "32"), (2, "8", "134"), (2, "8", "134"), (2, "9", "137"),
]
df = spark.createDataFrame(data, ["id", "test_id", "camera_id"])

# Register the DataFrame as a temporary view, then materialize it as a Hive table.
df.createOrReplaceTempView("test_hive")
spark.sql("create table default.write_test as select * from test_hive")

Source: https://blog.csdn.net/u011412768/article/details/93426353
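To confirm that the write succeeded, the new table can be read back through the same session. A minimal check, assuming the table default.write_test was created as above:

# Read the Hive table back and print its rows (assumes the session created above is still active).
spark.sql("select * from default.write_test").show()
spark.stop()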
go-event is an event dispatch component used in the Docker project. It implements the common broadcast and queue event dispatch models, and its code is concise and clear, which also makes it a good entry point for Go beginners and deepens one's understanding of using channels for synchronization and communication.
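go-event itself is written in Go and built on channels; purely as a conceptual illustration of the broadcast and queue dispatch models it mentions (not go-event's actual API), a minimal sketch in this document's Python might look like this:

# Conceptual sketch of the two dispatch models (illustrative only, not go-event's API).
from collections import deque

class BroadcastDispatcher:
    """Broadcast model: every registered handler receives every event."""
    def __init__(self):
        self.handlers = []
    def subscribe(self, handler):
        self.handlers.append(handler)
    def dispatch(self, event):
        for handler in self.handlers:
            handler(event)

class QueueDispatcher:
    """Queue model: events are buffered and handed to a single consumer in FIFO order."""
    def __init__(self, handler):
        self.handler = handler
        self.queue = deque()
    def dispatch(self, event):
        self.queue.append(event)
    def drain(self):
        while self.queue:
            self.handler(self.queue.popleft())

broadcast = BroadcastDispatcher()
broadcast.subscribe(lambda e: print("listener A got", e))
broadcast.subscribe(lambda e: print("listener B got", e))
broadcast.dispatch("container.start")   # both listeners see the event

q = QueueDispatcher(lambda e: print("worker got", e))
q.dispatch("image.pull")
q.dispatch("image.push")
q.drain()                               # the single worker consumes events in order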
When running composer, the following warning appears: preg_match(): Allocation of JIT memory failed, PCRE JIT will be disabled.