IT文库

This search took 0.017 seconds and found about 11 results: English-language PDF documents in the Cloud Computing & Big Data (云计算&大数据) / Apache Flink category.
  • PDF: Streaming optimizations - CS 591 K1: Data Stream Processing and Analytics Spring 2020

    Operator selectivity: the number of output elements produced per number of input elements. A map operator has a selectivity of 1, i.e. it produces one output element for each input element it processes. Also covers MapReduce combiners, e.g. a URL access frequency example, where map has type (k1, v1) → list(k2, v2). (Vasiliki Kalavri, Boston University, 2020)

    0 credits | 54 pages | 2.83 MB | 1 year ago
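The selectivity notion in this snippet can be sketched in plain Python without Flink; the function and operator names below are illustrative, not from the slides:

```python
# Illustrative sketch of operator selectivity: outputs produced per input.
def selectivity(operator, inputs):
    """Return outputs-per-input for an operator applied to a batch of elements."""
    outputs = [out for item in inputs for out in operator(item)]
    return len(outputs) / len(inputs)

# map: exactly one output per input -> selectivity 1
map_op = lambda x: [x * 2]
# filter: 0 or 1 outputs per input -> selectivity between 0 and 1
filter_op = lambda x: [x] if x % 2 == 0 else []

stream = [1, 2, 3, 4]
print(selectivity(map_op, stream))     # 1.0
print(selectivity(filter_op, stream))  # 0.5
```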
  • PDF: Scalable Stream Processing - Spark Streaming and Flink

    Transformations: map returns a new DStream by passing each element of the source DStream through a given function; flatMap is similar to map, but each input item can be mapped to zero or more output items.

    0 credits | 113 pages | 1.22 MB | 1 year ago
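The map/flatMap contrast in this snippet can be shown with a minimal plain-Python sketch (list-based, no Spark or Flink required; `d_map`/`d_flat_map` are made-up names):

```python
# DStream-style map vs flatMap semantics over a Python list.
def d_map(func, stream):
    # map: each input element yields exactly one output element
    return [func(x) for x in stream]

def d_flat_map(func, stream):
    # flatMap: each input element may yield zero or more output elements
    return [y for x in stream for y in func(x)]

lines = ["live and let live", ""]
words = d_flat_map(str.split, lines)   # empty line contributes nothing
lengths = d_map(len, lines)            # one length per line, always
print(words)    # ['live', 'and', 'let', 'live']
print(lengths)  # [17, 0]
```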
  • PDF: Introduction to Apache Flink and Apache Kafka - CS 591 K1: Data Stream Processing and Analytics Spring 2020

    Streaming word count (Vasiliki Kalavri, Boston University, 2020):

    textStream
      .flatMap {_.split("\\W+")}
      .map {(_, 1)}
      .keyBy(0)
      .sum(1)
      .print()

    e.g. "live and let live" → "live", "and", "let", "live"

    Converting sensor readings from Fahrenheit to Celsius and tracking the per-key maximum:

    val sensorData = env.addSource(new SensorSource)
    val maxTemp = sensorData
      .map(r => Reading(r.id, r.time, (r.temp - 32) * (5.0 / 9.0)))
      .keyBy(_.id)
      .max("temp")

    0 credits | 26 pages | 3.33 MB | 1 year ago
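The word-count pipeline in the snippet (flatMap on non-word characters, map to (word, 1), keyBy, running sum) can be sketched in plain Python; `word_count` is an illustrative name:

```python
import re
from collections import defaultdict

# Plain-Python sketch of the streaming word count pipeline:
# flatMap(split on \W+) -> map((word, 1)) -> keyBy(word) -> sum.
def word_count(stream):
    counts = defaultdict(int)
    for line in stream:
        for word in re.split(r"\W+", line):  # flatMap {_.split("\\W+")}
            if word:
                counts[word] += 1            # keyBy + per-key running sum
    return dict(counts)

print(word_count(["live and let live"]))  # {'live': 2, 'and': 1, 'let': 1}
```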
  • PDF: PyFlink 1.15 Documentation

    Table API map:

    def func(data: Row):
        return Row(data.id, data.data * 2)

    table.map(func).execute().print()

    DataStream flat_map:

    class MyFlatMapFunction(FlatMapFunction):
        def flat_map(self, value):
            for s in str(value.data).split('|'):
                yield Row(value.id, s)

    list(ds.flat_map(MyFlatMapFunction(), output_type=Types.ROW([Types.INT(), …])))

    A FileSink rolling policy (part_size=1024 ** 3, rollover_interval=15 * 60 * 1000, inactivity_interval=5 * 60 * 1000) and:

    ds.map(lambda i: (i[0] + 1, i[1]), Types.TUPLE([Types.INT(), Types.STRING()])).sink_to(sink)

    0 credits | 36 pages | 266.77 KB | 1 year ago
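The flat_map in this snippet splits a record's data field on '|' into multiple (id, part) rows. A dependency-free rendering of that logic, with `Record` standing in for PyFlink's `Row` (an assumption for illustration):

```python
from typing import Iterator, NamedTuple

class Record(NamedTuple):
    # stand-in for pyflink's Row(id, data); pyflink is not required here
    id: int
    data: str

def flat_map(value: Record) -> Iterator[tuple]:
    # one output row per '|'-separated part of the data field
    for s in value.data.split('|'):
        yield (value.id, s)

rows = [Record(1, "a|b"), Record(2, "c")]
out = [t for r in rows for t in flat_map(r)]
print(out)  # [(1, 'a'), (1, 'b'), (2, 'c')]
```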
  • PDF: PyFlink 1.16 Documentation

    The matched snippet shows the same Table API map, DataStream flat_map, and FileSink rolling-policy examples as the PyFlink 1.15 Documentation entry above.

    0 credits | 36 pages | 266.80 KB | 1 year ago
  • PDF: Streaming in Apache Flink

    Map function (generic type parameters restored; the PDF extraction stripped them):

    DataStream<TaxiRide> rides = env.addSource(new TaxiRideSource(...));
    DataStream<EnrichedRide> enrichedNYCRides = rides
        .filter(new RideCleansing.NYCFilter())
        .map(new Enrichment());

    class Enrichment implements MapFunction<TaxiRide, EnrichedRide> {
        @Override
        public EnrichedRide map(TaxiRide taxiRide) throws Exception {
            return new EnrichedRide(taxiRide);
        }
    }

    Also a keyed smoothing example: input.keyBy(0).map(new Smoother()), where Smoother extends RichMapFunction over Tuple2 values.

    0 credits | 45 pages | 3.00 MB | 1 year ago
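A RichMapFunction like the Smoother above typically keeps per-key state. A plain-Python sketch of that pattern, assuming the smoothing is a running average per key (the slide does not show Smoother's body, so this is illustrative):

```python
from collections import defaultdict

class Smoother:
    """Per-key running average, analogous to keyed ValueState in Flink."""

    def __init__(self):
        # per-key (sum, count); Flink would scope this via keyed state
        self.state = defaultdict(lambda: (0.0, 0))

    def map(self, key, value):
        s, n = self.state[key]
        s, n = s + value, n + 1
        self.state[key] = (s, n)
        return key, s / n  # smoothed value for this key so far

smoother = Smoother()
print(smoother.map("a", 10.0))  # ('a', 10.0)
print(smoother.map("a", 20.0))  # ('a', 15.0)
```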
  • PDF: Cardinality and frequency estimation - CS 591 K1: Data Stream Processing and Analytics Spring 2020

    Space requirements for counting cardinalities up to 1 billion (2^30) with an accuracy of 4%: the hash value needs to map elements to M = log2(2^30) = 30 bits; we need 1024 counters, so m = 2^10 and p = log2 m = 10.

    0 credits | 69 pages | 630.01 KB | 1 year ago
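The arithmetic in this snippet checks out directly:

```python
import math

# Hashing to M = log2(2**30) = 30 bits; m = 2**10 = 1024 counters;
# p = log2(m) = 10 bits of the hash select which counter to update.
M = int(math.log2(2 ** 30))
m = 2 ** 10
p = int(math.log2(m))
print(M, m, p)  # 30 1024 10
```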
  • PDF: Windows and triggers - CS 591 K1: Data Stream Processing and Analytics Spring 2020

    val sensorData = env.addSource(new SensorSource)
    val maxTemp = sensorData
      .map(r => Reading(r.id, r.time, (r.temp - 32) * (5.0 / 9.0)))
      .keyBy(_.id)
      .timeWindow(Time.minutes(1))

    val minTempPerWindow: DataStream[(String, Double)] = sensorData
      .map(r => (r.id, r.temperature))
      .keyBy(_._1)
      .timeWindow(Time.seconds(15))
      .reduce((r1, r2) => …)

    val avgTempPerWindow: DataStream[(String, Double)] = sensorData
      .map(r => (r.id, r.temperature))
      .keyBy(_._1)
      .timeWindow(Time.seconds(15))
      .aggregate(new AvgTempFunction)

    0 credits | 35 pages | 444.84 KB | 1 year ago
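The keyBy + timeWindow + aggregate pattern in the last example can be sketched in plain Python as a keyed tumbling window over batch data; the function name and the 15-second window size mirror the snippet, everything else is illustrative:

```python
from collections import defaultdict

# Keyed tumbling time window with an average aggregate, mirroring
# keyBy(id).timeWindow(Time.seconds(15)).aggregate(AvgTempFunction).
def avg_temp_per_window(readings, window_size=15):
    # readings: iterable of (sensor_id, timestamp_seconds, temperature)
    acc = defaultdict(lambda: (0.0, 0))  # (id, window_start) -> (sum, count)
    for sensor_id, ts, temp in readings:
        window_start = (ts // window_size) * window_size
        s, n = acc[(sensor_id, window_start)]
        acc[(sensor_id, window_start)] = (s + temp, n + 1)
    return {k: s / n for k, (s, n) in acc.items()}

data = [("s1", 1, 10.0), ("s1", 14, 20.0), ("s1", 16, 30.0)]
print(avg_temp_per_window(data))  # {('s1', 0): 15.0, ('s1', 15): 30.0}
```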
  • PDF: State management - CS 591 K1: Data Stream Processing and Analytics Spring 2020

    What operations should state support, and what state types can you think of? Count, sum, list, map, … Flink's state primitives include ListState[T] and MapState[K, V], a map of keys and values with get(key: K), put(key: K, value: V), contains(key: K), remove(key: K), and iterators.

    0 credits | 24 pages | 914.13 KB | 1 year ago
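The MapState[K, V] operations listed in the snippet can be captured in a minimal in-memory stand-in (no fault tolerance or key scoping, which real Flink state provides):

```python
class MapState:
    """In-memory stand-in for Flink's MapState[K, V] interface."""

    def __init__(self):
        self._m = {}

    def get(self, key):
        return self._m.get(key)

    def put(self, key, value):
        self._m[key] = value

    def contains(self, key):
        return key in self._m

    def remove(self, key):
        self._m.pop(key, None)

    def keys(self):
        return iter(self._m.keys())  # iterator over contained keys

state = MapState()
state.put("a", 1)
print(state.contains("a"), state.get("a"))  # True 1
state.remove("a")
print(state.contains("a"))  # False
```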
  • PDF: Flow control and load shedding - CS 591 K1: Data Stream Processing and Analytics Spring 2020

    Selectivity: how many records does the operator produce per record in its input? map: 1 in, 1 out; filter: 1 in, 1 or 0 out; flatMap, join: 1 in, 0, 1, or more out. Also the Load Shedding Road Map (LSRM): a pre-computed table that contains materialized load shedding plans, ordered by how much load each sheds.

    0 credits | 43 pages | 2.42 MB | 1 year ago
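An LSRM-style lookup, as described in the snippet, amounts to scanning a pre-computed, ordered list of shedding plans and picking the first one that sheds enough load. A hedged sketch; the plan contents and `pick_plan` are invented for illustration:

```python
# Pre-computed shedding plans, ordered by the fraction of load they shed.
plans = [
    (0.10, "drop 10% of low-priority tuples"),
    (0.25, "drop 25% at the cheapest operator"),
    (0.50, "drop 50% upstream of the join"),
]

def pick_plan(overload_fraction):
    # pick the first (i.e. least aggressive) plan that sheds enough load
    for shed, action in plans:
        if shed >= overload_fraction:
            return action
    return plans[-1][1]  # overload exceeds all plans: shed as much as we can

print(pick_plan(0.2))  # 'drop 25% at the cheapest operator'
```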
IT文库 ©1024 - 2025 | Powered By MOREDOC AI v3.3.0-beta.70