IT文库
Category

All · Cloud Computing & Big Data (21) · Apache Flink (21)

Language

All · English (19) · Chinese (Simplified) (2)

Format

All · PDF (21)
 
This search took 0.014 seconds and found about 21 related results.
  • PDF: Course introduction - CS 591 K1: Data Stream Processing and Analytics, Spring 2020

    …the day of the respective deadline. • Late submissions are only eligible for up to 50% of the original score. … Call monitoring • Service monitoring, e.g. source and destination phone numbers, their first and last cell towers. Examples: • Location-based…
    0 credits | 34 pages | 2.53 MB | 1 year ago
  • PDF: PyFlink 1.15 Documentation

    1.1 Preparation: This page shows you how to install PyFlink using pip, conda, installing from the source, etc. Supported Python versions per PyFlink version: PyFlink 1.16 supports Python 3.… The virtual environment needs to be activated before use. To activate it, run: source venv/bin/activate; that is, execute the activate script under the bin directory of your virtual environment. To use conda instead:
        # install miniconda
        ./miniconda.sh -b -p miniconda
        # Activate the miniconda environment
        source miniconda/bin/activate
        # Create conda virtual environment under a directory, e.g. venv
        conda create …
    (a sanity-check sketch follows this entry)
    0 credits | 36 pages | 266.77 KB | 1 year ago
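    The snippet above stops at environment setup. As a hedged aside (not part of the indexed document), a minimal sketch for checking that a pip-installed PyFlink works once the virtual environment is active; the apache-flink package and the batch TableEnvironment choice are assumptions:

        # Minimal PyFlink sanity check (illustrative sketch, not from the document).
        # Assumes `pip install apache-flink` ran inside the activated virtual environment.
        from pyflink.table import EnvironmentSettings, TableEnvironment

        # Create a batch TableEnvironment and run a trivial bounded query.
        t_env = TableEnvironment.create(EnvironmentSettings.in_batch_mode())
        table = t_env.from_elements([(1, "hello"), (2, "pyflink")], ["id", "word"])
        table.execute().print()  # prints both rows if the installation is healthy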
  • PDF: PyFlink 1.16 Documentation

    1.1 Preparation: This page shows you how to install PyFlink using pip, conda, installing from the source, etc. Supported Python versions per PyFlink version: PyFlink 1.16 supports Python 3.… The virtual environment needs to be activated before use. To activate it, run: source venv/bin/activate; that is, execute the activate script under the bin directory of your virtual environment. To use conda instead:
        # install miniconda
        ./miniconda.sh -b -p miniconda
        # Activate the miniconda environment
        source miniconda/bin/activate
        # Create conda virtual environment under a directory, e.g. venv
        conda create …
    0 credits | 36 pages | 266.80 KB | 1 year ago
  • PDF: Streaming optimizations - CS 591 K1: Data Stream Processing and Analytics, Spring 2020

    …covered in this lecture. Revisiting the basics: source, sink, input port, output port, dataflow graph. … operators according to the number of available cores / threads • Fused operators can share the address space but use separate threads of control • avoid communication cost without losing pipeline parallelism • use a shared buffer for communication • Fused filters / projections at the source can significantly reduce I/O and intermediate results size • Synergies with scheduling and other optimizations… (a fusion sketch follows this entry)
    0 credits | 54 pages | 2.83 MB | 1 year ago
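    To make the fusion point above concrete, a toy Python sketch (assumptions mine, not the lecture's code): evaluating a filter and a projection element-at-a-time in one pass avoids materializing the intermediate result that separate operators would buffer between them.

        # Toy illustration of operator fusion: filter + projection in a single pass.

        def unfused(records):
            filtered = [r for r in records if r["value"] > 10]   # intermediate buffer
            return [(r["key"], r["value"]) for r in filtered]    # second full pass

        def fused(records):
            # Both operators applied per element; nothing is buffered in between.
            for r in records:
                if r["value"] > 10:
                    yield (r["key"], r["value"])

        records = [{"key": "a", "value": 5}, {"key": "b", "value": 42}]
        assert list(fused(records)) == unfused(records)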
  • PDF: Cardinality and frequency estimation - CS 591 K1: Data Stream Processing and Analytics, Spring 2020

    …stream into m = 2^p = 4 sub-streams. Consider the input elements {5, 14, 5, 2, 8, 1, …}. Each sub-stream keeps one counter: S0 (address 00), S1 (01), S2 (10), S3 (11). • x1 = 5, h5(5) = 00101 • x2 = 14, h5(14) = 10110 • x3 = 5… (a sketch follows this entry)
    0 credits | 69 pages | 630.01 KB | 1 year ago
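    A hedged Python sketch of the stochastic-averaging step in the snippet: the first p bits of the 5-bit hash select one of the m = 2^p = 4 substreams, and the remaining bits update that substream's counter, Flajolet-Martin style. Only h5(5) and h5(14) come from the slides; the other hash values and the counter rule are assumptions.

        # Stochastic averaging over m = 2**p = 4 substreams (illustrative sketch).
        p, bits = 2, 5
        counters = [0] * (1 << p)  # one counter per substream S0..S3

        # h5(5) = 00101 and h5(14) = 10110 are from the example; the rest are assumed.
        h5 = {5: 0b00101, 14: 0b10110, 2: 0b01010, 8: 0b11001, 1: 0b00011}

        def rho(w, width):
            """1-based position of the leftmost 1-bit in a width-bit word."""
            for i in range(width - 1, -1, -1):
                if (w >> i) & 1:
                    return width - i
            return width + 1  # all-zero word

        for x in [5, 14, 5, 2, 8, 1]:
            h = h5[x]
            sub = h >> (bits - p)               # first p bits select the substream
            rest = h & ((1 << (bits - p)) - 1)  # remaining bits feed the counter
            counters[sub] = max(counters[sub], rho(rest, bits - p))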
  • PDF: Fault-tolerance demo & reconfiguration - CS 591 K1: Data Stream Processing and Analytics, Spring 2020

    …high-availability mode migrates the responsibility and metadata for a job to another JobManager in case the original JobManager disappears. • Flink relies on Apache ZooKeeper for high-availability coordination…
    0 credits | 41 pages | 4.09 MB | 1 year ago
  • PDF: Filtering and sampling streams - CS 591 K1: Data Stream Processing and Analytics, Spring 2020

    What data structure would you use to: • Filter out all emails that are sent from a suspected spam address? • Filter out all URLs that contain malware? • Filter out all compromised passwords? • Remove… (a Bloom-filter sketch follows this entry)
    0 credits | 74 pages | 1.06 MB | 1 year ago
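    The questions above are the classic motivation for Bloom filters. A minimal sketch of the structure they point to (sizes and hash construction are my assumptions): no false negatives, small false-positive rate.

        import hashlib

        class BloomFilter:
            # Minimal Bloom filter sketch: answers "definitely absent" or "possibly present".
            def __init__(self, num_bits=1 << 16, num_hashes=4):
                self.num_bits, self.num_hashes = num_bits, num_hashes
                self.bits = bytearray(num_bits // 8)

            def _positions(self, item):
                # Derive k bit positions from salted SHA-256 digests of the item.
                for salt in range(self.num_hashes):
                    digest = hashlib.sha256(f"{salt}:{item}".encode()).digest()
                    yield int.from_bytes(digest[:8], "big") % self.num_bits

            def add(self, item):
                for pos in self._positions(item):
                    self.bits[pos // 8] |= 1 << (pos % 8)

            def might_contain(self, item):
                return all(self.bits[pos // 8] & (1 << (pos % 8))
                           for pos in self._positions(item))

        spam = BloomFilter()
        spam.add("spammer@example.com")
        assert spam.might_contain("spammer@example.com")  # no false negatives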
  • PDF: High-availability, recovery semantics, and guarantees - CS 591 K1: Data Stream Processing and Analytics, Spring 2020

    Exactly-once in Google Cloud Dataflow: checkpointing to address non-determinism. • Each output is checkpointed together with its unique ID to stable storage before… (a deduplication sketch follows this entry)
    0 credits | 49 pages | 2.08 MB | 1 year ago
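    The truncated bullet describes outputs carrying unique IDs; a hedged sketch of the receiving side (deduplicating by record ID, with an in-memory set standing in for stable storage):

        # Receiver-side deduplication by unique record ID (illustrative sketch;
        # a real system would persist seen_ids to stable storage, not a set).
        seen_ids = set()

        def deliver_exactly_once(record_id, payload, apply_effect):
            """Apply the payload's effect at most once per record_id."""
            if record_id in seen_ids:
                return  # duplicate delivery after a retry/replay: drop it
            apply_effect(payload)
            seen_ids.add(record_id)

        out = []
        deliver_exactly_once("r-17", {"value": 1}, out.append)
        deliver_exactly_once("r-17", {"value": 1}, out.append)  # replayed duplicate
        assert out == [{"value": 1}]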
  • PDF: Skew mitigation - CS 591 K1: Data Stream Processing and Analytics, Spring 2020

    …cause imbalance across the workers w1, w2, w3. Addressing skew: • To address skew, the system needs to track the frequencies of the partitioning key values. • We can then… (a sketch follows this entry)
    0 credits | 31 pages | 1.47 MB | 1 year ago
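    Continuing where the snippet cuts off, a sketch (all names and thresholds assumed) of tracking key frequencies and routing detected hot keys to the less-loaded of two candidate workers, in the spirit of partial key grouping:

        from collections import Counter

        class SkewAwarePartitioner:
            # Illustrative sketch: cold keys hash to one worker; keys whose observed
            # frequency crosses a threshold may go to either of two candidates.
            def __init__(self, num_workers, hot_threshold=100):
                self.freq = Counter()
                self.load = [0] * num_workers
                self.num_workers, self.hot_threshold = num_workers, hot_threshold

            def route(self, key):
                self.freq[key] += 1
                w1 = hash(key) % self.num_workers
                w2 = hash(f"{key}#salt") % self.num_workers  # second candidate
                if self.freq[key] > self.hot_threshold:
                    worker = min((w1, w2), key=lambda w: self.load[w])
                else:
                    worker = w1
                self.load[worker] += 1
                return worker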
  • PDF: Scalable Stream Processing - Spark Streaming and Flink

    …Input Operations ▶ Every input DStream is associated with a Receiver object. • It receives the data from a source and stores it in Spark’s memory for processing. ▶ Three categories of streaming sources: 1. Basic… Input Operations - Custom Sources ▶ To create a custom source: extend the Receiver class. ▶ Implement onStart() and onStop(). ▶ Call store(data) to store received…
    0 credits | 113 pages | 1.22 MB | 1 year ago