Databricks SQL Server Connection Across Multiple Notebooks

Problem Description

I found some resources explaining how to pass variables across pySpark Databricks notebooks. I'm curious whether we can pass a SQL Server connection, e.g. define host/database/port/user/pw in Notebook A and reuse the connection in Notebook B.

Recommended Answer


Take a look at this part of the Databricks documentation: https://docs.databricks.com/notebooks/notebook-workflows.html#pass-structured-data. This way you can pass one or more strings across notebooks, but you'll have to recreate the connection in Notebook B manually.
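A minimal sketch of that approach, with hypothetical parameter values. The `dbutils.notebook` calls only run inside Databricks, so they appear as comments; the JSON round trip is what actually carries the connection details between notebooks:

```python
import json

# Notebook A: bundle the connection parameters into a JSON string.
# All values below are placeholders; in practice, avoid hard-coding
# credentials and pull the password from Databricks secrets instead
# (dbutils.secrets.get).
conn = {
    "host": "myserver.database.windows.net",  # hypothetical host
    "port": 1433,                             # default SQL Server port
    "database": "mydb",
    "user": "etl_user",
    "pw": "***",
}
payload = json.dumps(conn)
# In Notebook A, return the payload to the calling notebook:
#   dbutils.notebook.exit(payload)

# Notebook B: run Notebook A, parse the returned string, and rebuild
# the connection locally:
#   payload = dbutils.notebook.run("path/to/notebookA", 60)
conn_b = json.loads(payload)
jdbc_url = (
    f"jdbc:sqlserver://{conn_b['host']}:{conn_b['port']}"
    f";database={conn_b['database']}"
)
```

Note that only the strings cross the notebook boundary; the JDBC connection object itself cannot be serialized, which is why Notebook B has to construct it again.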

Another option: create a Notebook A that defines the connection variables, and "run" it before executing code in Notebook B (more details here: https://forums.databricks.com/questions/154/can-i-run-one-notebook-from-another-notebook.html). Basically, you need a cell with this code:

%run path/to/notebookA
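A minimal sketch of the `%run` approach, with hypothetical variable names. After `%run`, everything Notebook A defined is visible in Notebook B's session:

```python
# Notebook A (path/to/notebookA): define the connection once.
# All values are placeholders; in practice pull the password from
# Databricks secrets rather than a plain variable.
sql_host = "myserver.database.windows.net"
sql_port = 1433
sql_db = "mydb"
sql_user = "etl_user"
sql_pw = "***"
jdbc_url = f"jdbc:sqlserver://{sql_host}:{sql_port};database={sql_db}"

# Notebook B: after `%run path/to/notebookA`, the variables above are
# in scope, so a Spark JDBC read can use them directly, e.g.:
#   df = (spark.read.format("jdbc")
#         .option("url", jdbc_url)
#         .option("dbtable", "dbo.some_table")
#         .option("user", sql_user)
#         .option("password", sql_pw)
#         .load())
```

The difference from the first option is scope: `%run` executes Notebook A in the same session, so Notebook B inherits its variables, whereas `dbutils.notebook.run` starts a separate job and can only pass strings back.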
