[ https://issues.apache.org/jira/browse/SPARK-23878?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16428494 ]
Andrew Davidson commented on SPARK-23878:
-----------------------------------------
Hi Hyukjin,
Many thanks! I am not a Python expert and did not know about dynamic namespaces. A little googling showed how to configure the dynamic namespace in Eclipse:
Preferences -> PyDev -> Interpreters -> Python Interpreter
add pyspark to 'Forced Builtins'
I attached a screenshot. Any idea how this could be added to the documentation so others do not have to waste time on this detail?
p.s. You sent me a link to "support virtualenv in PySpark" https://issues.apache.org/jira/browse/SPARK-13587
Once I made the PyDev configuration change I was able to use pyspark in a virtualenv. I use this environment in the Eclipse PyDev IDE without any problems, and I am able to run Jupyter notebooks without a problem.
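For reference, a minimal sketch of the code-level workaround discussed in the Stack Overflow link in the issue description below, assuming pyspark 2.3.0 is installed in the virtualenv (the DataFrame, app name, and column names are purely illustrative):
{code:python}
# Minimal sketch (illustrative names): reference col()/lit() through the
# pyspark.sql.functions module. The module import resolves statically even
# though col/lit themselves are generated at runtime, which is what confuses
# static analyzers such as PyDev.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("col-lit-workaround").getOrCreate()
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "label"])

# F.col and F.lit behave exactly like the top-level col()/lit() imports.
df.select(F.col("id"), F.lit("constant").alias("tag")).show()

spark.stop()
{code}
Some checkers may still flag F.col/F.lit as unknown attributes, so the Forced Builtins setting above remains the cleaner fix inside PyDev.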
> unable to import col() or lit()
> -------------------------------
>
> Key: SPARK-23878
> URL: https://issues.apache.org/jira/browse/SPARK-23878
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Affects Versions: 2.3.0
> Environment: eclipse 4.7.3
> pyDev 6.3.2
> pyspark==2.3.0
> Reporter: Andrew Davidson
> Priority: Major
> Attachments: eclipsePyDevPySparkConfig.png
>
>
> I have some code that I am moving from a Jupyter notebook to separate Python modules. My notebook uses col() and lit() and works fine.
> When I try to work with the module files in my IDE I get the following errors. I am also not able to run my unit tests.
> {color:#FF0000}Description Resource Path Location Type{color}
> {color:#FF0000}Unresolved import: lit load.py /adt_pyDevProj/src/automatedDataTranslation line 22 PyDev Problem{color}
> {color:#FF0000}Description Resource Path Location Type{color}
> {color:#FF0000}Unresolved import: col load.py /adt_pyDevProj/src/automatedDataTranslation line 21 PyDev Problem{color}
> I suspect that when you run pyspark it generates the col() and lit() functions dynamically?
> I found a description of the problem at [https://stackoverflow.com/questions/40163106/cannot-find-col-function-in-pyspark], but I do not understand how to make this work in my IDE. I am not running pyspark, just an editor.
> Is there some sort of workaround or replacement for these missing functions?
>
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)