spark-dev mailing list archives

From Sean Owen <sro...@gmail.com>
Subject Re: pyspark.sql.functions ide friendly
Date Wed, 17 Apr 2019 11:35:53 GMT
I use IntelliJ and have never seen an issue parsing the pyspark
functions... you're just saying the linter has an optional inspection
to flag it? just disable that?
I don't think we want to complicate the Spark code just for this. They
are declared at runtime for a reason.
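
[For context, a minimal sketch of the runtime-declaration pattern under discussion. This is not the actual Spark source; it is an illustrative reduction of the approach where module-level functions are generated from a name-to-docstring mapping and injected via `globals()`, which is exactly what static linters cannot see. The `_jvm_call` placeholder stands in for the real JVM bridge.]

```python
# Illustrative sketch (NOT the real pyspark.sql.functions source): functions
# are generated at import time from a dict, so they exist at runtime but are
# invisible to static analysis.
from typing import Callable

# Hypothetical name -> docstring table, mimicking the pattern in the thread.
_functions = {
    "lower": "Converts a string column to lower case.",
    "upper": "Converts a string column to upper case.",
}

def _jvm_call(name: str, col: str) -> str:
    # Placeholder for the real JVM bridge; here we just format a string.
    return f"{name}({col})"

def _create_function(name: str, doc: str) -> Callable:
    def _(col):
        return _jvm_call(name, col)
    _.__name__ = name
    _.__doc__ = doc
    return _

# The dynamic step linters cannot follow: names appear only at runtime.
for _name, _doc in _functions.items():
    globals()[_name] = _create_function(_name, _doc)
```

After the loop runs, `lower` and `upper` are callable module attributes, but a static checker inspecting the file sees no `def lower` and may flag imports of them.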

On Wed, Apr 17, 2019 at 6:27 AM educhana@gmail.com <educhana@gmail.com> wrote:
>
> Hi,
>
> I'm aware of various workarounds to make this work smoothly in various IDEs, but wouldn't it be better to solve the root cause?
>
> I've seen the code and don't see anything that requires such a level of dynamic code; the translation is 99% trivial.
>
> On 2019/04/16 12:16:41, 880f0464 <880f0464@protonmail.com.INVALID> wrote:
> > Hi.
> >
> > That's a problem with Spark as such, and in general it can be addressed on an IDE-by-IDE
basis - see for example https://stackoverflow.com/q/40163106 for some hints.
> >
> >
> > Sent with ProtonMail Secure Email.
> >
> > ‐‐‐‐‐‐‐ Original Message ‐‐‐‐‐‐‐
> > On Tuesday, April 16, 2019 2:10 PM, educhana <educhana@gmail.com> wrote:
> >
> > > Hi,
> > >
> > > Currently, using pyspark.sql.functions from an IDE like PyCharm causes
> > > linters to complain because the functions are declared at runtime.
> > >
> > > Would a PR fixing this be welcomed? Are there any problems/difficulties I'm
> > > unaware of?
> > >
> > >
> > >
> > > Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
> > >
> > > ----------------------------------------------------------------------
> > >
> > > To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> >
> >
> >
> >
> >
>
>


