pyspark kernel created using sparkmagic is not showing in the kernel list of jupyter extension in vs code #8286
Comments
Thanks for the bug report. This isn't working because the `python` specified by the kernelspec can't be found, so we don't know how to launch it. This code is the culprit: Line 168 in 3f523db
Kernel spec looks like this:

```json
{
  "argv": [
    "python",
    "-m",
    "sparkmagic.kernels.pysparkkernel.pysparkkernel",
    "-f",
    "{connection_file}"
  ],
  "display_name": "PySpark",
  "language": "python"
}
```

The bare `"python"` in `argv` is ambiguous, so we cannot tell how to start this kernel. We might be able to change this to look for an interpreter that has that module installed (then we can be sure that launching it will work). Alternatively, you can edit the generated kernelspec so that it points to the virtual environment that has sparkmagic installed. For me it works if I change it to this:

```json
{
  "argv": [
    "D:\\Source\\Testing_SparkMagic\\.venv\\Scripts\\python.exe",
    "-m",
    "sparkmagic.kernels.pysparkkernel.pysparkkernel",
    "-f",
    "{connection_file}"
  ],
  "display_name": "PySpark",
  "language": "python"
}
```

Here `argv[0]` is the full path to python from the venv that has sparkmagic installed. You might also log a request on sparkmagic to have their kernelspec include the full path to python. |
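The manual fix above can be sketched as a small script: rewrite the generated `kernel.json` so its `argv[0]` is a concrete interpreter path instead of a bare `"python"`. The `pin_kernelspec_python` helper name is my own (not extension or sparkmagic code); run it with the Python interpreter from the venv that has sparkmagic installed, so `sys.executable` points at the right place.

```python
import json
import sys
from pathlib import Path

def pin_kernelspec_python(kernel_json_path):
    """Rewrite a kernelspec whose argv[0] is the ambiguous "python"
    so that it names an absolute interpreter path instead.

    Run this with the interpreter from the venv that has sparkmagic
    installed: sys.executable is the currently running Python, so the
    kernelspec ends up pointing at a launchable interpreter.
    """
    path = Path(kernel_json_path)
    spec = json.loads(path.read_text())
    if spec.get("argv") and spec["argv"][0] == "python":
        spec["argv"][0] = sys.executable
        path.write_text(json.dumps(spec, indent=2))
    return spec["argv"][0]
```

This is only a sketch of the workaround described above, not an official tool; the kernelspec path itself still has to be located first (e.g. via `jupyter kernelspec list`).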
It worked for me. |
One idea: list all the kernels found, but if it's ambiguous which interpreter to use, ask the user to choose one. |
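The disambiguation idea above could be sketched like this: probe whether a candidate interpreter can actually import the module the kernelspec launches (here, sparkmagic), and only offer interpreters that can. `interpreter_has_module` is a hypothetical helper for illustration, not code from the extension.

```python
import subprocess

def interpreter_has_module(python_path, module):
    """Return True if the interpreter at python_path can import `module`.

    Runs `python_path -c "import <module>"` in a subprocess and checks
    the exit code, so it works for any interpreter on disk, not just
    the one running this script.
    """
    result = subprocess.run(
        [str(python_path), "-c", f"import {module}"],
        capture_output=True,
    )
    return result.returncode == 0
```

A kernel picker could filter candidate interpreters with `interpreter_has_module(p, "sparkmagic")` before prompting the user.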
I'm closing this issue as we've resolved the issues around displaying custom kernels (in the latest dev branch, soon to be released). |
@DonJayamanne, I still faced this issue in this version: This was missing until I updated "python" to the full path
Interestingly, this works fine without it:
|
If anyone is wondering where to find the kernelspec: |
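For reference, Jupyter searches a few standard directories for kernelspecs; `jupyter kernelspec list` prints the same information. The sketch below lists the usual per-user, per-environment, and system-wide locations on Linux (macOS uses `~/Library/Jupyter/kernels` and Windows `%APPDATA%\jupyter\kernels` for the user path); `candidate_kernelspec_dirs` is an illustrative helper, not part of Jupyter's API.

```python
import sys
from pathlib import Path

def candidate_kernelspec_dirs():
    """Standard locations Jupyter searches for kernelspecs on Linux:
    per-user, active environment, and system-wide."""
    return [
        Path.home() / ".local" / "share" / "jupyter" / "kernels",  # per-user
        Path(sys.prefix) / "share" / "jupyter" / "kernels",        # active env/venv
        Path("/usr/share/jupyter/kernels"),                        # system-wide
    ]
```

Each kernel lives in its own subdirectory of one of these paths, with the `kernel.json` shown earlier inside it.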
Environment data
Expected behaviour
All kernels visible and working in the Conda Jupyter Notebook should also appear in the VS Code Jupyter extension.
Actual behaviour
The pyspark kernel installed using sparkmagic did not show in the VS Code Jupyter extension kernel list, even though it worked well with the Conda Jupyter Notebook and appeared in the output of `jupyter kernelspec list`.

Steps to reproduce:
Logs
Output for `Jupyter` in the `Output` panel (View → Output, change the drop-down in the upper-right of the `Output` panel to `Jupyter`)