How can I tell when Foundry has updated Spark?

Some very useful Spark features are not yet available in the Spark version that Foundry provides automatically.

Spark upgrades don’t seem to be announced on the Foundry Releases page.

There doesn’t seem to be a way to view information about the latest Foundry libraries (e.g. transforms) and see what their pyspark dependency is.

The only way to find out seems to be trying to upgrade a repo:

  • Use the Code Repositories Upgrade function to generate a Pull Request
  • Check the pyspark version in transforms-python/conda-versions.run.linux-64.lock.

This has the side effect of creating a meaningless Pull Request, so it isn’t a good option.
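Once the upgrade PR exists, the lock-file check itself can be scripted. A minimal sketch, assuming the lock file lists one package pin per line in `name=version=build` form (the sample file below is fabricated for demonstration; in a real repo you would grep `transforms-python/conda-versions.run.linux-64.lock` directly, and the exact line format may differ):

```shell
# Fabricated sample lock file, standing in for
# transforms-python/conda-versions.run.linux-64.lock (assumed format).
cat > /tmp/sample.lock <<'EOF'
python=3.8.13=h12debd9_0
pyspark=3.2.1=pyhd8ed1ab_0
py4j=0.10.9.3=pyhd8ed1ab_0
EOF

# Pull out the pyspark pin to see which Spark version the repo resolved.
grep -i '^pyspark' /tmp/sample.lock
```

Running this against the real lock file would print the currently pinned pyspark version, which you could compare against the version your desired feature requires.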

Question originally asked by user5233494 on Stack Overflow: How can I tell when Foundry has updated Spark?

Instead of creating a dummy PR, you could create a new repo and check conda-versions.run.linux-64.lock. Then delete that repo.

This isn’t a much better answer than what you already do, but at least it lets you avoid the extra PR.

Answer originally provided by Ontologize on Stack Overflow: How can I tell when Foundry has updated Spark?