Conversation

hakkelt (Collaborator) commented May 7, 2025

This is an admittedly large pull request with many new features and substantial changes. My main question is what your plans are for the package: do you intend to keep actively maintaining it, do you want to reserve it for bugfixes only, or would you perhaps be open to adding me as a maintainer? I have started working heavily on this package and the connected packages (ProximalOperators.jl, ProximalAlgorithms.jl, StructuredOptimization.jl), so many substantial PRs are to be expected from my side. I can see three scenarios that might work out for me:

  1. You review these pull requests later and I just need to be patient (even if you only have capacity for review, say, a couple of months from now).
  2. You grant my GitHub account permission to merge PRs, and you review my changes only if you have time for that.
  3. I register my fork under a separate name.

I hope you will have spare time for the cooperation, and I hope for either option 1 or 2. But I also need to proceed with my work (this project is part of my PhD studies), so an answer would be helpful. :)

hakkelt marked this pull request as draft May 7, 2025 12:49
hakkelt (Collaborator, Author) commented May 19, 2025

@nantonel @lostella What are your thoughts?

lostella (Member) commented:

@hakkelt I don't think I have the resources to keep maintaining this package. Maybe we could go with option 2?

nantonel (Member) commented Jun 6, 2025

Hey @hakkelt, sorry for the late reply. I also don't have the bandwidth to review this PR, and option 2 is fine with me. I'll grant you permission!

hakkelt (Collaborator, Author) commented Jun 8, 2025

Thank you both!

1) Interface changes: domainType -> domain_type, codomainType -> codomain_type
2) Separate operators with dependencies (DSP, FFTW, NFFT, Wavelets)
3) Expose levels in WaveletOperators
4) Fix errors and improve BatchOps
5) Improve performance with FastBroadcast's @.. macro and Polyester's @batch (see the sketch below)
6) Fix errors in documentation
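A hedged sketch of how those two macros are typically used for an in-place elementwise update; this is illustrative only, not the PR's actual code.

```julia
# Illustrative only: how FastBroadcast's @.. and Polyester's @batch speed up
# an in-place elementwise update (not the PR's actual code).
using FastBroadcast   # @.. : lower-overhead drop-in for the @. broadcast macro
using Polyester       # @batch : lightweight loop-level multithreading

x = rand(10_000)
y = similar(x)

# Fused, allocation-free broadcast, like `@. y = 2 * x + sin(x)` but faster
@.. y = 2 * x + sin(x)

# The same update as a multithreaded loop
@batch for i in eachindex(x, y)
    y[i] = 2 * x[i] + sin(x[i])
end
```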
hakkelt (Collaborator, Author) commented Nov 5, 2025

This PR contains the following breaking changes:

  1. Interface changes: domainType -> domain_type, codomainType -> codomain_type.
  2. Move operators with external dependencies into new packages inside the repo (DSPOperators, FFTWOperators) and add new operators in separate packages (NfftOperators, WaveletOperators); these packages are yet to be released.
  3. Introduce multithreading in DiagOp and BroadCast.
  4. Add an option to control where scaling occurs in the DFT operator: never (default), forward, backward (which is essentially ifft), or both (scale by the square root of the scaling factor in each direction). This follows the normalization convention of NumPy's FFT; see the first sketch after this list.
  5. Define LinearAlgebra.opnorm for AbstractOperators, with efficient implementations where possible (e.g. DiagOp, Eye, DFT) and power iteration otherwise. An estimate_opnorm function is also added: it has an efficient implementation for Compose (||A*B|| ≈ ||A|| * ||B||) and uses fewer iterations than opnorm by default. I'm not sure this is a good idea in general, but it seems to work well for estimating the Lipschitz constant in the fast forward-backward algorithm; see the second sketch after this list.
  6. Optimize Compose and Scaling by fusing operators when possible (e.g. a scaled DiagOp becomes a DiagOp with scaled weights, and two adjacent DiagOps get fused within Compose).
  7. Create the OperatorCore package in a new repo; it contains the general functions previously defined in properties.jl (e.g. is_linear, is_diagonal). Only functions that can also be defined for plain matrices are moved there. The purpose of this change is for OperatorCore to become a dependency of ProximalAlgorithms, so that each algorithm can declare its input requirements; see the third sketch after this list.
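To clarify item 4, here is a hedged sketch of the four scaling conventions, expressed with plain FFTW calls (the actual option name in the PR's DFT operator may differ); recall that fft is unnormalized and ifft applies the 1/N factor.

```julia
# Hedged sketch of the four scaling conventions from item 4, expressed with
# plain FFTW calls; the option name in the PR's DFT operator may differ.
using FFTW

x = rand(ComplexF64, 8)
N = length(x)

F_never    = fft(x)             # "never": unnormalized forward transform
F_forward  = fft(x) / N         # "forward": the 1/N factor on the forward side
F_backward = fft(x)             # "backward": 1/N deferred to the inverse (ifft)
F_both     = fft(x) / sqrt(N)   # "both": 1/sqrt(N) each way (unitary transform)

# With "both", the unnormalized inverse bfft also gets a 1/sqrt(N) factor:
@assert bfft(F_both) / sqrt(N) ≈ x
```

For item 5, a minimal power-iteration sketch for the generic opnorm fallback, written against plain matrices; the function name, tolerances, and stopping rule here are illustrative, not the PR's implementation.

```julia
# Minimal power-iteration sketch for item 5: estimating opnorm(A), the largest
# singular value, via the top eigenvalue of A'A. Illustrative, not the PR's code.
using LinearAlgebra

function opnorm_powerit(A; maxiter=100, tol=1e-6)
    x = randn(size(A, 2))
    x ./= norm(x)
    σ = 0.0
    for _ in 1:maxiter
        y = A' * (A * x)              # one application of A'A to a unit vector
        λ = norm(y)                   # estimate of the top eigenvalue of A'A
        σ_new = sqrt(λ)               # opnorm(A) is the square root of that
        abs(σ_new - σ) ≤ tol * σ_new && return σ_new
        σ = σ_new
        x = y ./ λ                    # renormalize for the next iteration
    end
    return σ
end

A = randn(20, 30)
(opnorm_powerit(A), opnorm(A))        # the two values agree up to the tolerance
```

For Compose, estimate_opnorm can instead exploit submultiplicativity, ||A*B|| ≤ ||A|| * ||B||, which gives a cheap bound without iterating on the composed operator.

And for item 7, a hypothetical sketch of the trait-based gating that moving these functions into OperatorCore enables; the trait names follow properties.jl, but the matrix methods and the gated solver below are invented for illustration.

```julia
# Hypothetical sketch for item 7: trait names follow properties.jl, but the
# matrix methods and the gated solver below are invented for illustration.
using LinearAlgebra

is_linear(::AbstractMatrix) = true                 # every matrix is a linear map
is_diagonal(A::AbstractMatrix) = A isa Diagonal    # cheap structural check

# An algorithm can declare its input requirements through the traits:
function solve_diag(A, b)
    is_linear(A) && is_diagonal(A) ||
        throw(ArgumentError("solve_diag requires a linear, diagonal operator"))
    return b ./ A.diag                             # O(n) solve for Diagonal
end

D = Diagonal([2.0, 4.0, 8.0])
solve_diag(D, [2.0, 4.0, 8.0])                     # -> [1.0, 1.0, 1.0]
```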
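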
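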

Please let me know if you have objections to any of these changes or need clarification. Otherwise, I plan to merge this PR in a week or so.
