• 2 Posts
  • 19 Comments
Joined 1 year ago
Cake day: August 6th, 2023

  • Hi, I am building a platform with the goal of supporting apps like this, and I would be interested in developing a plugin for your use case as an experiment (no fee).

    I am working alone on this, and it is not my first priority, so I cannot make any guarantees about the timeline or the scope of the plugin. But if you are interested, we can have a chat on Matrix.

    The project is not open source yet, but I am planning to open it up once (a) I figure out how to properly apply licensing, and (b) I remove any potentially sensitive information (credentials) from the repository.






  • I am definitely guilty of that, but I find this approach really productive. We use small bug fixes as an opportunity to improve code quality. Bigger PRs often introduce new features and take a lot of time; you know the other person is tired and needs to move on, so we focus on the bigger picture, requesting changes only if there is a bug or an important structural issue.


  • While I agree with most of what you say, I have a personal anecdote that highlights the importance of performance as a feature.

    I have a friend who studies economics and uses Python for his day-to-day work. Since computer science is not his domain, he finds it difficult to optimize his code, and learning a new language (C in this case) is not really an option.

    Some of his experiments take days to run, and this is becoming a major bottleneck in his workflow. Being able to write faster code without relying on C is going to have a significant impact on his research.

    Of course, there are other ways to achieve similar results; for example, another friend is working on DIAS, a framework that optimizes pandas at runtime. But the point still stands: there are a tonne of researchers relying on Python to get quick and dirty results, and performance plays a significant role in that when the volume of data is huge.









  • twelve factor app

    Great resource!

    Write database migrations in both directions so people can downgrade on failures.

    Good point. Personally, I take backups before upgrades and restore if anything goes wrong. But I understand how downgrading is sometimes just easier.

    I have trouble coming up with a migration procedure that makes sense to me. I have the following in mind (a rough sketch of how it could be applied follows the list):

    1. Provide init scripts that produce a schema matching the beginning state of the current major version.
    2. Provide major-to-major migration scripts.
    3. For every major version, provide minor-to-minor migration scripts.
    4. Schema changes require at least a minor release.
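
    A minimal sketch of how that scheme might be applied, assuming the migration scripts are plain SQL files named after the versions they connect (the `migrations/<from>__<to>.sql` layout and the SQLite target below are assumptions for illustration, not decisions I have made):

    import sqlite3
    from pathlib import Path

    MIGRATIONS_DIR = Path("migrations")  # assumed location, e.g. migrations/2.1__2.2.sql

    def parse_version(text: str) -> tuple[int, int]:
        """Turn '2.1' into (2, 1) so versions compare naturally as tuples."""
        major, minor = text.split(".")
        return int(major), int(minor)

    def find_next_step(current: tuple[int, int], target: tuple[int, int]) -> Path | None:
        """Pick the script starting at the current version that jumps furthest toward the target."""
        candidates = []
        for script in MIGRATIONS_DIR.glob("*__*.sql"):
            src, dst = (parse_version(v) for v in script.stem.split("__"))
            if src == current and src < dst <= target:
                candidates.append((dst, script))
        return max(candidates)[1] if candidates else None

    def migrate(db: sqlite3.Connection, current: tuple[int, int], target: tuple[int, int]) -> None:
        """Apply upgrade scripts one step at a time until the target version is reached."""
        while current < target:
            script = find_next_step(current, target)
            if script is None:
                raise RuntimeError(f"no migration path from {current} to {target}")
            db.executescript(script.read_text())
            current = parse_version(script.stem.split("__")[1])

    Downgrade scripts would use the same naming with the versions reversed; in a real setup the current schema version would also be recorded in the database rather than passed in by hand.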

    Make it possible to configure your system via ENV variables, ENV files and config files.

    I am a bit worried about this one; environment variables can be a security concern. Specifically, I am not sure if I should allow providing secrets (like DB connection strings) through environment variables. I am inclined to let people do what they want, but issue a warning.
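
    For what it's worth, a minimal sketch of the layered lookup I have in mind (environment variables over an ENV file over a config file, with a warning when a secret comes from the environment); the setting names and file locations are made up for illustration:

    import json
    import os
    import warnings
    from pathlib import Path

    SECRET_SETTINGS = {"db_connection_string", "smtp_password"}  # hypothetical secret settings

    def load_env_file(path: Path) -> dict[str, str]:
        """Parse a simple KEY=VALUE env file, ignoring blank lines and comments."""
        values: dict[str, str] = {}
        if path.exists():
            for line in path.read_text().splitlines():
                line = line.strip()
                if line and not line.startswith("#") and "=" in line:
                    key, _, value = line.partition("=")
                    values[key.strip().lower()] = value.strip()
        return values

    def get_setting(name: str) -> str | None:
        """Resolve a setting: environment variable > ENV file > config file."""
        if (value := os.environ.get(name.upper())) is not None:
            if name in SECRET_SETTINGS:
                warnings.warn(f"secret '{name}' was provided via an environment variable")
            return value
        env_file = load_env_file(Path(".env"))       # assumed ENV file location
        if name in env_file:
            return env_file[name]
        config_path = Path("config.json")            # assumed config file location
        if config_path.exists():
            return json.loads(config_path.read_text()).get(name)
        return None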

    Make it possible to disable authentication to add Authelia or LDAP through the webserver. Make clear that this is only to be used for external authentication.

    I am considering adding support for OAuth through Keycloak. My assumption is that if you are going to host your own LDAP, you can probably configure Keycloak too. Do you think that makes sense?

    Make it possible to run multiple parallel instances of your software without affecting the database consistency, e.g. for high availability or horizontal scaling.

    Ideally, an instance shouldn’t be big enough to need it. I know, famous last words, but in my case I think it’s a bad problem to have. I am going out of scope, but I am wondering where the line is between discouraging large-scale deployments and designing something predestined for obscurity.

    Telemetry

    Not even on my radar, thanks for bringing it to my attention 🙏




  • WOW! https://github.com/modularml/mojo

    Been looking for something like this, thanks a lot!!!

    Edit: Had a quick look at the docs. Mojo’s initial build was published in Sep 2022; it’s fairly young, but it seems to be getting a lot of attention (on GitHub they have the same number of stars as mypy 🤯).

    For anyone interested, their roadmap is an interesting read. They seem to be taking a step-by-step approach, trying to nail down core features first before moving on to stuff like Python interop and syntactic sugar.

    Mojo still doesn’t support classes, the primary thing Python programmers use pervasively! This isn’t because we hate dynamism - quite the opposite. It is because we need to get the core language semantics nailed down before adding them. We expect to provide full support for all the dynamic features in Python classes, and want the right framework to hang that off of.

    The “why Mojo” section gives a lot of background too. They are implementing an MLIR-based compiler, which is really promising for optimization (think of all the goodies we could use from LLVM).



  • this is a method, and always was a method, I just wanted it to look like an attribute for aesthetic reasons

    I think “aesthetic reasons” is an oversimplification. There are certain assumptions a developer makes when reading some code that uses properties. While these assumptions are not clearly defined and may differ per developer, I think there is a common core.

    (1) There are no side effects. Neither the object nor any other object is mutated, and no IO takes place.

    (2) The time and space complexity is O(1).

    (3) The result is consistent. Consecutive calls to the property should return the same value unless there is a mutation between them.
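
    To illustrate the contract I have in mind, a small sketch (the `User` class and `db.query` call are made up; `full_name` fits all three assumptions, while the database lookup clearly does not and should stay a method):

    class User:
        def __init__(self, first_name: str, last_name: str) -> None:
            self.first_name = first_name
            self.last_name = last_name

        @property
        def full_name(self) -> str:
            # No side effects, O(1), and consistent between mutations:
            # a property feels natural here.
            return f"{self.first_name} {self.last_name}"

        def fetch_orders(self, db) -> list:
            # Performs IO and can return different results on each call,
            # so it should stay an explicit method call, not a property.
            # `db.query` is a placeholder for whatever database client is in use.
            return db.query("SELECT * FROM orders WHERE user_name = ?", (self.full_name,))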