Having a lot of joins can be expensive and hurt performance.
Only if you don’t know how to do indexing properly. Normalized data is more performant (less duplication of data, so less memory and bandwidth used) if you know how to index.
It may have been true decades ago that denormalized tables were more performant; I don’t know. But today it’s far more common that “denormalized tables are more performant” is said by someone who sucks at indexing and/or is just being lazy.
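To make the indexing point concrete, here’s a minimal sketch using Python’s built-in sqlite3 module; the schema and names are invented for illustration. The same join query goes from scanning the whole orders table to a direct index lookup once the join column is indexed:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized schema: orders reference customers by id instead of
# duplicating customer data onto every order row.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")

# Without this index, the join below has to scan every row in orders.
# With it, the planner can jump straight to one customer's orders.
cur.execute("CREATE INDEX idx_orders_customer_id ON orders (customer_id)")

# EXPLAIN QUERY PLAN shows whether the index is actually being used.
for row in cur.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT c.name, o.total FROM customers c "
    "JOIN orders o ON o.customer_id = c.id WHERE c.id = ?",
    (42,),
):
    print(row)  # the plan should show orders searched via idx_orders_customer_id
```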
But I do put JSON into tables sometimes, when the data is going to be very inconsistent between items and there’s no need to index any of the values in there. For example, different vendors provide different kinds of information about their products; I need to store it somewhere, so I just serialize it and put it in a column, to be read by a program with abstraction layers that deal with it. A query on that column is never going to perform well, but if all that’s needed is to display details for one item at a time, it’s fine.
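A minimal sketch of that pattern, again using sqlite3, with invented vendor fields: stable, queryable attributes get real columns, and the vendor-specific grab bag goes into a serialized TEXT column that’s never indexed or queried against.

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Real columns for stable fields; vendor_details holds serialized JSON.
cur.execute(
    "CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT, vendor_details TEXT)"
)

# Two vendors providing completely different kinds of information.
cur.execute(
    "INSERT INTO products (name, vendor_details) VALUES (?, ?)",
    ("widget", json.dumps({"color": "red", "voltage": "120V"})),
)
cur.execute(
    "INSERT INTO products (name, vendor_details) VALUES (?, ?)",
    ("gadget", json.dumps({"material": "steel", "warranty_years": 2})),
)

# Displaying one item at a time: fetch the row, deserialize, and let
# an abstraction layer decide what to do with whatever keys are there.
row = cur.execute(
    "SELECT name, vendor_details FROM products WHERE id = ?", (1,)
).fetchone()
name, details = row[0], json.loads(row[1])
print(name, details)
```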
But if someone got that hashed version, they could hack the client so the client-side hashing code just sends that hashed value to the server. You’d want the server to send a rotating token of some sort to use for encrypting the password on the client, and then validate on the server side that it was encrypted with the same token the server sent.
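A minimal sketch of that idea, using an HMAC over a server-issued challenge rather than encryption per se (all names and the fast SHA-256 hashing are simplifications for the example; a real system would use a slow KDF like bcrypt or argon2):

```python
import hashlib
import hmac
import secrets

# Server side: the stored credential is a hash, not the plaintext.
stored_hash = hashlib.sha256(b"hunter2").hexdigest()

# Server side: issue a fresh, single-use challenge per login attempt.
challenge = secrets.token_hex(16)

# Client side: hash the entered password, then mix in the challenge so
# the value on the wire is only valid for this one attempt. A captured
# response can't be replayed once the server rotates the challenge.
client_hash = hashlib.sha256(b"hunter2").hexdigest()
response = hmac.new(challenge.encode(), client_hash.encode(), hashlib.sha256).hexdigest()

# Server side: recompute with the stored hash and the challenge it
# issued, then compare in constant time.
expected = hmac.new(challenge.encode(), stored_hash.encode(), hashlib.sha256).hexdigest()
print(hmac.compare_digest(response, expected))  # True
```

One caveat worth noting: in this scheme the stored hash itself is enough to answer the challenge, so a database leak is still as bad as leaking passwords. That trade-off is part of why the usual answer is to just rely on TLS.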
Seems complicated to me… HTTPS probably has good enough encryption, so eh, whatever.