The CEO is a right-wing Trump worshiper.
Dig into the company’s tweet history and you’ll find archived tweets that were deleted for PR/whitewashing reasons.
Long history of this stuff.
Never not UTC Everywhere.
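To make that concrete, a minimal C# sketch of the habit (illustrative only; the variable names are made up):

```csharp
using System;

// Store and compare timestamps in UTC; convert to local time only at the display edge.
var createdAtUtc = DateTime.UtcNow;          // what you persist and compare against
var createdAt = DateTimeOffset.UtcNow;       // or better, keep the offset explicit

// Convert for display only, never for storage or business logic.
var shownToUser = createdAt.ToLocalTime();

Console.WriteLine($"stored: {createdAtUtc:O}, shown: {shownToUser}");
```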
The ecosystem really is the draw. C# as a language isn’t the best; TypeScript is objectively a more developer-friendly and globally type-safe (at design time) language. It’s far more versatile than C# in that regard, to the point where there is almost no comparison.
But holy hell, the .NET ecosystem is light-years ahead: it’s incredibly consistent across major versions, extremely high quality, has well-considered design advancements, and is absolutely bloody fast. Tie that in with first-party frameworks that cover most major needs, and it all works together so smoothly, at least for web dev.
The “designers as seen by designers” one is so right.
Nothing they come up with can be wrong, it’s all innovative!!
To follow on: lots and LOTS of unrelated changes can be a symptom of an immature codebase/product, or simply a new endeavor.
If it’s a greenfield project, then in order to move fast you don’t want to gold-plate or over-predict the future. This often means you run into misc design blockers constantly, which necessitate refactors & improvements along the way. Depending on the team, this can be broken out into the refactor, then the feature, and reviewed back-to-back. This does have its downsides though, as the scope of the design may become obfuscated and may lead to ineffective code review.
Ofc mature codebases don’t often suffer from the same issues; most of the foundational problems are solved and patterns have been well established.
/ramble
There is no context here though?
If this is a breaking change on a major upgrade path, like swapping out the base UI library, then it might not be possible to break it down into pieces without tripling or quadrupling the work (which likely took a few folks all month to achieve already).
I remember in a previous job migrating from Vue 1 to Vue 2 while also upgrading to an entirely new UI library. It required partial code freezes, and we figured it had to be done in one big push. It was only 3 of us doing it while the rest of the team kept up on maintenance & feature work.
The PR was something like 38k loc of actual UI code, excluding package/lock files. It took the team an entire dedicated week and a half to review, piece by piece. We chewed through hundreds of comments during that time. It worked out really well, everyone was happy, and the timelines were even met early.
The same thing happened when migrating an ASP.NET codebase from .NET Framework 4.x to .NET Core 3.1. We figured that bundling in major refactors during the process to get the biggest bang for our buck was the best move. It was something like 18k loc, which also worked out similarly well in the end.
Things like this happen, not that infrequently depending on the org, and they work out just fine as long as you have a competent and well-organized team who can maintain a course for more than a few weeks.
Just a few hundred?
That seems awfully short, no? We’re talking a couple hours of good flow state; that may not even be a full feature at that point 🤔
We have folks who can push out 600-1k loc covering multiple features/PRs in a day if they’re having a great day and are working somewhere they are proficient.
Never mind important refactors that touch a thousand or a few thousand lines, which might be pushed out on a daily basis and need relatively fast turnarounds.
Essentially, half the job of writing code is reviewing code; it really should be thought of that way.
(No, loc is not a unit of performance measurement, but it can correlate)
System.Text.Json routinely fails to be ergonomic; it’s quite inconvenient overall, actually.
JSON is greedy, but System.Text.Json isn’t, and it falls over constantly for common use cases. I’ve been trying it out on new projects with every new release since .NET Core 2, and every time it burns me.
GitHub threads requesting sane defaults, greedier behavior, and better DevX/ergonomics are largely met with disdain by the maintainers, indicating that the state of System.Text.Json is unlikely to change…
I really REALLY want to use the native tooling; that’s what makes .NET so productive to work in. But JSON handling & manipulation is still an absolute nightmare.
Would not recommend.
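For context, this is roughly the kind of opt-in configuration it takes just to get forgiving, web-style behavior out of it. A minimal sketch; the `Order` record and the exact option set are illustrative, not a recommendation:

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

// None of this is the default on a plain JsonSerializer call; you opt into all of it.
var options = new JsonSerializerOptions
{
    PropertyNameCaseInsensitive = true,                          // tolerate "Id" vs "id"
    PropertyNamingPolicy = JsonNamingPolicy.CamelCase,           // emit camelCase on write
    NumberHandling = JsonNumberHandling.AllowReadingFromString,  // accept "42" for an int
    ReadCommentHandling = JsonCommentHandling.Skip,              // don't throw on comments
    AllowTrailingCommas = true,
    DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
};

var order = JsonSerializer.Deserialize<Order>("""{ "id": "7", "note": null, }""", options);
Console.WriteLine(order);

record Order(int Id, string? Note);
```

(`JsonSerializerDefaults.Web` covers part of this, but you still end up hand-tuning options per project.)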
Rider is great, it’s 100% worth the money.
Switched over to it this year from VS, and it’s so good in comparison. There are some things that aren’t as nice (the CPU/memory graphs in VS are actually handy), but overall, an upgrade.
VS Code honestly kind of sucks for it; there are just so many small things missing or lacking.
Check out Rider; I was honestly surprised, and switched over to it after 8 years of Visual Studio.
It’s not game-changing because someone else tried to do this and failed?
Kind of a weird point to try and make, no?
I think you can have a well tended garden without giving up creativity.
You’re not sacrificing creativity by practicing structures, considerations, and methodologies that maintain or improve the developer experience with whatever creative endeavor you’re on.
The structure of your garden doesn’t prevent you from playing around with new plants, it just outlines a set of patterns and expectations known to drive better outcomes.
I’m not saying that your extension of the analogy is bad; I’m just disagreeing with some of the premise.
Pretty much.
Take PR size, for instance. PR size may be a side effect of the maturity of the product, the type of work being performed, the complexity (or lack thereof) of the real-world space the problems touch, and the methodologies, habits, and practices of the team.
Just looking at PR size, or really any other single-dimensional KPI, leads you to lose the nuance that was driving the productivity in the first place.
Honestly, in my experience high productivity comes from a high level of unity in how the team thinks and approaches problems, and how diligent they are about their decisions. It isn’t necessarily something that’s strictly learned; it can be about getting the right people together.
That’s not a licence thing, it’s a privacy & security thing. Whose APIs are they using? What are their agreements with them? What leaks, what doesn’t? Where is our code & context being sent to… etc.
There is a lot more that should be announced with it. Otherwise it’s a hard no from security-focused orgs, unless they have this posted, in detail, somewhere, and it’s favorable.
Edit: Looks like they just post who the providers are, and it’s OpenAI. So that’s gonna be a no unless we can bring our own APIs, since we have Azure GPT-3.5 & 4 access that meets opsec standards.
This is the kind of stuff I expect to find in this kind of community! ADRs are a good topic that can help teams act more maturely.
And fewer general career questions and low-level “what technology should I learn” posts 🤔
How difficult would you expect it to be to go back and produce ADRs for significant past decisions that resulted in the current architecture and structure of a small-to-medium-sized project?
Oh definitely. I have some gripes with C# as a language and things I wish were better. But it’s extremely flexible.
Well, let’s start with which ones you have looked at, and why they aren’t to your liking.
If you’re trying to build a company, then stop using Blazor and start using React/Vue/etc. for your frontend, and make an ASP.NET API.
If you need a web app, then use well-known technology to do it. Otherwise you’re just playing in the sandbox, not building something that can be built quickly and easily onboarded to.
So many C# devs are scared of FE tech; learning how to use it effectively will only make your work better, and speedier.
Essentially, use boring technologies and be pragmatic.
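As a rough sketch of the backend half of that setup, assuming the default .NET 6+ web template (top-level statements, implicit usings); the endpoints and the dev origin below are placeholders:

```csharp
// Minimal ASP.NET Core API; serve the React/Vue app separately and call this over HTTP.
var builder = WebApplication.CreateBuilder(args);

// Allow the local dev frontend to call the API (placeholder origin).
builder.Services.AddCors(o => o.AddDefaultPolicy(p =>
    p.WithOrigins("http://localhost:5173").AllowAnyHeader().AllowAnyMethod()));

var app = builder.Build();
app.UseCors();

app.MapGet("/api/health", () => Results.Ok(new { status = "ok" }));
app.MapGet("/api/todos/{id:int}", (int id) => Results.Ok(new { id, title = $"Todo {id}" })); // placeholder endpoint

app.Run();
```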
Because your conservative-funded news outlets have a very overt goal here.