Every developer (~400) has a Copilot license, and about 50% are actively using it (we’ll start pruning licenses next month).
My experience so far is that its biggest benefit is writing tests and refactoring code. The major downside is that it has a habit of introducing very subtle bugs that are easy to miss even with human review.
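A made-up illustration of the kind of subtle bug I mean (not an actual completion, just the flavor of it): the slicing logic below reads fine, while the page count quietly drops the last partial page.

    # Hypothetical sketch, not real Copilot output: plausible-looking code
    # with a bug that's easy to skim past in review.
    def paginate(items, page, page_size):
        """Return the items on a 1-indexed page."""
        start = (page - 1) * page_size
        return items[start:start + page_size]

    def page_count(items, page_size):
        # Looks right at a glance, but floor division drops the final
        # partial page: 10 items with page_size 3 gives 3 instead of 4.
        return len(items) // page_size
        # Correct: (len(items) + page_size - 1) // page_size

It only misbehaves when the item count isn’t a multiple of the page size, which is exactly the case a quick review tends not to exercise.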
Pretty close to a hundred percent. Though I work for Microsoft, so that’s a bit of a biased sample.
Don’t know a single person who uses it, either privately or professionally. We don’t do much boilerplate, and a lot of what we write would take longer to describe in a prompt than to write ourselves.
CC BY-NC-SA 4.0
why do some people post this link at the end of their comments?
To state that their content as posted is covered by that license.
Not at all in my org, as far as I know. We are a team of senior engineers somewhat set in our ways, and I am not sure how good the Copilot plugin for Emacs is.
We are part of a large company and we had a mandate from up top to come up with ways to incorporate AI into our product. We prototyped a few, but could never get anything better than “almost good enough to be useful”. Other teams have presented promising prototypes of in-house AI assistants that we can incorporate into products.
My team pivoted to the inverse: seeing if we can make our product more useful to ML developers.
> seeing if we can make our product more useful to ML developers.
Nice. I’ve heard the folks selling gold digging supplies were the biggest winners in the gold rush.
Literally zero, and yet a large chunk of the people you know have most likely actively used the software the company makes for more than an hour a day (pretty much regardless of country). If not, it’s been replaced by software from a different provider that serves the same goal. The teams we work with are equally not using it.
AI is not the future. It is not a tool worthy of use, and it is a major distraction from growing your skills. Especially if you are junior.
So are we now pretending that autogenerated code hasn’t existed for years?
It really does need to be stated that AI code completion is indeed NOT a learning tool. It’s an accelerated “copy/paste code from Stack Overflow” tool. Useful in its own right if you just want some rough code fast, but it’s not going to teach you anything. There is no easy way out of having to deeply understand code. It’s your job as a programmer.
I feel like this is a bad binary. It’s more shades of grey. As an experienced programmer, code completion, like IntelliSense and bash tab completion, accelerates my ability to do new things. I can pick up a new language and be productive faster than when I had to buy a book and look up syntax, or later Google it. I am learning, because I have something roughed out and compiling that I can then pull apart and understand. When I was a kid learning Perl for CGI, I just copied code from books and learned a lot by trying to put different pieces together. It usually didn’t work, and that led me to learn more about how Perl worked. Code generation gets the big picture done and knocks out tests really well, allowing me to focus on learning the things I really want to learn, like how I should be optimizing my data structures.
I think we can all agree that if you just blindly tab complete Copilot suggestions you won’t learn anything. Your code also won’t run.
No one uses “ai” assistants. Everyone is aware that they are terrible at most things.
Not at all, officially speaking; giant corporation BTW. Biggest reason is that security and data governance are still figuring out which ones they can run on-prem to prevent any data from being copied to a non-EU server (preferably no external server at all). GitHub Copilot is a giant no-no, MS Copilot may be an option. We’ll see.
AI is the future BTW, and it provides massive value. It’s a great tool to increase your abilities and skills, letting you provide a fuckton of value within a short time instead of having to wait for people to grow into their job. Especially if you are junior.