JetBrains AI Launched: What You Need to Know


Want LLM-based AI assistance in JetBrains products? Until now, you had to rely on external solutions such as CodeWhisperer or GitHub Copilot. Now there is a first-party service. Here is what you need to know about it.

How does JetBrains AI work?

Individuals and organizations with commercial licenses can additionally subscribe to the “JetBrains AI Service”.

This gives access to the “JetBrains AI Assistant”, that is, a range of AI assistance features across the publisher’s products. More precisely:

– Its commercial IDEs
– Its ReSharper extension for Visual Studio
– Its multilingual Fleet IDE, currently in public preview

Once a subscription is purchased, it is active on all JetBrains products that offer the AI service.

Generally speaking, the promised features are:

– A chat interface
– Auto-completion
– Documentation and commit message generation
– Code generation from a description
– Code explanation (including regular expressions, SQL and cron)
– Error explanation and suggested fixes
– Suggested refactorings
– Name suggestions for classes, functions and variables
– A library of personalized prompts
– Conversion between languages

JetBrains also plans to integrate AI into its collaboration tools (YouTrack for project management, TeamCity for CI/CD, Datalore for data science).

How much does it cost?

The above features are part of the Pro offering, priced at €12 including tax per user per month (or €10 per month with annual billing).

Organizations pay twice as much per user.

JetBrains plans an Enterprise edition for 2024. It will provide, among other things, an on-premises installation option and the ability to customize the model on your own code.

Which LLMs does JetBrains AI use?

Most functionality is currently based on OpenAI models. JetBrains says it is preparing its own models for auto-completion. There is also talk of tie-ups with Google’s Codey and Vertex AI.

In the on-premises scenario, the available models will depend on the cloud platform: Anthropic for AWS, PaLM 2 for Google Cloud, or Azure OpenAI. Bring-your-own-model is on the roadmap.

Pending that offering, using JetBrains AI involves transferring data to the underlying LLMs: prompts, of course, but also code fragments, file types, frameworks used, etc. JetBrains reserves the right to collect further details in order to improve its services, but only with the user’s consent.

What are the specifics for organizations?

Anyone using their IDE license from their company’s floating license server will need to create an account to use JetBrains AI.
Likewise, JetBrains AI licenses cannot currently be distributed through this channel.

After an administrator activates it, the service may take up to an hour to become available to developers. They can speed up the process by manually refreshing their IDE licenses.

There is no SLA, whether for individuals or organizations. In order to maintain an optimal level of performance, JetBrains in fact allows itself to “adapt access according to usage”…

To block the functionality at the project level, create a file named .noai at the repository root.
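As a minimal sketch, assuming you are already inside the repository, the opt-out file can be created like this (the file just needs to exist; its content is ignored):

```shell
# Create an empty .noai file at the repository root
# to disable the JetBrains AI Assistant for this project.
touch .noai
```

Committing the file to version control would extend the opt-out to every developer who clones the repository.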

Further reading:

CodeWhisperer transformed into a MongoDB expert
Copilot, but not only: How GitHub feeds LLMs
JetBrains is moving from open source to commercial for Rust
Microsoft is discontinuing Visual Studio for Mac
Code Llama: a specialized Python model on the Meta menu

Illustration © sharafmaksumov – Adobe Stock
