Encountered an error while developing an IntelliJ IDEA plugin

        OllamaChatModel ollamaChatModel = AiUtil.gainOllamaChatModelInstance();
        String prompt = textBeforeCrate + ", generate a concise, easy-to-understand and well-commented Java function; return only the code";

        ollamaChatModel.stream(prompt)
                .filter(e -> !e.contains("```java") && !e.contains("```"))
                .publishOn(Schedulers.boundedElastic()) // keep downstream processing non-blocking
                .doOnNext(chunk -> {
                    ApplicationManager.getApplication().invokeLater(() -> {
                        WriteCommandAction.runWriteCommandAction(editor.getProject(), () -> {
                            editor.getInlayModel().addInlineElement(offset, true, new InlineElementRenderer(editor, chunk, false));
                        });
                    });
                })
                .doOnError(error -> {
                    System.err.println("Error while generating code: " + error.getMessage());
                })
                .subscribe();

The above is my code. I want to use an Inlay to print the data returned by the AI as it streams in (via Flux), but it always fails with the error "No ContextAccessor for contextType: class reactor.util.context.Context2". My build.gradle is below:

import org.jetbrains.intellij.platform.gradle.IntelliJPlatformType
import org.jetbrains.intellij.platform.gradle.models.ProductRelease

plugins {
    id("java")
    id("org.jetbrains.intellij.platform") version "2.2.1"
}

group = "org.hai.work"
version = "1.4-SNAPSHOT"

repositories {
    mavenCentral()
    intellijPlatform {
        defaultRepositories()
    }
}

dependencies {
    // Target IntelliJ IDEA version
    intellijPlatform {
        intellijIdeaCommunity("2024.3.2.1")
    }

    // Spring AI for OpenAI (exclude Jackson & reactor-core to avoid conflicts with IntelliJ)
    implementation("org.springframework.ai:spring-ai-openai:1.0.0-M5") {
        exclude group: "com.fasterxml.jackson.core"
        exclude group: "com.fasterxml.jackson.databind"
        exclude group: "com.fasterxml.jackson.annotation"
//        exclude group: "io.projectreactor", module: "reactor-core"
    }

    // Spring AI for Ollama (same exclusions)
    implementation("org.springframework.ai:spring-ai-ollama:1.0.0-M5") {
        exclude group: "com.fasterxml.jackson.core"
        exclude group: "com.fasterxml.jackson.databind"
        exclude group: "com.fasterxml.jackson.annotation"
//        exclude group: "io.projectreactor", module: "reactor-core"
    }

    // Common utility libraries
    implementation("com.google.guava:guava:33.4.0-jre")
    implementation("commons-io:commons-io:2.18.0")
    implementation("org.apache.commons:commons-lang3:3.17.0")
    implementation("commons-collections:commons-collections:3.2.2")

    // Reactor and context propagation (to support Flux streaming)
//    implementation("io.projectreactor:reactor-core:3.5.7")
//    implementation("io.micrometer:context-propagation:1.1.0")

}

intellijPlatform {
    pluginVerification {
        ides {
            recommended()
            select {
                it.types = [IntelliJPlatformType.IntellijIdeaCommunity]  // change to another IntelliJPlatformType if needed
                it.channels = [ProductRelease.Channel.RELEASE]
                it.sinceBuild = "233"
                it.untilBuild = "253.*"
            }
        }
    }
    pluginConfiguration {
        ideaVersion {
            sinceBuild = "233"
            untilBuild = "253.*"
        }
    }
}

// Compiler options
tasks.withType(JavaCompile).configureEach {
    options.encoding = "UTF-8"
    options.compilerArgs += ["-Xlint:unchecked", "-Xlint:deprecation", "-parameters"]
}

Try implementing this with the InlineCompletionProvider interface instead, registered on the inline.completion.provider extension point. There are existing plugins using it for reference: https://plugins.jetbrains.com/intellij-platform-explorer/extensions?extensions=com.intellij.inline.completion.provider
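
For reference, a rough Kotlin sketch of such a provider is below. The inline completion API is Kotlin-first (getSuggestion is a suspend function) and still marked experimental, so the class and package names here are assumptions based on recent 2024.x platform sources and may differ in your target SDK; fetchCompletionChunks is a hypothetical helper you would back with the Ollama stream.

// plugin.xml, inside <extensions defaultExtensionNs="com.intellij">:
//   <inline.completion.provider implementation="org.hai.work.AiInlineCompletionProvider"/>

package org.hai.work

import com.intellij.codeInsight.inline.completion.InlineCompletionEvent
import com.intellij.codeInsight.inline.completion.InlineCompletionProvider
import com.intellij.codeInsight.inline.completion.InlineCompletionProviderID
import com.intellij.codeInsight.inline.completion.InlineCompletionRequest
import com.intellij.codeInsight.inline.completion.elements.InlineCompletionGrayTextElement
import com.intellij.codeInsight.inline.completion.suggestion.InlineCompletionSingleSuggestion
import com.intellij.codeInsight.inline.completion.suggestion.InlineCompletionSuggestion
import com.intellij.openapi.util.TextRange
import kotlinx.coroutines.flow.Flow

class AiInlineCompletionProvider : InlineCompletionProvider {

    override val id = InlineCompletionProviderID("org.hai.work.AiInlineCompletionProvider")

    // Decide which events should trigger a completion; returning true triggers on every event.
    override fun isEnabled(event: InlineCompletionEvent): Boolean = true

    override suspend fun getSuggestion(request: InlineCompletionRequest): InlineCompletionSuggestion {
        // Text before the caret, read from the request instead of the raw Editor.
        val textBeforeCaret = request.document.getText(TextRange(0, request.startOffset))

        return InlineCompletionSingleSuggestion.build {
            // Emit each streamed chunk as gray inline text; the platform renders it,
            // so no manual Inlay / WriteCommandAction / invokeLater is needed here.
            fetchCompletionChunks(textBeforeCaret).collect { chunk ->
                emit(InlineCompletionGrayTextElement(chunk))
            }
        }
    }

    // Hypothetical adapter: wrap the OllamaChatModel Flux<String> as a Kotlin Flow,
    // e.g. via kotlinx-coroutines-reactive's Publisher.asFlow(). Left unimplemented here.
    private fun fetchCompletionChunks(prompt: String): Flow<String> =
        TODO("bridge to ollamaChatModel.stream(prompt)")
}

Note that because the provider is Kotlin, the build above would likely also need the org.jetbrains.kotlin.jvm Gradle plugin, and the Flux-to-Flow bridge would need a reactive-streams adapter such as kotlinx-coroutines-reactive.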