
Java ProcessBuilder Windows command wildcards (Java file wildcards)

If you are interested in Windows command wildcards with Java ProcessBuilder, this article is a good place to start. It covers Windows command wildcards in Java ProcessBuilder in detail and answers related questions about Java file wildcards, and it also provides useful information on: Bash command doesn't work with ProcessBuilder, Using ProcessBuilder in Dolphinscheduler, GWT + ProcessBuilder, and Java ProcessBuilder process.destroy() doesn't kill child processes on WinXP.

In this article:

1. Java ProcessBuilder Windows command wildcards (Java file wildcards)
2. Bash command doesn't work with ProcessBuilder
3. Using ProcessBuilder in Dolphinscheduler
4. GWT + ProcessBuilder
5. Java ProcessBuilder process.destroy() doesn't kill child processes on WinXP

Java ProcessBuilder Windows command wildcards (Java file wildcards)

I want to invoke a Windows command from Java.

The following line works fine:

ProcessBuilder pb = new ProcessBuilder("cmd.exe","/C","find \"searchstr\" C://Workspace//inputFile.txt");

But I want to find the string in all of the text files at that location:

ProcessBuilder pb = new ProcessBuilder("cmd.exe","/C","find \"searchstr\" C://Workspace//*.txt");

However, it doesn't work; there is no output in the Java console.

Solution

It looks like find is returning an error because of the double forward slashes in the path name. If you change them to backslashes (doubled, to escape them in the Java string), it succeeds.

You can check the error output and find's exit code (0 on success, 1 when an error occurs) with code similar to the following:

ProcessBuilder pb = new ProcessBuilder(
        "cmd.exe", "/C", "find \"searchstr\" C://Workspace//inputFile.txt");
Process p = pb.start();
InputStream errorOutput = new BufferedInputStream(p.getErrorStream(), 10000);
InputStream consoleOutput = new BufferedInputStream(p.getInputStream(), 10000);
int exitCode = p.waitFor();

int ch;
System.out.println("Errors:");
while ((ch = errorOutput.read()) != -1) {
    System.out.print((char) ch);
}
System.out.println("Output:");
while ((ch = consoleOutput.read()) != -1) {
    System.out.print((char) ch);
}
System.out.println("Exit code: " + exitCode);
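
For the wildcard case itself, here is a minimal sketch (not part of the original answer) of the corrected command using backslash paths, doubled inside the Java string literal as described above; the search string and directory are just placeholders:

ProcessBuilder pb = new ProcessBuilder(
        "cmd.exe", "/C", "find \"searchstr\" C:\\Workspace\\*.txt");
// merge find's error output into the standard output stream
pb.redirectErrorStream(true);
Process p = pb.start();
int ch;
while ((ch = p.getInputStream().read()) != -1) {
    System.out.print((char) ch);
}
System.out.println("Exit code: " + p.waitFor());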

Bash command doesn't work with ProcessBuilder

How do I fix a Bash command that doesn't work with ProcessBuilder?

The following command runs fine in bash:

Command:

bash -c "$(echo 'H4sIAArQ/mAAA1WMuw7CIBRAd77ihLJqtKuTg19hHIjetiQU0svl/1sn43weaeKJD4PnlI2R1w1bpOBA3kvF340ssX1Z1LmvUqyhsvWk8jl7nOQmP/2x9ZixSlXWqnLcYvlrw4VwJYxHOiW3AwCHgS2AAAAA' | base64 --decode | zcat)" - -a -b

Output:

Equal to or more than 2 arguments - -a -b

I'd like to know: how can I achieve the same thing with Java's ProcessBuilder?

I tried the following:

ProcessBuilder processBuilder = new ProcessBuilder(args);

where args is:

bash
-c
"$(echo ''H4sIAArQ/mAAA1WMuw7CIBRAd77ihLJqtKuTg19hHIjetiQU0svl/1sn43weaeKJD4PnlI2R1w1bpOBA3kvF340ssX1Z1LmvUqyhsvWk8jl7nOQmP/2x9ZixSlXWqnLcYvlrw4VwJYxHOiW3AwCHgS2AAAAA'' | base64 --decode | zcat)"
-
-a
-b

But I keep getting the following error:

-: if: command not found

Process finished with exit code 127

Can someone point out the problem here?

Solution

In bash, the result of a command substitution does not go through all of the parsing steps again. That means compound commands such as if are not honored, command separators such as ; have no syntactic meaning, and so on.

If you want to override that and force an extra parsing pass, you need to use eval. So:

String[] args = new String[]{
        "bash",
        "-c",
        "eval \"$(echo 'H4sIAArQ/mAAA1WMuw7CIBRAd77ihLJqtKuTg19hHIjetiQU0svl/1sn43weaeKJD4PnlI2R1w1bpOBA3kvF340ssX1Z1LmvUqyhsvWk8jl7nOQmP/2x9ZixSlXWqnLcYvlrw4VwJYxHOiW3AwCHgS2AAAAA' | base64 --decode | zcat)\"",
        "-",
        "-a",
        "-b",
};

Why does it work when you run it in a shell rather than through ProcessBuilder? Because the shell you run it in performs the command substitution in "$(...)" itself and puts the result of that substitution into the text it passes to the child shell, so the substitution has already been done at parse time.
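
Putting it together, a minimal sketch (not from the original answer, and assuming the enclosing method declares throws IOException, InterruptedException) of running the eval-wrapped args above through ProcessBuilder and printing the script's output:

ProcessBuilder processBuilder = new ProcessBuilder(args);  // args as defined above
processBuilder.redirectErrorStream(true);                  // merge stderr into stdout
Process process = processBuilder.start();
try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
    String line;
    while ((line = reader.readLine()) != null) {
        // With the eval fix this should print: Equal to or more than 2 arguments - -a -b
        System.out.println(line);
    }
}
System.out.println("Exit code: " + process.waitFor());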

Using ProcessBuilder in Dolphinscheduler

1. How ProcessBuilder is used in Dolphinscheduler

1.1 Building the command

org.apache.dolphinscheduler.plugin.task.api.shell.ShellInterceptorBuilderFactory

public class ShellInterceptorBuilderFactory {

    private final static String INTERCEPTOR_TYPE = PropertyUtils.getString("shell.interceptor.type", "bash");

    @SuppressWarnings("unchecked")
    public static IShellInterceptorBuilder newBuilder() {
        // TODO The default configuration takes this branch
        if (INTERCEPTOR_TYPE.equalsIgnoreCase("bash")) {
            return new BashShellInterceptorBuilder();
        }
        if (INTERCEPTOR_TYPE.equalsIgnoreCase("sh")) {
            return new ShShellInterceptorBuilder();
        }
        if (INTERCEPTOR_TYPE.equalsIgnoreCase("cmd")) {
            return new CmdShellInterceptorBuilder();
        }
        throw new IllegalArgumentException("not support shell type: " + INTERCEPTOR_TYPE);
    }
}

By default, BashShellInterceptorBuilder is used:
org.apache.dolphinscheduler.plugin.task.api.shell.bash.BashShellInterceptorBuilder

public class BashShellInterceptorBuilder
        extends
            BaseLinuxShellInterceptorBuilder<BashShellInterceptorBuilder, BashShellInterceptor> {

    @Override
    public BashShellInterceptorBuilder newBuilder() {
        return new BashShellInterceptorBuilder();
    }

    @Override
    public BashShellInterceptor build() throws FileOperateException, IOException {
        // TODO This is the core step where the shell script is generated
        generateShellScript();
        List<String> bootstrapCommand = generateBootstrapCommand();
        // TODO Instantiate the BashShellInterceptor
        return new BashShellInterceptor(bootstrapCommand, shellDirectory);
    }

    // This is the interpreter prefix used when the command is not run via sudo
    @Override
    protected String shellInterpreter() {
        return "bash";
    }

    @Override
    protected String shellExtension() {
        return ".sh";
    }

    @Override
    protected String shellHeader() {
        return "#!/bin/bash";
    }
}

org.apache.dolphinscheduler.plugin.task.api.shell.BaseLinuxShellInterceptorBuilder#generateBootstrapCommand

protected List<String> generateBootstrapCommand() {
        if (sudoEnable) {
            // TODO By default this branch is taken; effectively: sudo -u <tenant> -i /opt/xx.sh
            return bootstrapCommandInSudoMode();
        }
        // TODO bash /opt/xx.sh
        return bootstrapCommandInNormalMode();
    }

bootstrapCommandInSudoMode() :

private List<String> bootstrapCommandInSudoMode() {
        if (PropertyUtils.getBoolean(AbstractCommandExecutorConstants.TASK_RESOURCE_LIMIT_STATE, false)) {
            return bootstrapCommandInResourceLimitMode();
        }
        List<String> bootstrapCommand = new ArrayList<>();
        bootstrapCommand.add("sudo");
        if (StringUtils.isNotBlank(runUser)) {
            bootstrapCommand.add("-u");
            bootstrapCommand.add(runUser);
        }
        bootstrapCommand.add("-i");
        bootstrapCommand.add(shellAbsolutePath().toString());
        return bootstrapCommand;
    }

bootstrapCommandInNormalMode() :

private List<String> bootstrapCommandInNormalMode() {
        List<String> bootstrapCommand = new ArrayList<>();
        bootstrapCommand.add(shellInterpreter());
        bootstrapCommand.add(shellAbsolutePath().toString());
        return bootstrapCommand;
    }
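
To make the two branches concrete, here is a small, self-contained illustration (the run user and script path are hypothetical, not taken from a real Dolphinscheduler run) of the command lists the two methods produce:

import java.util.Arrays;
import java.util.List;

public class BootstrapCommandShape {

    public static void main(String[] args) {
        // Hypothetical values, for illustration only
        String runUser = "tenant1";
        String script = "/opt/test/my.sh";

        // Shape of bootstrapCommandInSudoMode(): sudo -u <runUser> -i <script>
        List<String> sudoMode = Arrays.asList("sudo", "-u", runUser, "-i", script);

        // Shape of bootstrapCommandInNormalMode(): bash <script>
        List<String> normalMode = Arrays.asList("bash", script);

        System.out.println("sudo mode:   " + String.join(" ", sudoMode));
        System.out.println("normal mode: " + String.join(" ", normalMode));
    }
}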

1.2 Executing the command

org.apache.dolphinscheduler.plugin.task.api.shell.BaseShellInterceptor

public abstract class BaseShellInterceptor implements IShellInterceptor {

    protected final String workingDirectory;
    protected final List<String> executeCommands;

    protected BaseShellInterceptor(List<String> executeCommands, String workingDirectory) {
        this.executeCommands = executeCommands;
        this.workingDirectory = workingDirectory;
    }

    @Override
    public Process execute() throws IOException {
        // init process builder
        ProcessBuilder processBuilder = new ProcessBuilder();
        // setting up a working directory
        // TODO Set the working directory so that the script can load resources (for example jar files) relative to this directory when it runs
        processBuilder.directory(new File(workingDirectory));
        // merge error information to standard output stream
        processBuilder.redirectErrorStream(true);
        processBuilder.command(executeCommands);
        log.info("Executing shell command : {}", String.join(" ", executeCommands));
        return processBuilder.start();
    }
}

2. A best-practice example

2.1 pom.xml configuration

<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter</artifactId>
  <version>2.6.1</version>
</dependency>

2.2 Application code

import java.io.BufferedReader;
import java.io.File;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.TimeUnit;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class Application {

    public static void main(String[] args) throws Exception {
        SpringApplication.run(Application.class, args);

        List<String> executeCommands = new ArrayList<>();
        executeCommands.add("sudo");
        executeCommands.add("-u");
        executeCommands.add("qiaozhanwei");
        executeCommands.add("-i");
        executeCommands.add("/opt/test/my.sh");


        ProcessBuilder processBuilder = new ProcessBuilder();
        // setting up a working directory
        // TODO Set the working directory so that the script can load resources (for example jar files) relative to this directory when it runs
        processBuilder.directory(new File("/opt/test"));
        // merge error information to standard output stream
        processBuilder.redirectErrorStream(true);
        processBuilder.command(executeCommands);
        Process process = processBuilder.start();

        try (BufferedReader inReader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = inReader.readLine()) != null) {
                // TODO Print the script's console output
                System.out.println(line);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }


        // TODO Wait up to 10 minutes; if the process has not finished by then, waitFor returns and status is false
        boolean status = process.waitFor(10, TimeUnit.MINUTES);

        System.out.println("status ->" + status);
    }
}
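
One thing the example above does not handle: if waitFor times out, status is false but the script is still running. A small follow-up sketch using only the standard Process API (Java 8+), which could be appended after the waitFor call:

if (!status) {
    // Ask the process to terminate...
    process.destroy();
    // ...and force-kill it if it still hasn't exited shortly afterwards
    if (!process.waitFor(10, TimeUnit.SECONDS)) {
        process.destroyForcibly();
    }
}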

2.3 Log output


  .   ____          _            __ _ _
 /\\ / ___'_ __ _ _(_)_ __  __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
 \\/  ___)| |_)| | | | | || (_| |  ) ) ) )
  '  |____| .__|_| |_|_| |_\__, | / / / /
 =========|_|==============|___/=/_/_/_/
 :: Spring Boot ::                (v2.6.1)

2024-06-15 18:33:16.090  INFO 31834 --- [           main] com.journey.test.Application             : Starting Application using Java 1.8.0_401 on 192.168.1.4 with PID 31834 (/Users/qiaozhanwei/IdeaProjects/springboot2/target/classes started by qiaozhanwei in /Users/qiaozhanwei/IdeaProjects/springboot2)
2024-06-15 18:33:16.091  INFO 31834 --- [           main] com.journey.test.Application             : No active profile set, falling back to default profiles: default
2024-06-15 18:33:16.244  INFO 31834 --- [           main] com.journey.test.Application             : Started Application in 0.252 seconds (JVM running for 0.42)
Number of Maps  = 1
Samples per Map = 100000
2024-06-15 18:33:16,790 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Wrote input for Map #0
Starting Job
2024-06-15 18:33:17,329 INFO client.DefaultNoHARMFailoverProxyProvider: Connecting to ResourceManager at kvm-10-253-26-85/10.253.26.85:8032
2024-06-15 18:33:17,586 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /tmp/hadoop-yarn/staging/qiaozhanwei/.staging/job_1694766249884_0931
2024-06-15 18:33:17,837 INFO input.FileInputFormat: Total input files to process : 1
2024-06-15 18:33:18,024 INFO mapreduce.JobSubmitter: number of splits:1
2024-06-15 18:33:18,460 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1694766249884_0931
2024-06-15 18:33:18,460 INFO mapreduce.JobSubmitter: Executing with tokens: []
2024-06-15 18:33:18,648 INFO conf.Configuration: resource-types.xml not found
2024-06-15 18:33:18,648 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
2024-06-15 18:33:18,698 INFO impl.YarnClientImpl: Submitted application application_1694766249884_0931
2024-06-15 18:33:18,734 INFO mapreduce.Job: The url to track the job: http://kvm-10-253-26-85:8088/proxy/application_1694766249884_0931/
2024-06-15 18:33:18,734 INFO mapreduce.Job: Running job: job_1694766249884_0931
2024-06-15 18:33:24,978 INFO mapreduce.Job: Job job_1694766249884_0931 running in uber mode : false
2024-06-15 18:33:24,978 INFO mapreduce.Job:  map 0% reduce 0%
2024-06-15 18:33:29,153 INFO mapreduce.Job:  map 100% reduce 0%
2024-06-15 18:33:34,384 INFO mapreduce.Job:  map 100% reduce 100%
2024-06-15 18:33:34,455 INFO mapreduce.Job: Job job_1694766249884_0931 completed successfully
2024-06-15 18:33:34,565 INFO mapreduce.Job: Counters: 54
    File System Counters
        FILE: Number of bytes read=28
        FILE: Number of bytes written=548863
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=278
        HDFS: Number of bytes written=215
        HDFS: Number of read operations=9
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=3
        HDFS: Number of bytes read erasure-coded=0
    Job Counters 
        Launched map tasks=1
        Launched reduce tasks=1
        Data-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=37968
        Total time spent by all reduces in occupied slots (ms)=79360
        Total time spent by all map tasks (ms)=2373
        Total time spent by all reduce tasks (ms)=2480
        Total vcore-milliseconds taken by all map tasks=2373
        Total vcore-milliseconds taken by all reduce tasks=2480
        Total megabyte-milliseconds taken by all map tasks=4859904
        Total megabyte-milliseconds taken by all reduce tasks=10158080
    Map-Reduce Framework
        Map input records=1
        Map output records=2
        Map output bytes=18
        Map output materialized bytes=28
        Input split bytes=160
        Combine input records=0
        Combine output records=0
        Reduce input groups=2
        Reduce shuffle bytes=28
        Reduce input records=2
        Reduce output records=0
        Spilled Records=4
        Shuffled Maps =1
        Failed Shuffles=0
        Merged Map outputs=1
        GC time elapsed (ms)=87
        CPU time spent (ms)=1420
        Physical memory (bytes) snapshot=870387712
        Virtual memory (bytes) snapshot=9336647680
        Total committed heap usage (bytes)=2716860416
        Peak Map Physical memory (bytes)=457416704
        Peak Map Virtual memory (bytes)=3773362176
        Peak Reduce Physical memory (bytes)=412971008
        Peak Reduce Virtual memory (bytes)=5563285504
    Shuffle Errors
        BAD_ID=0
        CONNECTION=0
        IO_ERROR=0
        WRONG_LENGTH=0
        WRONG_MAP=0
        WRONG_REDUCE=0
    File Input Format Counters 
        Bytes Read=118
    File Output Format Counters 
        Bytes Written=97
Job Finished in 17.292 seconds
Estimated value of Pi is 3.14120000000000000000
status ->true

Process finished with exit code 0

If you found this interesting, please like and follow. Thanks!

GWT + ProcessBuilder

Can ProcessBuilder be used from GWT? When I create a new ProcessBuilder instance, I get:

java.lang.ProcessBuilder is not supported by Google App Engine's Java runtime environment

Java ProcessBuilder process.destroy() doesn't kill child processes on WinXP

I have a Java application that uses ProcessBuilder to prepare an OS command and hand me back a Process object. (The actual OS command is rsync over ssh using cygwin.)

This works fine on Windows, but if I want to stop the process with process.destroy(), it does not kill the child ssh and rsync processes... I have to kill those manually with the Windows Task Manager.

Is it possible to somehow get hold of the process's OutputStream and send a Ctrl-C before I call destroy()?

If anyone has any ideas for a workaround, that would be great. Thanks :D

Answer 1


I also think that simulating Ctrl-C in order to fully kill ssh is problematic.

What I would do is one of the following. You can use Windows commands to find out which processes are children of ssh (this is a bit awkward, because you need to know your current pid in order to find your own children). I believe Sysinternals' pstools is a good set of command-line tools that should let you track down orphaned processes. See this example of controlling Windows processes with tasklist.exe (which, by the way, can produce output in CSV format), or run a dedicated VBScript.

The second approach is to use a Java library (for example winp) to launch and control the ssh process. I believe you can list all of the children and forcibly kill them when sending the right message isn't enough. This would be my preferred approach. Note that the killRecursively method does exactly what you need.
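
As a rough sketch of the winp approach (assuming the winp library and its org.jvnet.winp.WinProcess API are on the classpath; the command below is only a placeholder for the actual cygwin ssh/rsync invocation):

import org.jvnet.winp.WinProcess;

public class KillProcessTree {

    public static void main(String[] args) throws Exception {
        // Placeholder command that spawns a child process on Windows
        Process process = new ProcessBuilder("cmd.exe", "/C", "ping -n 60 127.0.0.1").start();

        Thread.sleep(5_000);

        // Kill the process together with all of its descendants
        new WinProcess(process).killRecursively();
    }
}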

Note that these approaches would make your application Windows-only; you can wrap them in a class that behaves differently on Windows and on Linux machines.

Note that I haven't tried fine-grained control of Windows processes myself, so I'm not sure how mature the solutions I found are.

That concludes today's discussion of Windows command wildcards with Java ProcessBuilder and Java file wildcards. Thanks for reading. For more on Bash command doesn't work with ProcessBuilder, Using ProcessBuilder in Dolphinscheduler, GWT + ProcessBuilder, or Java ProcessBuilder process.destroy() doesn't kill child processes on WinXP, please search this site.
