[Scala] Building a Scala-Style SSH Utility Class
Notes
One thing deserves special attention: when using nohup, you must redirect all three standard streams, i.e.: nohup xxx.sh > xxx.log 2>&1 &
You also need to disable the pseudo-terminal with openChannel.setPty(false); otherwise scripts launched via nohup fail to start, while the same command works once nohup is removed. The following excerpt explains the underlying issue:
A different issue that often arises in this situation is that ssh is refusing to log off ("hangs"), since it refuses to lose any data from/to the background job(s).[5][6] This problem can also be overcome by redirecting all three I/O streams:
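The redirection pattern can be checked locally before wiring it into the SSH channel (file names here are arbitrary):

```shell
# Launch a job with nohup, redirecting stdout and stderr to a log file and
# detaching stdin -- the "all three streams" pattern the note above requires.
nohup sh -c 'echo started' > /tmp/nohup_demo.log 2>&1 < /dev/null &
wait $!                       # wait for the background job to finish
cat /tmp/nohup_demo.log       # prints "started"
```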
Adding the jsch dependencies
<dependency>
    <groupId>org.apache.ant</groupId>
    <artifactId>ant-jsch</artifactId>
    <version>1.9.7</version>
</dependency>
<dependency>
    <groupId>com.jcraft</groupId>
    <artifactId>jsch</artifactId>
    <version>0.1.51</version>
</dependency>
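If you build with sbt rather than Maven, the equivalent dependencies (same versions) would be:

```scala
// build.sbt -- jsch provides the SSH channel; ant-jsch is only needed
// if you also use the Ant ssh/scp tasks
libraryDependencies ++= Seq(
  "com.jcraft"     % "jsch"     % "0.1.51",
  "org.apache.ant" % "ant-jsch" % "1.9.7"
)
```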
Implementation
package com.changtu.util.host

import java.io.{BufferedReader, InputStreamReader}

import com.changtu.util.Logging
import com.jcraft.jsch.{ChannelExec, JSch}

/**
  * Created by lubinsu on 8/11/2016.
  */
object SSH extends Logging {

  def apply(remoteMachine: String, userName: String, port: Int = 22, command: String, keyfile: String, password: String, outputFun: (String, String) => Unit = { (msg, _) =>
    logger.info(msg)
  }): Int = {
    val jsch = new JSch()
    //jsch.addIdentity(keyfile)
    val session = jsch.getSession(userName, remoteMachine, port)
    // Password for the remote host
    session.setPassword(password)
    // Host-key prompt on first login; valid values: ask | yes | no
    session.setConfig("StrictHostKeyChecking", "no")
    // Connect with a timeout in milliseconds
    session.connect(5000)

    val openChannel = session.openChannel("exec").asInstanceOf[ChannelExec]
    openChannel.setCommand(command)
    //openChannel.setOutputStream(output)
    // Disable the pseudo-terminal, otherwise nohup'ed scripts fail to start
    openChannel.setPty(false)
    val input = openChannel.getInputStream
    openChannel.connect()

    val bufferReader = new BufferedReader(new InputStreamReader(input))
    while (!openChannel.isClosed) {
      while (bufferReader.ready()) {
        val msg = bufferReader.readLine()
        // Hand each returned line to the caller-supplied handler
        outputFun(msg, remoteMachine)
      }
      Thread.sleep(500)
    }
    val exit = openChannel.getExitStatus
    logger.info(command.concat(",ssh command exit status: ").concat(exit.toString))
    bufferReader.close()
    openChannel.disconnect()
    session.disconnect()
    exit
  }
}
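The `(String, String) => Unit` parameter is what makes the class reusable: any handler can be plugged in to consume the remote output. A minimal, SSH-free sketch of the same pattern (the names here are made up for illustration):

```scala
// A pump that forwards each output line to a pluggable handler,
// mirroring how SSH.apply forwards lines to its output function.
def pump(lines: Iterator[String], host: String)(handler: (String, String) => Unit): Unit =
  lines.foreach(line => handler(line, host))

val collected = scala.collection.mutable.Buffer.empty[String]
pump(Iterator("line1", "line2"), "node1") { (msg, h) => collected += s"$h: $msg" }
// collected now holds "node1: line1" and "node1: line2"
```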
Usage example
val ssh = (cmd: String) => SSH(host, "hadoop", port.toInt, cmd, "", password)
ssh("nohup java -Djava.ext.dirs=/appl/scripts/e-business/rest/target/lib -classpath /appl/scripts/e-business/rest/target/rest-1.1.jar com.changtu.service.BackendServiceBoot 4444 > /appl/scripts/e-business/rest/out.follower.$$.log 2>&1 &")
Alternatively, you can pass a custom handler:
val ssh = (cmd: String) => SSH(host, "hadoop", port.toInt, cmd, "", password, loadToHdfs)
import java.io.{BufferedWriter, OutputStreamWriter}

import org.apache.hadoop.fs.Path
import org.joda.time.DateTime

/**
  * Handles each collected log line by appending it to HDFS.
  * @param msg  a single line of remote output
  * @param host the host the line came from
  */
private def loadToHdfs(msg: String, host: String): Unit = {
  //logger.info(msg)
  val currentTime = DateTime.now.toString("yyyyMMdd")
  val path = "/user/hadoop/bigdata/logs/rest.".concat(host).concat("-").concat(currentTime).concat(".log")
  // HDFSUtils is the author's in-house HDFS helper
  HDFSUtils.createDirectory(path, deleteF = false)
  val fsout = HDFSUtils.getHdfs.append(new Path(path))
  val br = new BufferedWriter(new OutputStreamWriter(fsout))
  br.write(msg)
  br.newLine()
  br.close()
  fsout.close()
}
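The path-building logic above can be factored into a pure helper, which is easy to test without a cluster (a sketch; java.time is used here instead of joda-time so it stands alone):

```scala
// Builds the per-host, per-day HDFS log path used by loadToHdfs.
def restLogPath(host: String, date: java.time.LocalDate): String = {
  val day = date.format(java.time.format.DateTimeFormatter.BASIC_ISO_DATE) // yyyyMMdd
  s"/user/hadoop/bigdata/logs/rest.$host-$day.log"
}

// restLogPath("node1", java.time.LocalDate.of(2016, 8, 11))
//   == "/user/hadoop/bigdata/logs/rest.node1-20160811.log"
```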