Commit 0f3a5ff

Monitor now directly creates current.ongoing instead of waiting for logs to be appended
1 parent 5c6b084 commit 0f3a5ff

3 files changed: +16 -7 lines

README.md

Lines changed: 5 additions & 5 deletions

````diff
@@ -5,7 +5,7 @@
 ## Overview


-Version: 1.1.0
+Version: 1.1.1

 API Scaladoc: [SparkHelper](http://xavierguihot.com/spark_helper/#com.spark_helper.SparkHelper$)
@@ -126,7 +126,7 @@ assert(DateHelper.nDaysAfterDate(3, "20170307") == "20170310")
 ### Monitor:

 The full list of methods is available at
-[scaladoc](http://xavierguihot.com/spark_helper/#com.spark_helper.Monitor$)
+[Monitor](http://xavierguihot.com/spark_helper/#com.spark_helper.Monitor$)

 It's a simple logger/report which contains a report that one can update from
 the driver and a success state. The idea is to persist job executions logs and
@@ -253,7 +253,7 @@ With sbt, add these lines to your build.sbt:
 ```scala
 resolvers += "jitpack" at "https://jitpack.io"

-libraryDependencies += "com.github.xavierguihot" % "spark_helper" % "v1.1.0"
+libraryDependencies += "com.github.xavierguihot" % "spark_helper" % "v1.1.1"
 ```

 With maven, add these lines to your pom.xml:
@@ -269,7 +269,7 @@ With maven, add these lines to your pom.xml:
 <dependency>
 	<groupId>com.github.xavierguihot</groupId>
 	<artifactId>spark_helper</artifactId>
-	<version>v1.1.0</version>
+	<version>v1.1.1</version>
 </dependency>
 ```

@@ -283,6 +283,6 @@ allprojects {
 }

 dependencies {
-	compile 'com.github.xavierguihot:spark_helper:v1.1.0'
+	compile 'com.github.xavierguihot:spark_helper:v1.1.1'
 }
 ```
````

build.sbt

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,6 +1,6 @@
 name := "spark_helper"

-version := "1.1.0"
+version := "1.1.1"

 scalaVersion := "2.11.12"

```

src/main/scala/com/spark_helper/Monitor.scala

Lines changed: 10 additions & 1 deletion

```diff
@@ -185,6 +185,7 @@ object Monitor {
   def setTitle(title: String): Unit = {
     reportTitle = Some(title)
     reportHeader = buildReportHeader()
+    storeCurrent()
   }

   /** Sets the report's contact list.
@@ -206,6 +207,7 @@ object Monitor {
   def addContacts(contacts: List[String]): Unit = {
     pointsOfContact = Some(contacts)
     reportHeader = buildReportHeader()
+    storeCurrent()
   }

   /** Sets the report's description.
@@ -227,6 +229,7 @@ object Monitor {
   def addDescription(description: String): Unit = {
     reportDescription = Some(description)
     reportHeader = buildReportHeader()
+    storeCurrent()
   }

   /** Sets the folder in which logs are stored.
@@ -250,6 +253,7 @@ object Monitor {
   def setLogFolder(logFolder: String): Unit = {
     logDirectory = Some(logFolder)
     prepareLogFolder()
+    storeCurrent()
   }

   /** Activates the purge of logs and sets the purge window.
@@ -573,6 +577,12 @@ object Monitor {

     // And if the logFolder parameter has been set, we also update live the log
     // file:
+    storeCurrent()
+  }
+
+  /** Updates the current stored version of logs in file
+    * logFolder/current.ongoing */
+  private def storeCurrent(): Unit =
     logDirectory.foreach {
       case logFolder => {

@@ -588,7 +598,6 @@ object Monitor {
         HdfsHelper.writeToHdfsFile(ongoingReport, s"$logFolder/current.ongoing")
       }
     }
-  }

   private def purgeOutdatedLogs(logFolder: String, window: Int): Unit = {

```
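The net effect of this change: since every setter now ends with a call to the extracted `storeCurrent()` helper, the `current.ongoing` file is written as soon as the monitor is configured, rather than only after the first log is appended. A minimal sketch of the new behavior (the log folder path and title are hypothetical examples):

```scala
import com.spark_helper.Monitor

// Each setter now persists the report to logFolder/current.ongoing
// immediately:
Monitor.setLogFolder("/tmp/spark_job/logs") // hypothetical path
Monitor.setTitle("My Spark Job")
// At this point current.ongoing already exists in the log folder,
// before any log line has been appended.
```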
