Using keep-left/right combinator is not working with result converter
Tag : scala , By : eataix
Date : November 29 2020, 04:01 AM

The change below fixes the issue. Since you didn't insert any parentheses into the chain of ~, ~> and <~, most matched subexpressions are thrown out "with the bathwater" (or rather "with the whitespace and arrows"). Just insert some parentheses.
Here is the general pattern of what it should look like:
(irrelevant ~> irrelevant ~> RELEVANT <~ irrelevant <~ irrelevant) ~
(irrelevant ~> RELEVANT <~ irrelevant <~ irrelevant) ~ 
...
import scala.util.parsing.combinator._
import scala.util.{Either, Left, Right}

case class SubtitleBlock(startTime: String, endTime: String, text: List[String])

object YourParser extends RegexParsers {

  def subtitleHeader: Parser[SubtitleBlock] = {
    // Parenthesize so that ~> and <~ discard only the pieces we don't need,
    // keeping the two timestamps and the optional trailing text line.
    (subtitleNumber.? ~> time <~ arrow) ~
    time ~
    (opt(textLine) <~ eol)
  } ^^ {
    case startTime ~ endTime ~ _ => SubtitleBlock(startTime, endTime, Nil)
  }

  // Treat only spaces and tabs as skippable whitespace so that "\n"
  // can be matched explicitly by eol.
  override val whiteSpace = "[ \t]+".r
  def arrow: Parser[String] = "-->".r
  def subtitleNumber: Parser[String] = "\\d+".r
  def time: Parser[String] = "\\d{2}:\\d{2}:\\d{2}\\.\\d{3}".r
  def textLine: Parser[String] = ".*".r
  def eol: Parser[String] = "\n".r

  // Wrap the parse result in Either: Right on success, Left with the error message otherwise.
  def parseStuff(s: String): scala.util.Either[String, SubtitleBlock] =
    parseAll(subtitleHeader, s) match {
      case Success(t, _) => scala.util.Right(t)
      case f => scala.util.Left(f.toString)
    }

  def main(args: Array[String]): Unit = {
    val examples: List[String] = List(
      "2 00:00:01.610 --> 00:00:02.620 align:start position:0%\n"
    ) ++ args.map(_ + "\n")

    for (x <- examples) {
      println(parseStuff(x))
    }
  }
}
Right(SubtitleBlock(00:00:01.610,00:00:02.620,List()))
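For context, here is a minimal sketch (with made-up parser names, not part of the original answer) of why the grouping goes wrong without parentheses: `<~` starts with `<`, so it binds more loosely than `~` and `~>`, and everything to its right ends up grouped together and discarded.
import scala.util.parsing.combinator.RegexParsers

// Minimal sketch, assuming standard Scala operator precedence:
// `<~` (first char '<') binds more loosely than `~` and `~>` (first char '~').
object PrecedenceDemo extends RegexParsers {
  def a: Parser[String] = "a"
  def b: Parser[String] = "b"
  def c: Parser[String] = "c"

  // Without parentheses this groups as a <~ (b ~ c):
  // "b" and "c" are both thrown away and only "a" is kept.
  def unparenthesized: Parser[String] = a <~ b ~ c

  // With parentheses only "b" is dropped; "a" and "c" are kept.
  def parenthesized: Parser[String ~ String] = (a <~ b) ~ c

  def main(args: Array[String]): Unit = {
    println(parseAll(unparenthesized, "a b c")) // result is just "a"
    println(parseAll(parenthesized, "a b c"))   // result is (a~c)
  }
}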


Always discard result for a parser combinator


Tag : scala , By : leorick
Date : March 29 2020, 07:55 AM
In earlier versions of Scala's parser combinators, a discard method existed to throw away the result of a parser. You can simply reference the extra parser result and not use it:
otherParser ~ throwThisAway ~ anotherParser ^^ { case a ~ x ~ b => SomeCaseClass(a, b) } // bind the result but never use it
otherParser ~ throwThisAway ~ anotherParser ^^ { case a ~ _ ~ b => SomeCaseClass(a, b) } // or match it with _ and ignore it
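A self-contained sketch of the same idea (the parser and case-class names here are invented for illustration, assuming the standard scala-parser-combinators library):
import scala.util.parsing.combinator.RegexParsers

object DiscardDemo extends RegexParsers {
  case class Pair(left: String, right: String)

  def word: Parser[String] = "[a-z]+".r
  def separator: Parser[String] = "->"

  // The separator is matched but its result is ignored via `_`.
  def pair: Parser[Pair] = word ~ separator ~ word ^^ {
    case a ~ _ ~ b => Pair(a, b)
  }

  def main(args: Array[String]): Unit =
    println(parseAll(pair, "foo -> bar")) // Success: Pair(foo,bar)
}
The keep-left/right combinators from the first answer achieve the same thing without the wildcard in the pattern, e.g. (word <~ separator) ~ word.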

Haskell parsec: `many` combinator inside an `optional` combinator


Tag : haskell , By : Johannes
Date : March 29 2020, 07:55 AM
We can also state this slightly differently: c in your parser may only succeed if it is followed by some token, which can be expressed with a single lookAhead:
myParser = many (a <|> b <|> (c <* (lookAhead anyToken <?> "non C token"))) <* eof
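For comparison, here is a rough Scala analogue of the same idea (a sketch with invented parser names, not part of the original answer), using guard, which succeeds only if its argument parser would succeed and consumes no input:
import scala.util.parsing.combinator.RegexParsers

object LookaheadDemo extends RegexParsers {
  def a: Parser[String] = "a"
  def b: Parser[String] = "b"
  def c: Parser[String] = "c"
  def anyToken: Parser[String] = "\\S+".r

  // `c` only succeeds when at least one more token follows it,
  // mirroring `c <* lookAhead anyToken` from the parsec answer.
  def myParser: Parser[List[String]] =
    rep(a | b | (c <~ guard(anyToken)))

  def main(args: Array[String]): Unit = {
    println(parseAll(myParser, "a b c a")) // succeeds: List(a, b, c, a)
    println(parseAll(myParser, "a b c"))   // fails: trailing "c" has no following token
  }
}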

Fixed point combinator and explicit result type


Tag : cpp , By : Brian Cupps
Date : March 29 2020, 07:55 AM
The relevant rule is in [dcl.spec.auto]:
print(print, {1, {{2, {{8}}}, {3, {{5, {{7}}}, {6}}}, {4}}});

When working with the CSS child combinator on a list, I'm not getting the result I expected


Tag : html , By : Al Dimond
Date : March 29 2020, 07:55 AM
You forgot two lines in the HTML. After adding them, your style will work, but it will also change the color of D, E and F because they are on the same level as H.
<aside class="sitemap">
  <ul class="pages">
    <li>A</li>
    <li>B</li>
    <li>C
      <ul>
        <li>D</li>
        <li class="featured">E</li>
        <li>F</li>
      </ul>
    </li> <!-- add this line -->
    <li>G</li>
    <li> <!-- add this line -->
      <ul>
        <li>H
          <ul>
            <li class="featured">I</li>
            <li>J</li>
          </ul>
        </li>
      </ul>
    </li>
  </ul>
</aside>

CSS > combinator not getting the result I expected


Tag : css , By : wraith
Date : March 29 2020, 07:55 AM
The > operator only selects the direct (first-generation) children of an element. In the case of color, the child elements of the targeted element inherit that style rule:
#outside>li {
  color: green;
}
<ul id="outside">

  <li>A</li>
  <li>B</li>
  <li>C</li>
  <ul>
    <li>D</li>
    <li>E</li>
    <li>F</li>
  </ul>
  <li>G</li>

</ul>
#three>#two {
  color: green;
}
<div id="three">
  three
  <div id="two">
    two
    <header id="one">
      one
      <footer id="zero">
        zero
      </footer>
    </header>
  </div>
</div>