
getting a long text stream of a long web page

by Skaperen from LinuxQuestions.org on (#554PN)
I have a web page with lots of HTML and binary bytes in it that displays in Firefox as a plain old text file. If I access it with lynx it also comes out fine, but it's entirely paginated (I have to press space to get each page). curl gives me the raw HTML and binary characters.

What I really want is a continuous text stream of the whole thing. Is there a tool that can do that? I already have the HTML downloaded, so something that only works from a local file would be fine.
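Since the HTML is already downloaded, one option is lynx's dump mode (`lynx -dump page.html`), which writes the formatted text to stdout in one go instead of paginating it. As a rough alternative, here is a minimal Python sketch (standard library only; `page.html` is a placeholder filename) that strips the tags from the local file and emits the remaining text as a single continuous stream:

```python
#!/usr/bin/env python3
"""Dump an already-downloaded HTML file as one continuous text stream.

A minimal sketch using only the standard library; the input path
(page.html) is a placeholder for wherever the file was saved.
"""
from html.parser import HTMLParser
import sys


class TextExtractor(HTMLParser):
    """Collect text content, skipping <script> and <style> blocks."""

    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self._skip_depth = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if self._skip_depth == 0:
            self.chunks.append(data)


def main():
    path = sys.argv[1] if len(sys.argv) > 1 else "page.html"
    parser = TextExtractor()
    # errors="replace" keeps stray binary bytes from aborting the read
    with open(path, encoding="utf-8", errors="replace") as f:
        parser.feed(f.read())
    # Write everything to stdout as one uninterrupted stream
    sys.stdout.write("".join(parser.chunks))


if __name__ == "__main__":
    main()
```

Run it as `python3 dump_text.py page.html > page.txt`, or pipe the output straight into a pager or another tool if you do want pagination later.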