Merged
4 changes: 2 additions & 2 deletions .github/workflows/build-and-test.yml
@@ -147,8 +147,8 @@ jobs:
fail-fast: true
matrix:
perl-version:
-          - "5.14"
-          - "5.16"
+          #- "5.14"
+          #- "5.16"
#- "5.18" https://github.com/libwww-perl/WWW-Mechanize/runs/822568352?check_suite_focus=true
- "5.20"
- "5.22"
2 changes: 2 additions & 0 deletions Changes
@@ -4,6 +4,8 @@ Revision history for WWW::Mechanize
[ENHANCEMENTS]
- WWW::Mechanize no longer taints the responses it receives. This also
removes Test::Taint as a prerequisite.
+[DOCUMENTATION]
+- Improve FAQ (GH#189) (Julien Fiegehenn)

2.19 2024-09-16 15:25:45Z
[DOCUMENTATION]
11 changes: 11 additions & 0 deletions lib/WWW/Mechanize/FAQ.pod
@@ -446,4 +446,15 @@ keeps a clone of the full Mech object at every step along the way.
You can limit this stack size with the C<stack_depth> param in the C<new()>
constructor. If you set C<stack_depth> to 0, Mech will not keep any history.

+=head2 How do I find all the links in a sub-section of a page?
+
+If you want to find all the links between two specific pieces of content
+in a page, you can modify the content to remove everything else. This is
+a bit unorthodox, but it works.
+
+    my $html = $mech->content;
+    $html =~ s/.*(\n[^\n]+foo.*bar[^\n]+).*/$1/ms;
+    $mech->update_html( $html );
+    $mech->dump_links;

=cut
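The FAQ recipe added above can be sketched as a small self-contained script. The HTML snippet and the `BEGIN LINKS`/`END LINKS` markers are hypothetical stand-ins for whatever text (the FAQ's `foo`/`bar`) brackets the section on a real page; only the trim-then-inspect idea mirrors the FAQ code.

```perl
#!/usr/bin/env perl
# Sketch of the FAQ recipe: trim the page content down to one section
# before extracting links. The markers below are hypothetical; replace
# "BEGIN LINKS" / "END LINKS" with text that actually brackets the
# section on your page.
use strict;
use warnings;

my $html = <<'HTML';
<a href="/nav">Top nav</a>
<!-- BEGIN LINKS -->
<a href="/one">One</a>
<a href="/two">Two</a>
<!-- END LINKS -->
<a href="/footer">Footer</a>
HTML

# Keep only what lies between the two markers, in the spirit of the
# s/.*(...).*/$1/ms substitution shown in the FAQ.
$html =~ s/.*BEGIN LINKS(.*)END LINKS.*/$1/s;

# With a live WWW::Mechanize object you would now feed the trimmed
# content back via $mech->update_html($html) and call $mech->dump_links;
# here we just print the surviving section.
print $html;
```

Links outside the bracketed section (`/nav`, `/footer`) disappear from the content, so a subsequent `dump_links` would list only the links inside it.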
10 changes: 5 additions & 5 deletions t/dump.t
@@ -27,7 +27,7 @@ subtest "dump_headers", sub {
};

subtest "dump_links test", sub {
-    dump_tests( 'dump_links', 't/find_link.html', <<'EXPECTED');
+    dump_tests( 'dump_links', 't/find_link.html', <<'EXPECTED' );
http://www.drphil.com/
HTTP://WWW.UPCASE.COM/
styles.css
@@ -58,7 +58,7 @@ EXPECTED
};

subtest "dump_images test", sub {
-    dump_tests( 'dump_images', 't/image-parse.html', <<'EXPECTED');
+    dump_tests( 'dump_images', 't/image-parse.html', <<'EXPECTED' );
/Images/bg-gradient.png
wango.jpg
bongo.gif
@@ -75,7 +75,7 @@ EXPECTED
};

subtest "dump_forms test", sub {
-    dump_tests( 'dump_forms', 't/form_with_fields.html', <<'EXPECTED');
+    dump_tests( 'dump_forms', 't/form_with_fields.html', <<'EXPECTED' );
POST http://localhost/ (multipart/form-data) [1st_form]
1a= (text)
1b= (text)
@@ -125,7 +125,7 @@ EXPECTED
};

subtest "dump_forms multiselect", sub {
-    dump_tests( 'dump_forms', 't/form_133_regression.html', <<'EXPECTED');
+    dump_tests( 'dump_forms', 't/form_133_regression.html', <<'EXPECTED' );
GET http://localhost/
select1=1 (option) [*1|2|3|4]
select2=1 (option) [*1|2|3|4]
@@ -144,7 +144,7 @@ EXPECTED
};

subtest "dump_text test", sub {
-    dump_tests( 'dump_text', 't/image-parse.html', <<'EXPECTED');
+    dump_tests( 'dump_text', 't/image-parse.html', <<'EXPECTED' );
Testing image extractionblargle And now, the dreaded wango CNN BBC Blongo!Logo
EXPECTED
};
2 changes: 1 addition & 1 deletion t/local/overload.t
@@ -34,7 +34,7 @@ BEGIN {
use_ok('WWW::Mechanize');
}

-my $server = LocalServer->spawn( html => <<'BROKEN_HTML');
+my $server = LocalServer->spawn( html => <<'BROKEN_HTML' );
<html>
<head><title>Broken document</head>
<form>
Expand Down