head	1.5;
access;
symbols
	RELEASE_8_3_0:1.2
	RELEASE_9_0_0:1.2;
locks; strict;
comment	@# @;


1.5
date	2012.11.17.06.02.52;	author svnexp;	state Exp;
branches;
next	1.4;

1.4
date	2012.05.15.02.04.47;	author sunpoet;	state Exp;
branches;
next	1.3;

1.3
date	2012.04.09.17.43.00;	author sunpoet;	state Exp;
branches;
next	1.2;

1.2
date	2011.11.15.15.50.46;	author sunpoet;	state Exp;
branches;
next	1.1;

1.1
date	2011.11.12.16.45.51;	author sunpoet;	state Exp;
branches;
next	;


desc
@@


1.5
log
@Switch exporter over
@
text
@# New ports collection makefile for:	p5-Parse-HTTP-UserAgent
# Date created:		2011-11-13
# Whom:			Sunpoet Po-Chuan Hsieh <sunpoet@@FreeBSD.org>
#
# $FreeBSD: head/www/p5-Parse-HTTP-UserAgent/Makefile 300897 2012-07-14 14:29:18Z beat $
#

PORTNAME=	Parse-HTTP-UserAgent
PORTVERSION=	0.35
CATEGORIES=	www perl5
MASTER_SITES=	CPAN
MASTER_SITE_SUBDIR=	CPAN:BURAK
PKGNAMEPREFIX=	p5-

MAINTAINER=	sunpoet@@FreeBSD.org
COMMENT=	Parser for the User Agent string

TEST_DEPENDS=	p5-Test-Pod>=0:${PORTSDIR}/devel/p5-Test-Pod \
		p5-Test-Pod-Coverage>=0:${PORTSDIR}/devel/p5-Test-Pod-Coverage

PERL_CONFIGURE=	yes

MAN3=		Parse::HTTP::UserAgent.3 \
		Parse::HTTP::UserAgent::Base::Accessors.3 \
		Parse::HTTP::UserAgent::Base::Dumper.3 \
		Parse::HTTP::UserAgent::Base::IS.3 \
		Parse::HTTP::UserAgent::Base::Parsers.3 \
		Parse::HTTP::UserAgent::Constants.3

.include <bsd.port.mk>
@


1.4
log
@- Update to 0.35

Changes:	http://search.cpan.org/dist/Parse-HTTP-UserAgent/Changes
@
text
@d5 1
a5 1
# $FreeBSD$
@


1.3
log
@- Update to 0.34
- Add TEST_DEPENDS

Changes:	http://search.cpan.org/dist/Parse-HTTP-UserAgent/Changes
Feature safe:	yes
@
text
@d9 1
a9 1
PORTVERSION=	0.34
d12 1
@


1.2
log
@- Update to 0.33

Changes:	http://search.cpan.org/dist/Parse-HTTP-UserAgent/Changes
Feature safe:	yes
@
text
@d9 1
a9 1
PORTVERSION=	0.33
d17 3
@


1.1
log
@- Add p5-Parse-HTTP-UserAgent 0.32

Parse::HTTP::UserAgent implements a rules-based parser and first tries to
identify MSIE, FireFox, Opera, Safari & Chrome. It then tries to identify
Mozilla, Netscape and robots; anything else is handled by a generic parser.
There is also a structure dumper, useful for debugging.

User agent strings are a complete mess, since there is no standard format for
them. They come in various formats and include more or less information
depending on the vendor's (or the user's) choice. They are also not dependable,
being arbitrary identification strings: any user agent can impersonate another.
So why deal with such a mess? You may want to see what your visitors are using
and generate some nice charts from reasonably reliable data (even if some of it
is fake), or you may want to send an HttpOnly cookie only if the user agent
appears to support it (and a normal one otherwise). Browser sniffing for
client-side coding, however, is considered a bad habit.

WWW: http://search.cpan.org/dist/Parse-HTTP-UserAgent/

Feature safe:	yes
@
text
@d9 1
a9 1
PORTVERSION=	0.32
@

