Factor this!

Message boards : NFS Discussion : Factor this!

Kevin

Joined: 24 May 10
Posts: 7
Credit: 1,956
RAC: 0
Message 469 - Posted: 24 May 2010, 10:07:12 UTC

682934139764871621059294670540409779256277785405904002174839913653988630704152088904136121991617552380023455259399723952023653153643548060833653509153580215597699356690840851264278901841510478481653190812472058220200188105772756389172416253
Kevin

Joined: 24 May 10
Posts: 7
Credit: 1,956
RAC: 0
Message 471 - Posted: 24 May 2010, 20:52:04 UTC - in response to Message 470.

1. Random generation using random primes (Factors = p96 * p145)
2. Wait, wait, wait.. you only factor numbers of the form b^n + 1? By that reasoning, ask that question to everyone participating. Sorry for trying to contribute to the site.. Sheesh. O_O
Greg

Joined: 26 Jun 08
Posts: 594
Credit: 251,957,170
RAC: 31,853
Message 472 - Posted: 25 May 2010, 1:04:18 UTC

Kevin,

No offense is intended, at least from me anyway :-), but there are a large number of unfactored numbers that are of interest to mathematicians, computer scientists, and laymen that will keep us busy for a very long time. We are mainly working on the Cunningham project numbers for now, but there are many other candidates of interest. Factoring a randomly generated semiprime with no special significance is not a good use of the volunteer computer time.

Greg
bdodson*

Joined: 2 Oct 09
Posts: 50
Credit: 111,128,218
RAC: 0
Message 473 - Posted: 25 May 2010, 23:46:59 UTC - in response to Message 471.

1. Random generation using random primes (Factors = p96 * p145)
2. Wait, wait, wait.. you only factor numbers of the form b^n + 1? ...

There must be some range in between "random numbers, hard to factor" and
"numbers of the form b^k-1" (including ones b^2n -1 = (b^n -1)(b^n +1)).

You have a c240 or maybe c241 (96+145). Is the intended point that it's
no easier to factor than a prospective RSA number p*q, with p and q both
primes with 120-digits? Since the current record for RSA numbers is
RSA768, at 232 decimal digits, factoring the number you've posed isn't
much different than asking for a new record factorization of an RSA number,
with 240-digits. I don't know your background, but have you looked at
the resources needed for setting some of the previous records? Numbers
of comparable difficulty (size, for gnfs) with the snfs numbers we're
working on here might have 180-digits; with 170-digits not-so-difficult,
and 190-digits harder than our current resources.

Or perhaps you intend factoring your number to be easier than factoring
a 240-digit RSA-key due to the smaller p96 factor? There's already a
challenge number of this type, with 1024-bits (c. 310-decimal), which is
known to be of the form p1*p2*p3*p4, where each pi is a prime (i = 1,2,3,4)
with 256-bits. Factoring this previous challenge number by a method easier
than factoring a 1024-bit RSA-key requires a method easier than sieving
(gnfs or snfs) that would be able to make use of having a 77-digit prime
factor (c. 256-bits). Perhaps this previous challenge is easier than yours
(p77 instead of p96)? The person proposing this earlier challenge heads one
of the top factoring research labs, and they have made some progress by finding
p73 factors; setting a record for ecm, with the slight issue that the 73-digit
primes are quite far from being random; uhm, guess you're not going to be
impressed, their primes all divide numbers 2^n -1. If you want random, the
best we can do is 68-digits.

-bdodson
Kevin

Joined: 24 May 10
Posts: 7
Credit: 1,956
RAC: 0
Message 478 - Posted: 6 Jun 2010, 18:43:04 UTC

"No offense is intended, at least from me anyway :-)"

Excellent! :)

"but there are a large number of unfactored numbers that are of interest to mathematicians, computer scientists, and laymen that will keep us busy for a very long time"

The Cunningham project? You could just add to it by contributing a random b^n + 1 or a^n + b^n Ex: 5^300 + 1, or 6^429 + 11^429.

"We are mainly working on the Cunningham project numbers for now, but there are many other candidates of interest."

I think there are somewhere between 730 and 1200 of those.. That's going to keep you busy for a long time indeed, and that's not even factoring in the new a^n + b^n you contribute to the site each and every day.

"Factoring a randomly generated semiprime with no special significance is not a good use of the volunteer computer time. "

No special significance? That's outrageous! Perhaps, at some point, the number I posted is a cofactor of a number of the form a^n + b^n, or a^n + 1. That seems significant to me. Perhaps it's the cofactor of a number n where n^2 + 1 is prime. Perhaps it's the cofactor of a Fibonacci number. The number can also be a cofactor to a Lucas number, the cofactor to an n^2-1 number, the cofactor to an n^2 + 1 number, among many (hypothetical) other properties that I can list. That doesn't seem "insignificant" to me. And, by that logic, apply the same reasoning to your own members, as I said before. Your reasoning, turned on you: "Factoring a randomly generated a^n + b^n number is not a good use of volunteer computer time."

"You have a c240 or maybe c241 (96+145). Is the intended point that it's
no easier to factor than a prospective RSA number p*q, with p and q both
primes with 120-digits?"

It's c240. And, that point would be false, a difference in size between the prime factors makes it easier to factor. That can be determined by common sense alone.

"Since the current record for RSA numbers is
RSA768, at 232 decimal digits, factoring the number you've posed isn't
much different than asking for a new record factorization of an RSA number,
with 240-digits."

But this would not count, as it would be too easy (Because of the difference in size of the prime factors in question)

"I don'thave know your background, but have you looked at
the resources needed for setting some of the previous records?"

All previous records and RSA-768 were all factored using SNFS or GNFS.

" Numbers
of comparable difficulty (size, for gnfs) with the snfs numbers we're
working on here might have 180-digits; with 170-digits not-so-difficult,
and 190-digits harder than our current resources."

You might as well begin trying to break someone else's RSA keys. (155 digits, or a typical 512-bit RSA modulus, would be incredibly easy to do, by that reasoning.)

challenge number of this type, with 1024-bits (c. 310-decimal), which is
known to be of the form p1*p2*p3*p4, where each pi is a prime (i = 1,2,3,4)
with 256-bits. Factoring this previous challenge number by a method easier
than factoring a 1024-bit RSA-key requires a method, easier than sieving
(gnfs or snfs) that would be able to make use of having a 77-digit prime
factor (c. 256-bits). Perhaps this previous challenge is easier than yours

Four 77-digit numbers? Wouldn't this be easier by searching for factors in the range of 10^76 or 10^77? And why not use GNFS/SNFS? It's your best method for numbers of that size. (It is unlikely that there will be any methods better than GNFS/SNFS for factoring.) And as for it being an easier challenge, I would be inclined to agree that it is easier, but it's four 77-digit numbers. Repeating the process to find three of the four factors? That will take you a few months. Factoring p96 * p145 would be somewhat easier given the difference in factor size, and the fact that you only need to find one factor out of the two to complete the factorization. But it is 19 digits larger, so I'm guessing that the two factorizations would take an equal amount of time.

"The person proposing this earlier challenge heads one
of the top factoring research labs, and they have made some progress by finding
p73 factors; setting a record for ecm, with the slight issue that the 73-digit
primes are quite far from being random"

Hmm. Interesting. I think that a p80 might be found by ECM sometime soon. (By sometime soon, I mean 10 ± 5 years.)

And, of course, it was found from 2^1181-1.

"uhm, guess you're not going to be
impressed, their primes all divide numbers 2^n -1. If you want random, the
best we can do is 68-digits."

This is well-known, and.. 68 digits? I can do that in 5 minutes!
Greg

Joined: 26 Jun 08
Posts: 594
Credit: 251,957,170
RAC: 31,853
Message 479 - Posted: 6 Jun 2010, 23:40:36 UTC - in response to Message 478.

a^n + b^n

Those are the homogeneous Cunningham numbers. See http://www.leyland.vispa.com/numth/factorization/anbn/main.htm

No special significance? That's outrageous! Perhaps, at some point, the number I posted is a cofactor of a number of the form ...

Perhaps, but likely not.

"You have a c240 or maybe c241 (96+145).
It's c240. And, that point would be false, a difference in size between the prime factors makes it easier to factor. That can be determined by common sense alone ... But this would not count, as it would be too easy (Because of the difference in size of the prime factors in question)

Actually, in this case "common sense" is leading you astray. If the smaller number was 60-65 digits or less, then ECM would make factoring the number much easier. However, a C240 = P96 * P145 is no easier than a C240 = P120 * P120. And if the number has no special form, then it is harder than the current record GNFS factorization.
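To illustrate what a smallest-factor-dependent method looks like, here is a sketch of Pollard's rho, standing in for ECM (which scales much further but benefits from a small factor in the same way). Illustrative Python only, not project code:

```python
import math
import random

def pollard_rho(n):
    """Pollard's rho: expected cost grows like sqrt(p), where p is the
    smallest prime factor of n -- so, like ECM, it gets cheaper when one
    factor is much smaller than the other."""
    if n % 2 == 0:
        return 2
    while True:
        x = y = random.randrange(2, n)
        c = random.randrange(1, n)
        d = 1
        while d == 1:
            x = (x * x + c) % n          # tortoise: one step
            y = (y * y + c) % n          # hare: two steps
            y = (y * y + c) % n
            d = math.gcd(abs(x - y), n)
        if d != n:
            return d  # nontrivial factor found; otherwise retry with new c

# The small factor (10007) makes this quick, even though n itself is larger.
n = 10007 * 1000003
print(pollard_rho(n))  # prints 10007 or 1000003
```

This is why the 60-65 digit threshold matters: below it, methods in this family find the small factor cheaply; above it, only the size of n itself counts.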

You might as well begin trying to break someone else's RSA keys. (155 digits, or a typical 512-bit RSA modulus, would be incredibly easy to do, by that reasoning.)

And in fact it is. That is a large part of the point of this project. RSA-512 is totally broken. There has been one successful factorization of RSA-768, and we hope to improve our software to be able to factor more numbers of that size soon.

Four 77-digit numbers? Wouldn't this be easier by searching for factors in the range of 10^76 or 10^77?

No. There are approximately 5*10^74 77-digit prime numbers. If you could test one every clock cycle (which is nowhere near possible) a 2GHz computer would take 8*10^57 CPU-years, or about 10^47 times the age of the universe.
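The arithmetic behind those figures can be reproduced in a few lines using the prime number theorem approximation pi(x) ~ x / ln(x) (illustrative Python):

```python
import math

# Number of 77-digit primes, via the prime number theorem: pi(x) ~ x / ln(x).
count = 1e77 / math.log(1e77) - 1e76 / math.log(1e76)
print(f"{count:.1e}")   # about 5.1e+74, matching the ~5*10^74 above

# Testing one candidate per clock cycle on a 2 GHz machine:
seconds = count / 2e9
years = seconds / (3600 * 24 * 365.25)
print(f"{years:.0e}")   # about 8e+57 CPU-years

# Roughly 10^47 times the age of the universe (~1.38e10 years):
print(f"{years / 1.38e10:.0e}")
```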

And why not use GNFS/SNFS? It's your best method for numbers of that size.

That size is currently out of reach of GNFS. But there are people working on it.

This is well-known, and.. 68 digits? I can do that in 5 minutes!

He is referring to a factor of 68-digits, not a composite of 68-digits. The current world record ECM factor of a number of a form other than 2^n +/- 1 is 68-digits.
bdodson*

Joined: 2 Oct 09
Posts: 50
Credit: 111,128,218
RAC: 0
Message 480 - Posted: 7 Jun 2010, 0:15:50 UTC - in response to Message 478.

...
The Cunningham project? You could just add to it by contributing a random b^n + 1 or a^n + b^n Ex: 5^300 + 1, or 6^429 + 11^429.
...
I think there are somewhere between 730 and 1200 of those.. That's going to keep you busy for a long time indeed, and that's not even factoring in the new a^n + b^n you contribute to the site each and every day.

OK, you're officially wasting the time of the people reading
your posts. The numbers on each of the 18 Cunningham lists have
specific limits on n, given in terms of b. There have been no
new numbers added for some years; and the most recent "official"
count was 458 on April 10. (I'm kidding on the first official, not
the second; I'm not an official on either the Cunningham project
or NFS@Home.)

...
"You have a c240 or maybe c241 (96+145). Is the intended point that it's
no easier to factor than a prospective RSA number p*q, with p and q both
primes with 120-digits?"

It's c240. And, that point would be false, a difference in size between the prime factors makes it easier to factor. That can be determined by common sense alone.

One of us lacks an elementary education in the algorithms of
modern computational number theory.

"Since the current record for RSA numbers is
RSA768, at 232 decimal digits, factoring the number you've posed isn't
much different than asking for a new record factorization of an RSA number,
with 240-digits."

But this would not count, as it would be too easy (Because of the difference in size of the prime factors in question)

Again.

"I don't ... know your background, but have you looked at
the resources needed for setting some of the previous records?"

All previous records and RSA-768 were all factored using SNFS or GNFS.

False. I'm a coauthor of the paper on the factorization of RSA120;
we used QS (cf. MPQS, ppmpqs, ppmpqs with larger large primes). Likewise
RSA100 and RSA110. Perhaps the most famous of all record factorizations
was the original challenge number of Rivest-Shamir-Adleman, which appeared
in a famous math puzzle column in Scientific American alongside the first public
description of the RSA cryptographic system: RSA129. Also done by QS (with
ppmpqs and larger large primes). So we now know, beyond any doubt, the
state of your knowledge of previous records.

" Numbers
of comparable difficulty (size, for gnfs) with the snfs numbers we're
working on here might have 180-digits; with 170-digits not-so-difficult,
and 190-digits harder than our current resources."

You might as well begin trying to break someone else's RSA keys. (155 digits, or a typical 512-bit RSA modulus, would be incredibly easy to do, by that reasoning.)

Again. I did. As reported in the paper on the factorization of RSA155,
Eurocrypt 2000. More than 10 years ago (with factors found in 1999). That
was state-of-the-art at that time; just as RSA768 is the current
state-of-the-art.

What is the point of this comment? Yes, 512-bits, 155-decimal digits
is easy now. My statement (the one quoted) is a correct description
of the current resources of NFS@Home; which are somewhat larger than
the projects of Batalov+Dodson, where we recently did a c180 gnfs.

challenge number of this type, with 1024-bits (c. 310-decimal), which is
known to be of the form p1*p2*p3*p4, where each pi is a prime (i = 1,2,3,4)
with 256-bits. Factoring this previous challenge number by a method easier
than factoring a 1024-bit RSA-key requires a method, easier than sieving
(gnfs or snfs) that would be able to make use of having a 77-digit prime
factor (c. 256-bits). Perhaps this previous challenge is easier than yours

Four 77-digit numbers? Wouldn't this be easier by searching for factors in the range of 10^76 or 10^77? And why not use GNFS/SNFS? It's your best method for numbers of that size. (It is unlikely that there will be any methods better than GNFS/SNFS for factoring.)

Again. An undergraduate class in elementary number theory would usually
include at least the statement of the prime number theorem (I advised an
undergrad honors project in which the student presented the proof; currently
pursuing graduate studies at Univ Wisc.). There are too many primes smaller
than 10^70 to search them. More than the number of atoms in the universe.
You're describing a computation with exponential runtime. You've taken
calculus? All of QS, ECM and NFS are subexponential methods. ECM has
runtime that does depend on the size of the smallest prime factor. The
record size prime is 68-digits for general numbers, 73-digits for 2^n-1.
Not so QS and NFS, each of which would take weeks to find a factor of 3
in a 180-digit composite. This was reported in Time magazine in 1983; it's
not even number theory anymore, basic knowledge of educated people.

I'll try again. Your view of the effect of the "difference in size of
prime factors in question" is not the view of a talented amateur. You
may very well be bright, but your comments do more to confirm that
"a little knowledge is a dangerous thing" than to a positive contribution
to our project. Unofficially speaking.

-bdodson


Kevin

Joined: 24 May 10
Posts: 7
Credit: 1,956
RAC: 0
Message 481 - Posted: 7 Jun 2010, 0:58:25 UTC

"Those are the homogeneous Cunningham numbers. See http://www.leyland.vispa.com/numth/factorization/anbn/main.htm "

Is this where there is a list of 732 of them?

"Perhaps, but likely not."

Nope. There's a 100% chance that it has at least one of the properties I listed.

"Actually, in this case "common sense" is leading you astray. If the smaller number was 60-65 digits or less, then ECM would make factoring the number much easier. However, a C240 = P96 * P145 is no easier than a C240 = P120 * P120. And if the number has no special form, then it is harder than the current record GNFS factorization."

If the smaller factor were in fact a c31 to c36, any average Joe would be able to factor it in about an hour using ECM. How is it equally difficult? A difference in size of 49 digits makes it in fact far easier than p120 * p120. Think of it this way: Suppose you had to sift through 96 objects to find a certain object out of 240 in one pile, and you had to sift through 120 objects in another pile. It'll be easier to sift through 96 objects than 120 objects. In fact, it would be faster attempting to pick out a p96 factor than it would be trying to pick out a p120 factor. The point of that (somewhat mediocre) analogy is that semiprimes whose factors are of identical size are more difficult to factor. That's the main reason why RSA moduli are generated using primes of similar/identical size. Because it's more difficult to factor a semiprime composed of primes of similar or identical size than it is to factor a semiprime whose prime factors vary in size. Because the running time needed for a factoring algorithm depends on the size of the smallest prime factor (Almost all (if not all) factoring methods work this way.). You can't dismiss the painfully obvious. :|

"And in fact it is. That is a large part of the point of this project. RSA-512 is totally broken. There has been one successful factorization of RSA-768, and we hope to improve our software to be able to factor more numbers of that size soon."

RSA-768 style numbers? I think you should give it 15 ± 10 years.

"That size is currently out of reach of GNFS. But there are people working on it."

If they're not using SNFS, I don't know what it is that they're using.. Fermat's, Dixon's, Trial division, MPQS, QS, SIQS are not recommended for p77 factors...

"He is referring to a factor of 68-digits, not a composite of 68-digits. The current world record ECM factor of a number of a form other than 2^n +/- 1 is 68-digits."

Is this referring to a semiprime? If not, can you list ECM's world record for that?

Greg

Joined: 26 Jun 08
Posts: 594
Credit: 251,957,170
RAC: 31,853
Message 482 - Posted: 7 Jun 2010, 4:54:10 UTC - in response to Message 481.

A difference in size of 49 digits makes it in fact far easier than p120 * p120. Think of it this way: Suppose you had to sift through 96 objects to find a certain object out of 240 in one pile, and you had to sift through 120 objects in another pile. It'll be easier to sift through 96 objects than 120 objects. In fact, it would be faster attempting to pick out a p96 factor than it would be trying to pick out a p120 factor. The point of that (somewhat mediocre) analogy is that semiprimes whose factors are of identical size are more difficult to factor. That's the main reason why RSA moduli are generated using primes of similar/identical size. Because it's more difficult to factor a semiprime composed of primes of similar or identical size than it is to factor a semiprime whose prime factors vary in size. Because the running time needed for a factoring algorithm depends on the size of the smallest prime factor (Almost all (if not all) factoring methods work this way.). You can't dismiss the painfully obvious. :|

Just for the record for others reading this thread, this is simply wrong. :) The run time of some algorithms, such as trial factoring, P+/-1, and ECM, depends mostly on the size of the second-largest prime factor, while others, such as QS, SNFS, and GNFS, depend only on the size of the number to be factored and not on the sizes of the factors. Even having equal size factors that are too close together can make the number easy to factor by the Fermat method. For the case of P96 * P145, it is in fact no easier than P120 * P120. Now back to our regularly scheduled sieving...
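The Fermat method mentioned above is simple enough to sketch in a few lines (illustrative Python; the function name and bound are mine). It shows why two factors that are too close together make a semiprime easy:

```python
from math import isqrt

def fermat_factor(n, max_steps=10**6):
    """Fermat's method: find a with a^2 - n = b^2, giving n = (a - b)(a + b).
    Fast only when n's factors lie close to sqrt(n)."""
    a = isqrt(n)
    if a * a < n:
        a += 1
    for _ in range(max_steps):
        b2 = a * a - n
        b = isqrt(b2)
        if b * b == b2:
            return a - b, a + b
        a += 1
    return None  # gave up: the factors are too far apart for this budget

# Two primes very close together are recovered almost instantly:
print(fermat_factor(1000003 * 1000033))  # (1000003, 1000033)
```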
Kevin

Joined: 24 May 10
Posts: 7
Credit: 1,956
RAC: 0
Message 483 - Posted: 7 Jun 2010, 12:01:49 UTC

I took the time to research.. It turns out I was incorrect on all counts. A sad day for me. The BOINC program slowed my computer to a crawl, to be honest..
Kevin

Joined: 24 May 10
Posts: 7
Credit: 1,956
RAC: 0
Message 484 - Posted: 7 Jun 2010, 12:04:20 UTC

"Even having equal size factors that are too close together can make the number easy to factor by the Fermat method."

True.. But that's only if the factors lie within about n^(1/4) of sqrt(n)

Also, there are non-math means to break an RSA key. (Ex: Timing attacks)
